Hi Mr. Sigrist,
Thank you for implementing such a cool algorithm! I am wondering whether it would be possible to add a monotone-constraint option to the main function. This is crucial for problems such as credit scoring, where domain knowledge is important, and it is supported by most major boosting implementations such as XGBoost and LightGBM (a short example of how those libraries expose it is included below).
Happy boosting!
Sincerely,
Yu Cao
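For concreteness, here is a minimal sketch of how LightGBM (one of the libraries mentioned above) exposes this feature; the data and parameter values are purely illustrative and not taken from this issue:

```python
# Minimal sketch: specifying monotone constraints in LightGBM (illustrative values).
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(0)
X = rng.random((500, 3))
# Target increases in feature 0, decreases in feature 1, and ignores feature 2.
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.1 * rng.standard_normal(500)

params = {
    "objective": "regression",
    # One entry per feature: +1 = increasing, -1 = decreasing, 0 = unconstrained.
    "monotone_constraints": [1, -1, 0],
}
booster = lgb.train(params, lgb.Dataset(X, label=y), num_boost_round=100)
```

XGBoost accepts the same kind of per-feature specification via its own `monotone_constraints` parameter.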
Thank you for bringing this up, Yu! This would certainly be helpful for credit scoring as well as other areas.
Given that the approach taken by XGBoost and LightGBM is relatively simple (as far as I remember, they simply disallow splits that are not in line with the constraints), this can certainly be implemented. Note that this is also being discussed for scikit-learn, see e.g. scikit-learn/scikit-learn#6656, so we can likely reuse code / ideas from there.
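To make that idea concrete, here is a rough sketch of the split check described above; the function and variable names are hypothetical, not the actual XGBoost/LightGBM internals:

```python
# Rough sketch of the "reject constraint-violating splits" idea.
# Names are hypothetical and chosen for illustration only.
def split_is_allowed(constraint, left_leaf_value, right_leaf_value):
    """constraint: +1 (increasing), -1 (decreasing), 0 (none) for the split feature."""
    if constraint == 1:
        # Increasing constraint: the right child (larger feature values)
        # must not predict less than the left child.
        return left_leaf_value <= right_leaf_value
    if constraint == -1:
        return left_leaf_value >= right_leaf_value
    return True


def best_allowed_split(candidate_splits, constraint):
    """Pick the highest-gain split that respects the constraint.

    candidate_splits: iterable of (gain, left_leaf_value, right_leaf_value).
    Returns None if every candidate violates the constraint.
    """
    allowed = [s for s in candidate_splits
               if split_is_allowed(constraint, s[1], s[2])]
    return max(allowed, key=lambda s: s[0], default=None)
```

As far as I understand, the real implementations go a step further and also propagate min/max output bounds down to the child nodes, so that splits deeper in the tree cannot undo the constraint.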
I currently don't have time to work on this, but any contributions are welcome.