For some datasets (typically when modeling physical properties) one knows that a monotone constraint can be applied between a feature and the prediction, which can help reduce noise and ensure meaningful relative predictions.
In the point-prediction world this can be done with a model like HistGradientBoostingRegressor using the monotonic_cst setting (see link).
When modeling with a parameterized distribution (as ngboost does), one would probably only want to apply this constraint to some of the parameters of the distribution, e.g. to the loc of a Normal while leaving the scale unconstrained. How would one go about using base learners with different settings for different parameters?
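For reference, a minimal sketch of the point-prediction case in scikit-learn; the toy data and feature layout are made up for illustration:

```python
import numpy as np
from sklearn.ensemble import HistGradientBoostingRegressor

# Toy data: y increases with the first feature, the second feature is noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=500)

# monotonic_cst: one entry per feature; 1 = increasing, -1 = decreasing, 0 = unconstrained.
model = HistGradientBoostingRegressor(monotonic_cst=[1, 0])
model.fit(X, y)
```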
Oh, that's an interesting idea. Right now I don't think it can be done, but it wouldn't be very hard to modify the code to allow it. Tbh, it's something ChatGPT could probably tackle! Feel free to put in a PR.
@alejandroschuler just for my understanding then: the distribution parameters do not need to share a base learner - they just do right now because there was no use case for them to be different?
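For illustration only, a hedged sketch of what a per-parameter interface could look like if ngboost were extended as discussed above. A `Base` argument that accepts one learner per distribution parameter does not exist in ngboost today; the mapping below is hypothetical:

```python
# Hypothetical usage only -- ngboost's NGBRegressor currently takes a single
# `Base` learner that is cloned for every distribution parameter. The
# per-parameter mapping below is what a modified interface *might* look like.
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.tree import DecisionTreeRegressor

per_param_learners = {
    # Constrain only the learner that fits the gradient for `loc`.
    "loc": HistGradientBoostingRegressor(max_depth=3, monotonic_cst=[1, 0]),
    # Leave `scale` unconstrained.
    "scale": DecisionTreeRegressor(max_depth=3),
}

# ngb = NGBRegressor(Dist=Normal, Base=per_param_learners)  # not real ngboost API today
```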