We can use N-fold cross-validation to do hacky regularization. sklearn has a lot of nice tools for this; more or less, you can use a GridSearchCV object to do it. How it works is that you split the data into N sections, loop over them leaving one section out each time, fit a model on the rest, then predict for the held-out section. At the very end, you combine all of the out-of-sample predictions, as in the sketch below.
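Here is a minimal sketch of that loop using sklearn's `KFold`; the `Ridge` model and the synthetic data are stand-in assumptions, not anything from this repo. (`cross_val_predict` does the same thing in one call, but spelling it out shows the mechanics.)

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

# Stand-in data: 100 points, 5 features, known linear signal plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ rng.normal(size=5) + rng.normal(scale=0.1, size=100)

kf = KFold(n_splits=5, shuffle=True, random_state=0)
oos_pred = np.empty_like(y)
for train_idx, test_idx in kf.split(X):
    # Fit on N-1 folds, predict the held-out fold.
    model = Ridge(alpha=1.0).fit(X[train_idx], y[train_idx])
    oos_pred[test_idx] = model.predict(X[test_idx])

# oos_pred now holds an out-of-sample prediction for every point.
```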
With this technique, we can loop through a range of regularization amplitudes, run CV for each of them, and pick the one that gives the minimum chi2 (or whatever metric we like), as sketched after the next paragraph.
It fits a lot more models, but would do the trick.
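A sketch of that regularization scan with `GridSearchCV`; `Ridge`, the `alpha` grid, and the plain sum-of-squares chi2 (uniform errors assumed) are placeholders for whatever this project actually fits.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import make_scorer
from sklearn.model_selection import GridSearchCV

def neg_chi2(y_true, y_pred):
    # GridSearchCV maximizes the score, so return minus chi2.
    return -np.sum((y_true - y_pred) ** 2)

search = GridSearchCV(
    Ridge(),
    param_grid={"alpha": np.logspace(-3, 3, 13)},  # regularization amplitudes to scan
    scoring=make_scorer(neg_chi2),
    cv=5,  # N-fold CV at each grid point
)
search.fit(X, y)  # X, y as in the previous sketch
print(search.best_params_)  # the alpha with the minimum CV chi2
```

Note this refits the model N times per grid point (13 × 5 = 65 fits here), which is the "fits a lot more models" cost mentioned above.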