ENH: add GWGradientBoostingRegressor #57
Conversation
Codecov Report
❌ Patch coverage is …

Additional details and impacted files

@@            Coverage Diff             @@
##             main      #57      +/-   ##
==========================================
+ Coverage   86.51%   91.34%   +4.82%
==========================================
  Files           6        6
  Lines         786      797      +11
==========================================
+ Hits          680      728      +48
+ Misses        106       69      -37

☔ View full report in Codecov by Sentry.
You should fix the docstring and try to avoid that blatant local overfitting there :). Also, it needs to be added to the API reference.
yeah I was looking at exactly that right now… 😅
hey @martinfleis so I tested it out locally - it highly overfits on the local R² values. And also, for the example on the Random Forest Regressor - the values are also overfit and all of them give ~0.8, since it includes the focal observation by default. For now, I've added the actual values in this commit and am testing if CI works. I'll change it immediately if you say so.
We don't have to show local R² but can show pred_ instead. In any case, none of the ensemble models should include focal by default. I did set it to False for the ensemble classifiers but forgot in the RF regressor. Can you update that as well?
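For example, something along these lines (a rough sketch only; the constructor keywords, the fit(X, y, geometry) call, and the data/column names are assumptions based on the other ensemble models, not the finalized API):

```python
# Rough usage sketch, not the documented API: include_focal, fixed and the
# fit(X, y, geometry) signature are assumed from the discussion above.
import geopandas as gpd
from gwlearn.ensemble import GWGradientBoostingRegressor

gdf = gpd.read_file("elections.gpkg")   # hypothetical example layer
X = gdf[["income", "education"]]        # hypothetical predictors
y = gdf["turnout"]                      # hypothetical response

model = GWGradientBoostingRegressor(
    bandwidth=150,        # neighbours per local model (adaptive kernel)
    fixed=False,          # adaptive rather than fixed-distance bandwidth
    include_focal=False,  # keep the focal observation out of its own model
)
model.fit(X, y, gdf.geometry)

# report pred_ (out-of-focal predictions) rather than local R², which
# overfits when the focal point is part of its own training set
print(model.pred_.head())
```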
@martinfleis shall I make the pred_ changes inside the ensemble.ipynb file too and run it again?
No need to change anything apart from that sentence about OOB and "fails" I mentioned above. You can execute it again, yes, but it gets executed automatically when building the docs anyway.
@martinfleis yes, changed.
Added Gradient Boosting Regressor - implemented the algorithm and its tests, and added it to the ipynb notebook.
For issue: #48
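For context, the geographically weighted approach fits one GradientBoostingRegressor per focal location on its spatial neighbourhood, weighted by a distance kernel, and (per the discussion above) leaves the focal observation out of its own local model. A conceptual sketch of that loop, assuming an adaptive bisquare kernel - illustrative only, not gwlearn's actual implementation:

```python
# Conceptual sketch of the geographically weighted fitting loop. The function
# name, adaptive bisquare kernel and bandwidth default are illustrative
# assumptions, not gwlearn internals.
import numpy as np
from scipy.spatial import cKDTree
from sklearn.ensemble import GradientBoostingRegressor


def fit_local_gbr(X, y, coords, bandwidth=50):
    """Fit one GradientBoostingRegressor per focal location and return the
    leave-focal-out prediction for each observation (the kind of values the
    discussion above calls pred_)."""
    X, y, coords = np.asarray(X), np.asarray(y), np.asarray(coords)
    tree = cKDTree(coords)
    preds = np.empty(len(y))

    for i in range(len(y)):
        # distances and indices of the `bandwidth` nearest neighbours (+ focal)
        dist, idx = tree.query(coords[i], k=bandwidth + 1)
        # drop the focal observation itself (include_focal=False behaviour)
        keep = idx != i
        dist, idx = dist[keep], idx[keep]
        # bisquare kernel: weight decays with distance to the focal point
        w = (1 - (dist / dist.max()) ** 2) ** 2
        local = GradientBoostingRegressor(random_state=0)
        local.fit(X[idx], y[idx], sample_weight=w)
        preds[i] = local.predict(X[i][None, :])[0]

    return preds
```

The real class delegates all of this to gwlearn's shared base machinery; the sketch is only meant to convey why leaving the focal point out matters for the reported scores.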