
DOC Add quantile loss to user guide on HGBT regression #29063


Merged — 3 commits merged into scikit-learn:main on Jul 23, 2024

Conversation

ArturoAmorQ (Member):

Reference Issues/PRs

Follows #28652.

What does this implement/fix? Explain your changes.

As mentioned in #28652 (comment), the description of HGBT regression losses was missing the quantile loss. This PR fixes the issue.

Any other comments?

In the same comment it is mentioned that

[...] some overhaul and entanglement of the different GBTs would be very good. It's hard to speak about newton boosting (HGBT), when the original GBT is not yet introduced.

For a follow-up PR maybe we can consider refactoring the narrative to first introduce common mathematical aspects of GBT models and then the HGBT version. WDYT?

github-actions bot commented May 21, 2024:

✔️ Linting Passed

All linting checks passed. Your pull request is in excellent shape! ☀️

Generated for commit: 442a0a1.

@ogrisel (Member) left a comment:

Some feedback: I noticed that GradientBoostingRegressor has the "huber" loss, which HistGradientBoostingRegressor does not provide; conversely, GradientBoostingRegressor lacks "poisson" and "gamma".

The latter is somewhat related to #16668, but I don't think we have an issue to add the huber loss to HistGradientBoostingRegressor (although I am not sure how useful it would be in practice, because it does not really have any meaningful probabilistic interpretation as far as I know).

@glemaitre glemaitre self-requested a review July 9, 2024 09:03
@glemaitre (Member):

Since the different comments have been addressed, I'm going to merge. Thanks @ArturoAmorQ

@glemaitre glemaitre merged commit 843b842 into scikit-learn:main Jul 23, 2024
30 checks passed
@ArturoAmorQ ArturoAmorQ deleted the hgbt_qloss branch July 23, 2024 16:10
MarcBresson pushed a commit to MarcBresson/scikit-learn that referenced this pull request Sep 2, 2024
…29063)

Co-authored-by: ArturoAmorQ <[email protected]>
Co-authored-by: Guillaume Lemaitre <[email protected]>
Co-authored-by: Olivier Grisel <[email protected]>
glemaitre added a commit to glemaitre/scikit-learn that referenced this pull request Sep 9, 2024
…29063)

Co-authored-by: ArturoAmorQ <[email protected]>
Co-authored-by: Guillaume Lemaitre <[email protected]>
Co-authored-by: Olivier Grisel <[email protected]>
glemaitre added a commit that referenced this pull request Sep 11, 2024
Co-authored-by: ArturoAmorQ <[email protected]>
Co-authored-by: Guillaume Lemaitre <[email protected]>
Co-authored-by: Olivier Grisel <[email protected]>
3 participants