Commit 201cfde

Charlie-XIAO authored and glemaitre committed
DOC GradientBoosting* will not implement monotonic constraints, use HistGradientBoosting* instead (#27516)
1 parent 0c7073d commit 201cfde

1 file changed: 6 additions, 4 deletions

sklearn/ensemble/_gb.py

@@ -1124,8 +1124,9 @@ class GradientBoostingClassifier(ClassifierMixin, BaseGradientBoosting):
     classification is a special case where only a single regression tree is
     induced.
 
-    :class:`sklearn.ensemble.HistGradientBoostingClassifier` is a much faster
-    variant of this algorithm for intermediate datasets (`n_samples >= 10_000`).
+    :class:`~sklearn.ensemble.HistGradientBoostingClassifier` is a much faster variant
+    of this algorithm for intermediate and large datasets (`n_samples >= 10_000`) and
+    supports monotonic constraints.
 
     Read more in the :ref:`User Guide <gradient_boosting>`.
 
@@ -1727,8 +1728,9 @@ class GradientBoostingRegressor(RegressorMixin, BaseGradientBoosting):
     each stage a regression tree is fit on the negative gradient of the given
     loss function.
 
-    :class:`sklearn.ensemble.HistGradientBoostingRegressor` is a much faster
-    variant of this algorithm for intermediate datasets (`n_samples >= 10_000`).
+    :class:`~sklearn.ensemble.HistGradientBoostingRegressor` is a much faster variant
+    of this algorithm for intermediate and large datasets (`n_samples >= 10_000`) and
+    supports monotonic constraints.
 
     Read more in the :ref:`User Guide <gradient_boosting>`.
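For readers pointed from GradientBoosting* to the histogram-based estimators, here is a minimal sketch of the monotonic-constraint support mentioned in the new docstring text. The `monotonic_cst` keyword is HistGradientBoostingRegressor's actual parameter; the toy data and the particular constraint values are invented for illustration.

import numpy as np
from sklearn.ensemble import HistGradientBoostingRegressor

# Toy data: the target rises with feature 0 and falls with feature 1.
rng = np.random.RandomState(0)
X = rng.uniform(size=(1_000, 2))
y = 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(scale=0.1, size=1_000)

# 1 = monotonically increasing, -1 = decreasing, 0 = unconstrained.
# GradientBoostingRegressor has no such parameter and, per this commit, will not gain one.
model = HistGradientBoostingRegressor(monotonic_cst=[1, -1])
model.fit(X, y)

In recent scikit-learn releases `monotonic_cst` can also be passed as a dict keyed by feature name when `X` is a pandas DataFrame, and HistGradientBoostingClassifier accepts the same parameter for binary classification.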

0 commit comments

Comments
 (0)