From 714e6bdb18cfa61d0df6ee8bb3fa02d50a3b3d58 Mon Sep 17 00:00:00 2001
From: Christian Lorentzen
Date: Tue, 25 Oct 2022 16:46:03 +0200
Subject: [PATCH] DOC fix deprecated log loss argument in user guide

---
 doc/glossary.rst             | 2 +-
 doc/modules/linear_model.rst | 8 ++++----
 2 files changed, 5 insertions(+), 5 deletions(-)

diff --git a/doc/glossary.rst b/doc/glossary.rst
index 9461363cf836e..07f844619cc54 100644
--- a/doc/glossary.rst
+++ b/doc/glossary.rst
@@ -284,7 +284,7 @@ General Concepts
         >>> from sklearn.model_selection import GridSearchCV
         >>> from sklearn.linear_model import SGDClassifier
         >>> clf = GridSearchCV(SGDClassifier(),
-        ...                    param_grid={'loss': ['log', 'hinge']})
+        ...                    param_grid={'loss': ['log_loss', 'hinge']})
 
     This means that we can only check for duck-typed attributes after
     fitting, and that we must be careful to make :term:`meta-estimators`
diff --git a/doc/modules/linear_model.rst b/doc/modules/linear_model.rst
index 492f4ed78fac9..0177165b2bb07 100644
--- a/doc/modules/linear_model.rst
+++ b/doc/modules/linear_model.rst
@@ -126,9 +126,9 @@ its ``coef_`` member::
     >>> reg.intercept_
     0.13636...
 
-Note that the class :class:`Ridge` allows for the user to specify that the
-solver be automatically chosen by setting `solver="auto"`. When this option
-is specified, :class:`Ridge` will choose between the `"lbfgs"`, `"cholesky"`,
+Note that the class :class:`Ridge` allows for the user to specify that the
+solver be automatically chosen by setting `solver="auto"`. When this option
+is specified, :class:`Ridge` will choose between the `"lbfgs"`, `"cholesky"`,
 and `"sparse_cg"` solvers. :class:`Ridge` will begin checking the conditions
 shown in the following table from top to bottom. If the condition is true,
 the corresponding solver is chosen.
@@ -1020,7 +1020,7 @@ The following table summarizes the penalties supported by each solver:
 The "lbfgs" solver is used by default for its robustness.
 For large datasets the "saga" solver is usually faster.
 For large dataset, you may also consider using :class:`SGDClassifier`
-with 'log' loss, which might be even faster but requires more tuning.
+with `loss="log_loss"`, which might be even faster but requires more tuning.
 
 .. topic:: Examples:
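
For reference, a minimal sketch of the renamed argument as used with
:class:`SGDClassifier` (assuming scikit-learn >= 1.1, where
``loss="log_loss"`` replaces the deprecated ``"log"``; the dataset here is
purely illustrative)::

    >>> from sklearn.datasets import make_classification
    >>> from sklearn.linear_model import SGDClassifier
    >>> X, y = make_classification(n_samples=200, random_state=0)
    >>> # "log_loss" fits a logistic regression model via SGD
    >>> clf = SGDClassifier(loss="log_loss", random_state=0).fit(X, y)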