[MRG + 2] Add convergence warning to linear models #10881
Conversation
Force-pushed 3b2bae5 to cfee9c1 (compare)
```python
            n_iter=2, iid=False)]
estimators = [
    GridSearchCV(
        LinearSVC(random_state=0, max_iter=100000),
```
I would rather leave max_iter at its default. Can you try using @ignore_warnings(category=ConvergenceWarning) on test_return_train_score_warn, or something like assert_no_warnings(ignore_warnings(estimator.fit, category=ConvergenceWarning)(...))?
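The suggestion above is to silence only the convergence warnings inside the test rather than avoid them by raising max_iter. A minimal stdlib-only sketch of that idea, using a stand-in `ConvergenceWarning` class and a hypothetical `fit_that_warns` helper in place of a real estimator (scikit-learn's `ignore_warnings` decorator wraps the same `warnings`-filter machinery):

```python
import warnings

class ConvergenceWarning(UserWarning):
    """Stand-in for sklearn.exceptions.ConvergenceWarning (illustration only)."""

def fit_that_warns():
    # Emulates an estimator whose solver hits the iteration cap and warns.
    warnings.warn("Solver did not converge.", ConvergenceWarning)
    return "fitted"

# Filter out only ConvergenceWarning, as ignore_warnings(category=...) would;
# any other warning category would still be recorded.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    warnings.filterwarnings("ignore", category=ConvergenceWarning)
    result = fit_that_warns()

print(result, len(caught))
```

With the category filter in place, the fit completes and no warning is recorded, which is exactly what the test needs to assert its own warnings undisturbed.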
Thanks for the review. It's done.
```diff
     # srand supports
     n_iter_ = max(n_iter_)
-    if n_iter_ >= max_iter and verbose > 0:
+    if n_iter_ >= max_iter:
```
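The hunk above is the core of the change: the warning is no longer gated on the verbosity level. A hedged sketch of the resulting behavior, with a stand-in `ConvergenceWarning` and a hypothetical `check_convergence` helper (not the actual scikit-learn code path):

```python
import warnings

class ConvergenceWarning(UserWarning):
    """Stand-in for sklearn.exceptions.ConvergenceWarning (illustration only)."""

def check_convergence(n_iter_, max_iter, verbose=0):
    # After this PR: warn whenever the iteration cap is hit,
    # regardless of `verbose` (previously required verbose > 0).
    if n_iter_ >= max_iter:
        warnings.warn("Solver failed to converge; consider increasing "
                      "max_iter.", ConvergenceWarning)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    check_convergence(n_iter_=100, max_iter=100, verbose=0)

print(len(caught), caught[0].category.__name__)
```

Even with `verbose=0`, one `ConvergenceWarning` is emitted, which is the user-visible effect the PR is after.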
It would be great if you could estimate how much code is affected by this change. The best thing to do would be to add tests checking that the relevant estimators raise a ConvergenceWarning.
jnothman left a comment:
Yes, please add a similar test for LinearSVC
Force-pushed be9a6b3 to 23ecd24 (compare)
I added a test for LinearSVC and LinearSVR. They both use …
sklearn/svm/tests/test_svm.py (outdated)

```diff
 def test_timeout():
-    a = svm.SVC(kernel=lambda x, y: np.dot(x, y.T), probability=True,
+    svc = svm.SVC(kernel=lambda x, y: np.dot(x, y.T), probability=True,
```
In general try to avoid changing something which is unrelated to the PR, it adds noise and makes reviewing less pleasant.
sklearn/svm/tests/test_svm.py (outdated)

```python
def test_convergence_warning_linear_svm():
    # Test if a convergence warning is raised when verbose = 0.
```
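The test added in the PR checks that the warning fires under silent verbosity. The same shape of test can be sketched with stdlib stand-ins (a toy `ToySolver` class and a stand-in `ConvergenceWarning`, neither of which is actual scikit-learn code):

```python
import warnings

class ConvergenceWarning(UserWarning):
    """Stand-in for sklearn.exceptions.ConvergenceWarning (illustration only)."""

class ToySolver:
    """Toy estimator that stops at max_iter without converging."""
    def __init__(self, max_iter=5):
        self.max_iter = max_iter

    def fit(self):
        self.n_iter_ = self.max_iter  # pretend the iteration cap was reached
        if self.n_iter_ >= self.max_iter:
            warnings.warn("did not converge", ConvergenceWarning)
        return self

def test_convergence_warning():
    # The warning must be raised even with default (silent) verbosity.
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        ToySolver(max_iter=1).fit()
    assert any(issubclass(w.category, ConvergenceWarning) for w in caught)

test_convergence_warning()
print("ok")
```

In the real test, `ToySolver` would be `LinearSVC`/`LinearSVR` fit on a small dataset with a deliberately low `max_iter`, and the assertion would use scikit-learn's warning-testing helpers instead of `catch_warnings`.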
Wow, this is a very thorough test! I think it may be slightly overkill, though; I would be happy with just checking for the ConvergenceWarning with the default values of LinearSVC and LinearSVR.
```diff
-estimators = [GridSearchCV(LinearSVC(random_state=0), grid, iid=False),
-              RandomizedSearchCV(LinearSVC(random_state=0), grid,
-                                 n_iter=2, iid=False)]
+estimators = [
```
As mentioned somewhere else, try to avoid changing unrelated things, especially if it is only cosmetic. Can you revert this change?
Force-pushed 23ecd24 to 5d10ba5 (compare)
For the lbfgs and liblinear solvers, the convergence warnings appeared only when verbose was greater than 0, whereas with other solvers they appeared even with verbose = 0. Create tests to check the convergence warning in logistic regression and in linear SVM. Update `test_search` to ignore this convergence warning. Fixes scikit-learn#10866
Force-pushed 5d10ba5 to dcec4ba (compare)
I rebased the different commits to have one clean commit which does not touch unrelated lines.
No worries, thanks for reverting the unwanted changes. For next time: conventions vary between projects, but in scikit-learn we prefer that you don't squash your commits, because squashing makes it harder to see what has changed since the last time we looked at the PR. Also, we use squash-and-merge, so PRs always end up as a single commit anyway.
I pushed a minor tweak which I think makes the statement more readable. LGTM (as long as the CIs come back green).
@lesteve Thanks for your help. I'll try to remember everything for the next contribution :)
Force-pushed 2962148 to a59ed43 (compare)
No worries, that's part of the learning process :-). If you have not already …
jnothman left a comment:
LGTM
Please add an entry to the change log at …
…thub.com/AlexandreSev/scikit-learn into add_convergence_warning_to_linear_models
jnothman left a comment:
Thanks
Reference Issues/PRs
Fixes #10866
What does this implement/fix? Explain your changes.
It makes convergence warnings appear when verbose = 0 in Logistic Regression.
It also updates a test to check that a warning is raised when it is needed.
Any other comments?
I also had to change test_return_train_score_warn in sklearn/model_selection/tests/test_search.py, since the new warnings caused one of its assertions to fail. I am not sure I perfectly understood this test, but I do not think it was meant to test convergence warnings.