[MRG+1] Raise warning in scikit-learn/sklearn/linear_model/cd_fast.pyx for cases when the main loop exits without reaching the desired tolerance #11754
Changes from all commits: 5d273ad, 711733b, 9f9b6ce, ce020f1, 95c6726, 9718eb5, 4cfe7c5, dc7e3ee
@@ -828,3 +828,20 @@ def test_warm_start_multitask_lasso():
     clf2 = MultiTaskLasso(alpha=0.1, max_iter=10)
     ignore_warnings(clf2.fit)(X, Y)
     assert_array_almost_equal(clf2.coef_, clf.coef_)
+
+
+@pytest.mark.parametrize('klass, n_classes, kwargs',
+                         [(Lasso, 1, dict(precompute=True)),
+                          (Lasso, 1, dict(precompute=False)),
+                          (MultiTaskLasso, 2, dict()),
+                          (MultiTaskLasso, 2, dict())])
+def test_enet_coordinate_descent(klass, n_classes, kwargs):
+    """Test that a warning is issued if model does not converge"""
+    clf = klass(max_iter=2, **kwargs)
+    n_samples = 5
+    n_features = 2
+    X = np.ones((n_samples, n_features)) * 1e50
+    y = np.ones((n_samples, n_classes))
+    if klass == Lasso:
+        y = y.ravel()
+    assert_warns(ConvergenceWarning, clf.fit, X, y)
Reviewer comment: To test this, use tiny data and set max_iter to a very tiny number; it will make testing faster. Besides, this is already done in the estimator, e.g.: why is it not enough for you? Bug?

Author reply: For the cases identified in #10813, the estimator raises no warning. Essentially, numerical issues cause both the duality gap and the tolerance to be equal to zero, so the warning won't be raised in the estimator.
Reviewer comment: After this warning message is printed, the for-loop continues if max_iter has not been reached, right? And if max_iter is reached before the condition on line 767 occurs, then it won't converge but will never warn?
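To make the discussion concrete, here is a minimal pure-NumPy sketch of a coordinate-descent loop that warns when it exits without reaching the requested tolerance. All names here are hypothetical illustrations; the real implementation lives in sklearn/linear_model/cd_fast.pyx and uses the duality gap as its final stopping criterion, whereas this toy version stops on the relative coefficient change only:

```python
import warnings

import numpy as np


class ConvergenceWarning(UserWarning):
    """Stand-in for sklearn.exceptions.ConvergenceWarning."""


def lasso_cd(X, y, alpha, max_iter=100, tol=1e-4):
    """Toy Lasso coordinate descent (hypothetical helper).

    Warns with ConvergenceWarning when the main loop exits after
    max_iter sweeps without the relative coefficient change
    dropping below tol -- the behaviour this PR adds to cd_fast.pyx.
    """
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    col_sq = (X ** 2).sum(axis=0)  # per-feature squared column norms
    converged = False
    for _ in range(max_iter):
        w_max = 0.0    # largest |coefficient| seen this sweep
        d_w_max = 0.0  # largest coefficient update this sweep
        for j in range(n_features):
            if col_sq[j] == 0.0:
                continue
            w_old = w[j]
            # correlation of feature j with the partial residual
            # (residual with feature j's own contribution added back)
            rho = X[:, j] @ (y - X @ w + X[:, j] * w[j])
            # soft-thresholding update for the L1 penalty
            w[j] = np.sign(rho) * max(abs(rho) - alpha * n_samples, 0.0) / col_sq[j]
            d_w_max = max(d_w_max, abs(w[j] - w_old))
            w_max = max(w_max, abs(w[j]))
        if w_max == 0.0 or d_w_max / w_max < tol:
            converged = True
            break
    if not converged:
        # the warning fires inside the solver, not in the estimator,
        # so it is raised even when estimator-level checks are bypassed
        warnings.warn("Coordinate descent did not converge; consider "
                      "increasing max_iter or tol.", ConvergenceWarning)
    return w
```

With max_iter=1 the first sweep's updates are as large as the coefficients themselves, so the relative change is about 1 and the warning fires; with a generous max_iter the loop breaks early and stays silent, mirroring the behaviour the test above asserts via assert_warns.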