Fix labelprop underflow errors. #12478
Conversation
NumPy's VisibleDeprecationWarning is already being ignored, and NumPy seems to be moving to FutureWarning, as recommended in PEP 565.
Also add a test.
It looks like the build failures are not strictly related to my changes.
What warnings are you silencing?
warnings.simplefilter("always")
# Trigger a warning.
result = func(*args, **kw)
w = [e for e in w
No, please don't change this. Rather use pytest.warns, which gives some precision not facilitated here.
Okay, I'll try to rewrite the logic using pytest.warns. Thanks!
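For reference, a minimal sketch of how pytest.warns pins down a specific warning class and message; the warning and message below are made up for illustration and are not the actual test from this PR:

```python
import warnings

import pytest


def test_warns_example():
    # Hypothetical example: pytest.warns asserts that the block emits a
    # warning of a specific class, optionally matching the message, which
    # is the "precision" referred to above.
    with pytest.warns(FutureWarning, match="underflow"):
        warnings.warn("underflow encountered in multiply", FutureWarning)
```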
Ah, I see that you meant which
This fixes the underflow errors in label propagation.
Fixes #9313.
Supersedes #9384.
Instead of depending on the MNIST dataset in the original test, I've created a similar (random) dataset which also produced the same warnings.
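Purely as an illustration, such a test might look roughly like the sketch below; the dataset shape, random seed, and estimator parameters are assumptions rather than the values used in this PR:

```python
import numpy as np

from sklearn.semi_supervised import LabelPropagation
from sklearn.utils.testing import assert_no_warnings


def test_label_propagation_no_warnings():
    # Small random semi-supervised problem: most samples are unlabeled (-1),
    # which is the setting where the label normalization previously underflowed.
    rng = np.random.RandomState(0)
    X = rng.rand(100, 10)
    y = -np.ones(100, dtype=int)
    y[:10] = rng.randint(0, 2, size=10)

    model = LabelPropagation(kernel="rbf", gamma=20)
    assert_no_warnings(model.fit, X, y)
```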
Sidenote: I've had to disable FutureWarning in addition to np.VisibleDeprecationWarning in assert_no_warnings in sklearn.utils.testing. Perhaps this is overreaching a bit, so I could implement a new testing routine which only looks for the absence of certain warnings in the output (as a separate PR?).
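A rough sketch of what such a routine could look like; the name and signature are hypothetical, not an existing scikit-learn helper:

```python
import warnings


def assert_no_warnings_of_type(warning_classes, func, *args, **kw):
    # Hypothetical helper: fail only if `func` emits a warning belonging to
    # one of `warning_classes`; any other warnings are ignored.
    with warnings.catch_warnings(record=True) as record:
        warnings.simplefilter("always")
        result = func(*args, **kw)
    bad = [w for w in record
           if issubclass(w.category, tuple(warning_classes))]
    if bad:
        raise AssertionError("Unexpected warning(s): %s"
                             % ", ".join(str(w.message) for w in bad))
    return result
```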