[MRG] Fix warm_start behavior in multilayer perceptron #12605
Conversation
jnothman left a comment
I did my arithmetic wrong. As per the contributor docs, we need to support the old form for two releases, so it should break in 0.23, not 0.22.
This is looking pretty good!
I fixed those items. BTW, can I get confirmation that this is correct, since it will be deprecated as of the 0.21 release?
Yes, I think that notice is right.
jnothman left a comment
Please also add an entry to the change log at `doc/whats_new/v0.21.rst`. Like the other entries there, please reference this pull request with `:issue:` and credit yourself (and other contributors, if applicable) with `:user:`.
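For reference, entries in `doc/whats_new/v0.21.rst` follow roughly this shape (a sketch only: the section placement, the `|API|` badge, and the wording here are assumptions, not the actual entry):

```rst
:mod:`sklearn.neural_network`
.............................

- |API| The behavior of ``warm_start`` in multilayer perceptrons is
  deprecated in favor of the new ``warm_start='full'`` option.
  :issue:`12605` by :user:`samwaterbury`.
```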
Alright, those changes should be in now. I listed it as an API change in whats_new.
eamanu left a comment
LGTM
The merge conflicts should be fixed now; I might need to @ somebody again to get a second reviewer. Edit: taking the liberty of pinging @rth since you reviewed my other PR related to MLP.
@jnothman Any idea on getting a second reviewer for this?
Thanks for this PR! So to avoid warnings, the user would need to switch from `warm_start=True` to `warm_start='full'`? Can't we just consider it a bug fix and change the behavior of `warm_start=True` directly? Not asking to make these changes in this PR; let's get a second opinion first. @samwaterbury If you could resolve the merge conflicts that would be useful. Thanks!
Ok! Merge conflicts should be fixed; that force push was just a rebase. I can go either way: this is a breaking change, but it's also definitely a bug fix. Just let me know.
Addresses #12505 (continued)
This PR addresses the unexpected behavior of `warm_start` in the multilayer perceptron. To recap, the current behavior is that `warm_start` breaks after a single iteration. I took the first steps of changing this by:

- adding `warm_start='full'` to represent the future behavior
- warning users of `warm_start=True` that it will be changed in 0.22

There are two FIXMEs in the file that indicate the very simple changes required to make the transition in 0.22 (just remove the warning and remove an if-statement condition). I tested making those 0.22 changes and it works as intended. Lastly, I added two tests: one for the FutureWarning and one for the new `warm_start='full'` option.

Please also let me know if I got the FutureWarning/deprecation stuff right.
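To illustrate the mechanism, here is a minimal sketch of the warning path described above (the helper name and message are hypothetical, not the actual PR diff):

```python
import warnings

def _validate_warm_start(warm_start):
    # Hypothetical helper: warn users of the legacy value that its
    # behavior will change, while accepting the new 'full' option.
    if warm_start is True:
        # FIXME (0.22): remove this warning once the transition is made.
        warnings.warn(
            "The behavior of warm_start=True will change in version 0.22; "
            "set warm_start='full' to opt in to the new behavior now.",
            FutureWarning,
        )
    return warm_start
```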
@jnothman tagging you again
Edit: the failing test fails because the FutureWarning I added is getting raised.
Edit 2: fixed the tests.
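For context, the new warning test is along these lines (a hedged sketch: the test name, data, and the exact call that triggers the warning are assumptions, and it only passes with this PR's changes applied):

```python
import pytest
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=50, random_state=0)

def test_warm_start_true_raises_future_warning():
    # With this PR applied, refitting with warm_start=True should emit
    # the FutureWarning announcing the 0.22 behavior change.
    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=20, warm_start=True)
    clf.fit(X, y)
    with pytest.warns(FutureWarning):
        clf.fit(X, y)
```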