ENH Adding variable force_alpha to classes in naive_bayes.py
#22269
Conversation
Commits:
- Merging changes from the main repository
- Update branch
- Resolving conflicts
- …ka204/scikit-learn into alpha-close-or-equal-0-update
- Alpha close or equal 0 update
- Update master
- …a-is-close-or-equal-0' into master-copy
- Update branch
- …ernoulliNB-and-MultinomialNB-when-alpha-is-close-or-equal-0
  # Conflicts: doc/whats_new/v0.24.rst, sklearn/naive_bayes.py
- # Conflicts: doc/whats_new/v1.0.rst
- Co-authored-by: Julien Jerphanion <[email protected]>
- # Conflicts: sklearn/naive_bayes.py
thomasjpfan left a comment
For the `force_alpha=True` everywhere in tests, I prefer `pytestmark = pytest.mark.filterwarnings(...)` in this case.
In many cases `force_alpha=True` does not change anything because the default alpha is 1.0.
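As a sketch of that suggestion (assuming the changed default emits a FutureWarning whose message mentions `force_alpha`; the test itself is hypothetical), a module-level `pytestmark` silences the warning for the whole file instead of passing `force_alpha=True` to every estimator:

```python
# Hypothetical test module sketch: a module-level pytestmark silences the
# force_alpha deprecation warning for every test in this file. Assumes the
# new default emits a FutureWarning whose message mentions "force_alpha".
import numpy as np
import pytest

from sklearn.naive_bayes import MultinomialNB

pytestmark = pytest.mark.filterwarnings("ignore:.*force_alpha.*:FutureWarning")


def test_multinomial_nb_predict_shape():
    # force_alpha is deliberately left at its default; the filter above keeps
    # the deprecation warning out of the test output.
    X = np.array([[2, 1], [1, 3], [1, 4]])
    y = np.array([0, 1, 1])
    assert MultinomialNB().fit(X, y).predict(X).shape == (3,)
```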
This reverts commit 14a360f.
Co-authored-by: Thomas J. Fan <[email protected]>
jjerphan left a comment
LGTM. Thank you, @Micky774.
Co-authored-by: Julien Jerphanion <[email protected]>
thomasjpfan left a comment
Minor nits, otherwise LGTM
Co-authored-by: Thomas J. Fan <[email protected]>
The naive Bayes classifiers now have a `force_alpha` attribute, which is explained in scikit-learn/scikit-learn#22269. This PR just gets the code working; it would be good to invest some time into reading the details of that sklearn PR and making the appropriate adjustments.
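For orientation, a minimal usage sketch of the new parameter, assuming a scikit-learn release that includes this PR; the data and values are illustrative only:

```python
# Minimal usage sketch, assuming a scikit-learn release that includes the
# force_alpha parameter on the naive Bayes estimators.
import numpy as np
from sklearn.naive_bayes import MultinomialNB

X = np.array([[3, 1, 1], [1, 2, 1], [1, 1, 4]])
y = np.array([0, 1, 1])

# force_alpha=True keeps alpha exactly as given (here: no smoothing at all)
# instead of rounding it up to the internal minimum.
clf = MultinomialNB(alpha=0.0, force_alpha=True).fit(X, y)
print(clf.predict(X))
```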
Reference Issues/PRs
Fixes #10772
Resolves #10775 (stalled)
Resolves #16747 (stalled)
Resolves #18805 (stalled)
What does this implement/fix? Explain your changes.
This PR takes over stalled PR #18805.
From the description of #16747 and #18805: "This PR adds a new variable `alphaCorrection` in classes in `naive_bayes.py`, which is set to `True` by default; if set to `False`, then for `alpha=0` (or greater, but still smaller than `_ALPHA_MIN`) alpha is not rounded up to `_ALPHA_MIN`."
This PR also updates the version details in the documentation and starts a double deprecation cycle: it adds a `force_alpha=False` keyword and begins a deprecation cycle to change its default to `True`. After the default change is complete, a second deprecation cycle will begin to remove `force_alpha`.
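To make the change concrete, a short sketch of the two behaviours, assuming `_ALPHA_MIN` is `1e-10` as defined in `sklearn/naive_bayes.py`:

```python
# Sketch of the clipping behaviour described above; assumes _ALPHA_MIN is
# 1e-10 as defined in sklearn/naive_bayes.py.
import warnings

import numpy as np
from sklearn.naive_bayes import BernoulliNB

X = np.array([[1, 0], [1, 1], [0, 1], [0, 0]])
y = np.array([0, 0, 1, 1])

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    # force_alpha=False: an alpha below _ALPHA_MIN is rounded up to
    # _ALPHA_MIN, and a warning is emitted about the adjustment.
    BernoulliNB(alpha=1e-12, force_alpha=False).fit(X, y)
print([str(w.message) for w in caught])

# force_alpha=True: the tiny alpha is used exactly as given, no rounding.
BernoulliNB(alpha=1e-12, force_alpha=True).fit(X, y)
```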
Any other comments?
A follow-up PR will begin the deprecation cycle to remove `force_alpha`.