
[MRG] More deprecations for 0.23 #15860


Merged: 34 commits merged into scikit-learn:master on Jan 13, 2020

Conversation

@NicolasHug (Member) commented Dec 11, 2019

Follow-up to #15804 (needs to be merged first)

This PR:

  • removes multioutput.score since it's now consistent with the default
  • removes support for a deprecated parameter combination in least_angle
  • removes support for the deprecated l1 / l2 loss values in SVM (see the sketch after this list)
  • removes linear_assignment.py
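
A minimal sketch of what two of these removals mean for user code, assuming the replacements that the deprecation warnings pointed to (the canonical LinearSVC loss names and SciPy's linear_sum_assignment); this is illustrative only, not the diff itself:

```python
# Sketch only: illustrates the assumed replacements for the removed behaviour.
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.svm import LinearSVC

# SVM loss aliases: loss="l1"/"l2" previously mapped to the canonical names
# below and emitted a deprecation warning; after this PR only the canonical
# names are accepted.
clf = LinearSVC(loss="squared_hinge")  # formerly loss="l2"
# clf = LinearSVC(loss="hinge")        # formerly loss="l1"

# linear_assignment.py: the vendored solver is removed; SciPy's
# linear_sum_assignment solves the same problem, but returns a
# (row_ind, col_ind) pair instead of an array of index pairs.
cost = np.array([[4, 1, 3],
                 [2, 0, 5],
                 [3, 2, 2]])
row_ind, col_ind = linear_sum_assignment(cost)
print(cost[row_ind, col_ind].sum())  # minimum total assignment cost
```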

According to `git grep 0.23 -- sklearn/ ':!*.csv'`, what is left is:

@adrinjalali, would you mind taking care of these since you made the original PRs? They're a bit cryptic to me.

@NicolasHug changed the title from [WIP] More deprecations for 0.23 to [MRG] More deprecations for 0.23 on Dec 13, 2019
@adrinjalali (Member) commented

The codecov issues seem legit.

@NicolasHug (Member, Author) commented

Fixed codecov

ping @adrinjalali @glemaitre @thomasjpfan for a review :)

@adrinjalali merged commit 4953ec3 into scikit-learn:master on Jan 13, 2020
@adrinjalali (Member) commented

Thanks @NicolasHug, and sure, I'll take care of those two PRs.

@NicolasHug (Member, Author) commented

Awesome, thanks!

thomasjpfan pushed a commit to thomasjpfan/scikit-learn that referenced this pull request on Feb 22, 2020
* removed warn_on_dtype

* removed parameters to check_is_fitted

* all_estimators parameters

* deprecated n_components attribute in AgglomerativeClustering

* change default of base.score for multioutput

* removed lots of useless decorators?

* changed default of copy in quantile_transform

* removed six.py

* nmf default value of init param

* raise error instead of warning in LinearDiscriminantAnalysis

* removed label param in hamming_loss

* updated method parameter of power_transform

* pep8

* changed default value of min_impurity_split

* removed assert_false and assert_true

* added and fixed versionchanged directives

* reset min_impurity_split default to None

* fixed LDA issue

* fixed some tests

* more docstrings updates

* set min_impurity_decrease for test to pass

* update docstring example

* fixed doctest

* removed multioutput.score since it's now consistent with the default

* deprecate least_angle parameter combination

* remove support for l1 or l2 loss in svm

* removed linear_assignment.py

* add test
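
For the "change default of base.score for multioutput" and "removed multioutput.score" commits above, here is a minimal sketch of the resulting behaviour, assuming the new default averages R^2 uniformly across outputs, as the "consistent with the default" wording in the description suggests:

```python
# Sketch only: checks that a multi-output regressor's score now matches the
# plain uniform-average R^2, which is what made the score override redundant.
# Assumption: RegressorMixin.score uses multioutput="uniform_average".
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.RandomState(0)
X = rng.randn(60, 4)
Y = rng.randn(60, 2) + X[:, :2]  # two loosely correlated targets

est = MultiOutputRegressor(LinearRegression()).fit(X, Y)
print(est.score(X, Y))
print(r2_score(Y, est.predict(X), multioutput="uniform_average"))  # should match
```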
panpiort8 pushed a commit to panpiort8/scikit-learn that referenced this pull request on Mar 3, 2020
3 participants