Merged
2 changes: 1 addition & 1 deletion sklearn/tests/test_calibration.py
@@ -613,7 +613,7 @@ def test_calibration_votingclassifier():
     # defined via a property that only works when voting="soft".
     X, y = make_classification(n_samples=10, n_features=5, n_classes=2, random_state=7)
     vote = VotingClassifier(
-        estimators=[("dummy" + str(i), DummyClassifier()) for i in range(3)],
+        estimators=[("lr" + str(i), LogisticRegression()) for i in range(3)],
         voting="soft",
     )
     vote.fit(X, y)
2 changes: 2 additions & 0 deletions sklearn/tests/test_dummy.py
@@ -726,6 +726,8 @@ def test_dtype_of_classifier_probas(strategy):
     assert probas.dtype == np.float64


+# TODO: remove in 1.2
+@pytest.mark.filterwarnings("ignore:`n_features_in_` is deprecated")
 @pytest.mark.parametrize("Dummy", (DummyRegressor, DummyClassifier))
 def test_n_features_in_(Dummy):
     X = [[1, 2]]
1 change: 1 addition & 0 deletions sklearn/tests/test_multioutput.py
@@ -615,6 +615,7 @@ def fit(self, X, y, sample_weight=None, **fit_params):
         return super().fit(X, y, sample_weight)


+@pytest.mark.filterwarnings("ignore:`n_features_in_` is deprecated")
glemaitre (Member, Author) commented on Sep 7, 2021:
This one worries me. It shows that a user will see the deprecation warning for some meta-estimators that wrap a dummy estimator: for meta-estimators that set n_features_in_ based on the inner estimator, we check n_features_in_ on the inner estimator, and that check raises the warning when the inner estimator is a dummy.

It might be less of a big deal because it is only a dummy estimator, and probably not many people use one inside a meta-estimator.
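To make the mechanism concrete, here is a minimal standalone sketch (not scikit-learn code; `Inner` and `Meta` are hypothetical stand-ins) showing why even a simple attribute check on the meta-estimator surfaces the inner estimator's deprecation warning: a deprecated attribute implemented as a property emits its warning as soon as it is accessed, including through `hasattr()`.

```python
import warnings


class Inner:
    """Stand-in for a dummy estimator with a deprecated attribute."""

    @property
    def n_features_in_(self):
        # Accessing the property emits the deprecation warning.
        warnings.warn("`n_features_in_` is deprecated", FutureWarning)
        return 2


class Meta:
    """Stand-in for a meta-estimator delegating n_features_in_."""

    def __init__(self, estimator):
        self.estimator = estimator

    @property
    def n_features_in_(self):
        # Delegation re-triggers the inner estimator's warning.
        return self.estimator.n_features_in_


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    hasattr(Meta(Inner()), "n_features_in_")  # warning fires here

print(len(caught))  # 1 FutureWarning recorded
```

This is exactly the path the `filterwarnings` mark above silences in the test suite.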

thomasjpfan (Member) commented on Sep 7, 2021:
If it gets annoying for users, we can define our own hasattr that ignores FutureWarnings emitted by scikit-learn.

Edit: it is most likely better to use a @property together with @available_if (if @available_if works with properties).
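A minimal sketch of the "own hasattr" idea floated above, under the assumption that suppressing FutureWarnings during the lookup is acceptable. `_safe_hasattr` is a hypothetical helper for illustration, not an existing scikit-learn API:

```python
import warnings


def _safe_hasattr(obj, name):
    """hasattr() that silences FutureWarnings emitted during the lookup."""
    with warnings.catch_warnings():
        warnings.simplefilter("ignore", FutureWarning)
        return hasattr(obj, name)


class Deprecated:
    """Object whose attribute access emits a deprecation warning."""

    @property
    def n_features_in_(self):
        warnings.warn("`n_features_in_` is deprecated", FutureWarning)
        return 2


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    ok = _safe_hasattr(Deprecated(), "n_features_in_")

print(ok, len(caught))  # True 0 -- attribute found, no warning escapes
```

The trade-off is that such a helper also hides warnings a user might legitimately want to see, which is why the property-plus-available_if route mentioned in the edit may be the cleaner fix.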

 @pytest.mark.parametrize(
     "estimator, dataset",
     [