
y_min y_max resolved #10981


Closed
wants to merge 33 commits into from

Conversation

aishgrt1
Contributor

Reference Issues/PRs

Fix #10903

What does this implement/fix? Explain your changes.

y_min set to 0, y_max set to 1.

Any other comments?

@agramfort
Member

you need to add a test

@aishgrt1
Contributor Author

Is there any place I can find the documentation for reference?

@agramfort
Member

see file test_isotonic.py

@jnothman (Member) left a comment

I don't get why we're clipping all isotonic regression output to [0, 1]. I think this clipping should happen in calibration.py to solve #10903.

@aishgrt1
Contributor Author

aishgrt1 commented Apr 16, 2018

I am working on all test cases, will update soon

@agramfort
Member

I agree with @jnothman: isotonic is not just for calibrating probabilities.

@aishgrt1
Contributor Author

The update has been made in calibration.py, not isotonic.py, @agramfort.

@@ -390,6 +390,8 @@ def predict_proba(self, X):

# Deal with cases where the predicted probability minimally exceeds 1.0
proba[(1.0 < proba) & (proba <= 1.0 + 1e-5)] = 1.0
proba[proba > 1.0] = 1.0

Why are you not using np.clip?


And isn't the previous line now redundant?
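For reference, a minimal sketch of what the np.clip suggestion could look like; the example array below is purely illustrative and not taken from this PR:

import numpy as np

# Example probabilities as the isotonic calibrator might return them,
# including values marginally outside [0, 1].
proba = np.array([[1.0 + 1e-6, -1e-6],
                  [0.3, 0.7]])

# A single clip bounds every value to [0, 1], which would also make the
# separate 1e-5 epsilon check above redundant.
proba = np.clip(proba, 0.0, 1.0)
print(proba)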

@aishgrt1 commented Apr 16, 2018

The Python test seems to be failing with maximum calls exceeded.

@jnothman
Member

Please add an entry to the change log at doc/whats_new/v0.20.rst. Like the other entries there, please reference this pull request with :issue: and credit yourself (and other contributors if applicable) with :user:
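For illustration only, such an entry might look roughly like the sketch below; the wording is an assumption, not the text that was actually added to doc/whats_new/v0.20.rst:

- Fixed :class:`calibration.CalibratedClassifierCV` so that ``predict_proba``
  with ``method='isotonic'`` no longer returns values outside ``[0, 1]``.
  :issue:`10981` by :user:`aishgrt1 <aishgrt1>`.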

@@ -141,6 +141,15 @@ Classifiers and regressors
- :class:`dummy.DummyClassifier` and :class:`dummy.DummyRegressor` now
only require X to be an object with finite length or shape.
:issue:`9832` by :user:`Vrishank Bhardwaj <vrishank97>`.

- :class:`neighbors.RadiusNeighborsRegressor` and

The first seems like a rebase issue.

# Deal with cases where the predicted probability minimally exceeds 1.0
proba[(1.0 < proba) & (proba <= 1.0 + 1e-5)] = 1.0
proba[proba > 1.0] = 1.0
proba[proba < 0.0] = 0.0

I'm not sure why that line was checking for the eps here. Can you check who wrote that and ping them?

@@ -239,6 +239,20 @@ def test_sigmoid_calibration():
np.vstack((exF, exF)), exY)


def test_isotonic_calibration():

you should provide a name relating to what it's testing, i.e. infinite probabilities. Are you sure that case actually happens with the test case you provided?
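As a rough sketch of the kind of test being asked for (the test name, dataset, and estimator below are assumptions rather than the code from this PR), one could check that the isotonic-calibrated probabilities are finite and stay within [0, 1]:

import numpy as np
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

def test_isotonic_calibration_proba_in_unit_interval():
    # Small synthetic binary problem; LinearSVC has no predict_proba of its
    # own, so CalibratedClassifierCV calibrates its decision scores.
    X, y = make_classification(n_samples=200, n_features=10, random_state=42)
    clf = CalibratedClassifierCV(LinearSVC(random_state=42),
                                 method="isotonic", cv=3)
    clf.fit(X, y)
    proba = clf.predict_proba(X)

    assert np.all(np.isfinite(proba))
    assert proba.min() >= 0.0
    assert proba.max() <= 1.0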

@aishgrt1
Contributor Author

aishgrt1 commented May 7, 2018

I have added the data set for which the predicted probabilities are "inf"; until the commit for the data set is approved, the test test_isotonic_calibration_prob_not_inf will not pass.

@jnothman (Member) left a comment

We can't come up with an example apart from that one?

parallelized according to ``n_jobs`` regardless of ``algorithm``.
:issue:`8003` by :user:`Joël Billaud <recamshak>`.

- Clipped the values of proba between [0,1] in

The repetition of [0,1] is confusing

@glemaitre
Member

@aishgrt1 We do not include data in the repo. We should find a minimal example that triggers the issue. Could you generate a small dataset to reproduce the error?

@aishgrt1
Contributor Author

aishgrt1 commented May 8, 2018

I will try to create a similar dataset that reproduces the "inf" error!
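A possible way to follow that suggestion, sketched below, is to generate the data inside the test itself rather than committing a file; whether this particular construction actually reproduces the infinite probabilities would still need to be verified, and every parameter here is an assumption:

import numpy as np
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Small, noisy synthetic binary problem generated on the fly.
X, y = make_classification(n_samples=100, n_features=5, class_sep=0.1,
                           random_state=0)

clf = CalibratedClassifierCV(LinearSVC(random_state=0), method="isotonic", cv=2)
clf.fit(X[:80], y[:80])

# Predict on points the calibrator never saw during fitting.
proba = clf.predict_proba(X[80:])
print("non-finite probabilities present:", not np.all(np.isfinite(proba)))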

Labels: Stalled, Superseded (PR has been replaced by a newer PR)

Successfully merging this pull request may close these issues.

CalibratedClassifierCV with mode = 'isotonic' has predict_proba return infinite probabilities
6 participants