
sklearn.metrics.tests.test_common.test_sample_weight_invariance not wrapped in named_test #8507


Closed
amueller opened this issue Mar 3, 2017 · 11 comments
Labels
Easy (Well-defined and straightforward way to resolve)

Comments

@amueller
Member

amueller commented Mar 3, 2017

sklearn.metrics.tests.test_common.test_sample_weight_invariance shows an ugly name when running the tests because it was not properly wrapped.
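
For context, the pattern under discussion looks roughly like this: a minimal sketch assuming a wrapper along the lines of the `named_test`/`_named_check` helper referenced in the title, with an illustrative `METRICS` registry rather than sklearn's actual code. nose picks up a `description` attribute on a yielded callable, so wrapping each check gives it a readable name instead of the repr of a bare function or partial:

```python
import numpy as np
from sklearn.metrics import accuracy_score

# Illustrative registry; the real test iterates over many more metrics.
METRICS = {"accuracy_score": accuracy_score}


class named_check(object):
    """Wrap a check so nose displays ``description`` as the test name."""

    def __init__(self, check, description):
        self.check = check
        self.description = description

    def __call__(self, *args, **kwargs):
        return self.check(*args, **kwargs)


def check_sample_weight_invariance(name, metric):
    # A metric must give the same result with and without unit weights.
    y_true = np.array([0, 1, 1, 0])
    y_pred = np.array([0, 1, 0, 0])
    assert metric(y_true, y_pred) == metric(
        y_true, y_pred, sample_weight=np.ones(len(y_true))), name


def test_sample_weight_invariance():
    # nose-style test generator: each yielded tuple is (callable, *args).
    for name, metric in sorted(METRICS.items()):
        yield (named_check(check_sample_weight_invariance,
                           "check_sample_weight_invariance(%s)" % name),
               name, metric)
```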

amueller added the Easy, Need Contributor, and Sprint labels Mar 3, 2017
@kahnchana

I'm trying this. Do you have the test logs?

@amueller
Member Author

amueller commented Mar 4, 2017

You can just run the test locally to see it, or check the Travis log.

@nikitasingh981
Contributor

This is also an issue for test_averaging_multiclass, test_averaging_multilabel, test_averaging_multilabel_all_zeroes, and test_averaging_multilabel_all_ones.

@kahnchana

Yes, it happens for a lot of them. I will try to see what's wrong with them all.

@lesteve
Member

lesteve commented Mar 9, 2017

Actually, thinking about this, I am not sure I understand what this issue is about... it seems like this convoluted naming is normal for tests that use yield.

Can you give an example of an ugly name and a non-ugly name (both for test functions that use yield)?

@kahnchana

I was assuming it referred to the "<generator object test_averaging_multilabel at 0x000000000B0B5438>" that was printed when running the tests, because calling the test function doesn't run through the iterations but just returns a generator.
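
That `<generator object ...>` repr is standard Python behaviour: calling a generator function builds a generator object without executing its body. A quick illustration (the yielded lambda is a stand-in for the real checks):

```python
def test_averaging_multilabel():
    # stand-in for the real yielded checks
    yield lambda: None


gen = test_averaging_multilabel()  # the body has NOT run yet
print(gen)                         # <generator object test_averaging_multilabel at 0x...>
checks = list(gen)                 # only now does the body execute and yield the checks
```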

@lesteve
Member

lesteve commented Mar 13, 2017

Hmmm, I don't see anything that looks like this when running the tests. What's the command you are running to get this output? Do you see it in the Travis or AppVeyor logs?

@gxyd
Contributor

gxyd commented Oct 23, 2017

I'll try to fix the issue then.

@lesteve
Member

lesteve commented Oct 23, 2017

I think (although I am not 100% sure) that this issue is specific to nose. We are going to switch to pytest in the not-so-far future (probably a few weeks, I would say). If this is nose-specific, it is not really worth investigating.

As per #8507 (comment), I did not understand what this issue was about last time I looked.
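
For reference, a rough sketch of what such a test looks like after the switch, since pytest does not collect yield-based test generators; with `pytest.mark.parametrize` each case gets a readable id like `test_sample_weight_invariance[accuracy_score]` (illustrative names and registry, not sklearn's actual code):

```python
import numpy as np
import pytest
from sklearn.metrics import accuracy_score, zero_one_loss

# Illustrative registry; the real test iterates over many more metrics.
METRICS = {"accuracy_score": accuracy_score, "zero_one_loss": zero_one_loss}


@pytest.mark.parametrize("name", sorted(METRICS))
def test_sample_weight_invariance(name):
    metric = METRICS[name]
    y_true = np.array([0, 1, 1, 0])
    y_pred = np.array([0, 1, 0, 0])
    # Unit sample weights must not change the result.
    assert metric(y_true, y_pred) == metric(
        y_true, y_pred, sample_weight=np.ones(len(y_true)))
```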

@gxyd
Contributor

gxyd commented Oct 23, 2017

Initially I was a little sceptical about choosing this issue, since I am looking for an easy-to-fix core sklearn issue. I'll look for some other issue and leave this one.

@lesteve
Member

lesteve commented Nov 17, 2017

Closing because I am reasonably confident this is specific to nose and we have now fully moved to pytest.

@lesteve lesteve closed this as completed Nov 17, 2017