It should be possible to run `nosetests -s sklearn doc` and get nothing in the output besides the progress dots.

For instance, currently we get:
```
Doctest: model_evaluation.rst ... /Users/ogrisel/code/scikit-learn/sklearn/metrics/metrics.py:1856: UserWarning: The sum of true positives and false positives are equal to zero for some labels. Precision is ill defined for those labels [1]. The precision and recall are equal to zero for some labels. fbeta_score is ill defined for those labels [1].
  average=None)
/Users/ogrisel/code/scikit-learn/sklearn/metrics/metrics.py:1685: UserWarning: The precision and recall are equal to zero for some labels. fbeta_score is ill defined for those labels [1 2].
  average=average)
/Users/ogrisel/code/scikit-learn/sklearn/metrics/metrics.py:1200: UserWarning: The precision and recall are equal to zero for some labels. fbeta_score is ill defined for those labels [1 2].
  average=average)
/Users/ogrisel/code/scikit-learn/doc/modules/model_evaluation.rst:1: UserWarning: The precision and recall are equal to zero for some labels. fbeta_score is ill defined for those labels [1 2].
  .. currentmodule:: sklearn
```
Those specific warnings were discussed here.
My personal opinion is that:
- all expected warnings should be individually caught and checked in tests with the `with warnings.catch_warnings(record=True) as w:` idiom (see the sketch after this list)
- all unexpected warnings should be silenced by fixing the root problem
- we should never print anything to stdout / stderr in the tests
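
For illustration, a minimal sketch of that idiom (the inputs here are hypothetical, and the exact warning subclass and message vary between versions):

```python
import warnings

import numpy as np
from sklearn.metrics import f1_score


def test_f1_score_warns_on_undefined_labels():
    # Label 2 never appears in y_pred, so the F-score is ill defined
    # for it and sklearn emits a UserWarning (the exact message and
    # warning subclass vary across versions).
    y_true = np.array([0, 1, 2])
    y_pred = np.array([0, 1, 1])
    with warnings.catch_warnings(record=True) as w:
        # Make sure no filter swallows the warning before we record it.
        warnings.simplefilter("always")
        f1_score(y_true, y_pred, average="macro")
    # The expected warning is checked explicitly here instead of being
    # left to leak onto stderr during the test run.
    assert len(w) >= 1
    assert all(issubclass(rec.category, UserWarning) for rec in w)
```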