Commit ef3da88

arjoly authored and ogrisel committed
DOC put the narrative documentation of roc_curve and roc_auc_score in one place
1 parent 1f0815b commit ef3da88

File tree: 1 file changed (+18, -23)

doc/modules/model_evaluation.rst

Lines changed: 18 additions & 23 deletions
@@ -268,27 +268,6 @@ and with a list of labels format:
 for an example of accuracy score usage using permutations of
 the dataset.
 
-Area under the ROC curve
-.........................
-
-The :func:`roc_auc_score` function computes the area under the receiver
-operating characteristic (ROC) curve.
-
-This function requires the true binary value and the target scores, which can
-either be probability estimates of the positive class, confidence values, or
-binary decisions.
-
->>> import numpy as np
->>> from sklearn.metrics import roc_auc_score
->>> y_true = np.array([0, 0, 1, 1])
->>> y_scores = np.array([0.1, 0.4, 0.35, 0.8])
->>> roc_auc_score(y_true, y_scores)
-0.75
-
-For more information see the
-`Wikipedia article on AUC
-<http://en.wikipedia.org/wiki/Receiver_operating_characteristic#Area_under_curve>`_
-and the :ref:`roc_metrics` section.
 
 .. _average_precision_metrics:
 
@@ -713,7 +692,7 @@ with a svm classifier::
 
 
 Log loss
---------
+........
 The log loss, also called logistic regression loss or cross-entropy loss,
 is a loss function defined on probability estimates.
 It is commonly used in (multinomial) logistic regression and neural networks,
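
The hunk above only retouches the heading, but its context lines describe log loss as a loss function defined on probability estimates. As a minimal doctest-style sketch (assuming the :func:`log_loss` helper in sklearn.metrics, which is not quoted in this hunk, and made-up probabilities), the metric is the mean negative log-likelihood of the true labels:

    >>> from sklearn.metrics import log_loss
    >>> y_true = [0, 0, 1, 1]
    >>> # one row of predicted probabilities per sample: [P(y=0), P(y=1)]
    >>> y_prob = [[0.9, 0.1], [0.8, 0.2], [0.3, 0.7], [0.01, 0.99]]
    >>> # mean of -log(probability assigned to the true class)
    >>> round(log_loss(y_true, y_prob), 4)
    0.1738

The value is small here because every true class receives a high predicted probability.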
@@ -795,7 +774,7 @@ function:
 .. _roc_metrics:
 
 Receiver operating characteristic (ROC)
-........................................
+.......................................
 
 The function :func:`roc_curve` computes the `receiver operating characteristic
 curve, or ROC curve (quoting
@@ -809,6 +788,9 @@ Wikipedia) <http://en.wikipedia.org/wiki/Receiver_operating_characteristic>`_:
 positive rate), at various threshold settings. TPR is also known as
 sensitivity, and FPR is one minus the specificity or true negative rate."
 
+This function requires the true binary
+value and the target scores, which can either be probability estimates of the
+positive class, confidence values, or binary decisions.
 Here a small example of how to use the :func:`roc_curve` function::
 
 >>> import numpy as np
@@ -823,6 +805,19 @@ Here a small example of how to use the :func:`roc_curve` function::
 >>> thresholds
 array([ 0.8 , 0.4 , 0.35, 0.1 ])
 
+The :func:`roc_auc_score` function computes the area under the receiver
+operating characteristic (ROC) curve, which is also denoted by
+AUC or AUROC. By computing the
+area under the roc curve, the curve information is summarized in one number.
+For more information see the `Wikipedia article on AUC
+<http://en.wikipedia.org/wiki/Receiver_operating_characteristic#Area_under_curve>`_.
+
+>>> import numpy as np
+>>> from sklearn.metrics import roc_auc_score
+>>> y_true = np.array([0, 0, 1, 1])
+>>> y_scores = np.array([0.1, 0.4, 0.35, 0.8])
+>>> roc_auc_score(y_true, y_scores)
+0.75
 
 The following figure shows an example of such ROC curve.
 
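As a minimal sketch tying together the two functions this commit documents in one place (assuming :func:`sklearn.metrics.auc`, the generic trapezoidal-area helper, which is not part of the diff, plus the example data used in the hunks above), integrating the points returned by :func:`roc_curve` reproduces the single number returned by :func:`roc_auc_score`:

    >>> import numpy as np
    >>> from sklearn.metrics import auc, roc_auc_score, roc_curve
    >>> y_true = np.array([0, 0, 1, 1])
    >>> y_scores = np.array([0.1, 0.4, 0.35, 0.8])
    >>> # roc_curve returns the ROC points (FPR, TPR) and the thresholds used
    >>> fpr, tpr, thresholds = roc_curve(y_true, y_scores)
    >>> # auc applies the trapezoidal rule to those points ...
    >>> float(auc(fpr, tpr))
    0.75
    >>> # ... matching the summary computed directly from labels and scores
    >>> float(roc_auc_score(y_true, y_scores))
    0.75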