@@ -268,27 +268,6 @@ and with a list of labels format:
for an example of accuracy score usage using permutations of
the dataset.

- Area under the ROC curve
- .........................
-
- The :func:`roc_auc_score` function computes the area under the receiver
- operating characteristic (ROC) curve.
-
- This function requires the true binary value and the target scores, which can
- either be probability estimates of the positive class, confidence values, or
- binary decisions.
-
-   >>> import numpy as np
-   >>> from sklearn.metrics import roc_auc_score
-   >>> y_true = np.array([0, 0, 1, 1])
-   >>> y_scores = np.array([0.1, 0.4, 0.35, 0.8])
-   >>> roc_auc_score(y_true, y_scores)
-   0.75
-
- For more information see the
- `Wikipedia article on AUC
- <http://en.wikipedia.org/wiki/Receiver_operating_characteristic#Area_under_curve>`_
- and the :ref:`roc_metrics` section.

.. _average_precision_metrics:
@@ -713,7 +692,7 @@ with a svm classifier::


Log loss
- --------
+ ........
The log loss, also called logistic regression loss or cross-entropy loss,
is a loss function defined on probability estimates.
It is commonly used in (multinomial) logistic regression and neural networks,
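The hunk above only retitles the log loss section; its definition as a loss on probability estimates can be checked directly with :func:`sklearn.metrics.log_loss`. A minimal sketch, with illustrative values that are not part of this diff:

```python
import numpy as np
from sklearn.metrics import log_loss

# True labels and predicted class probabilities (columns: P(class 0), P(class 1));
# these sample values are chosen for illustration only.
y_true = [0, 0, 1, 1]
y_pred = [[0.9, 0.1], [0.8, 0.2], [0.3, 0.7], [0.01, 0.99]]

# Average negative log-probability assigned to the true class of each sample:
# -(ln 0.9 + ln 0.8 + ln 0.7 + ln 0.99) / 4
loss = log_loss(y_true, y_pred)
print(loss)  # ~0.1738
```

Confident predictions on the correct class drive the loss toward zero, while confident mistakes are penalized heavily.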
@@ -795,7 +774,7 @@ function:
.. _roc_metrics:

Receiver operating characteristic (ROC)
- ........................................
+ .......................................

The function :func:`roc_curve` computes the `receiver operating characteristic
curve, or ROC curve (quoting
@@ -809,6 +788,9 @@ Wikipedia) <http://en.wikipedia.org/wiki/Receiver_operating_characteristic>`_:
  positive rate), at various threshold settings. TPR is also known as
  sensitivity, and FPR is one minus the specificity or true negative rate."

+ This function requires the true binary
+ value and the target scores, which can either be probability estimates of the
+ positive class, confidence values, or binary decisions.

Here a small example of how to use the :func:`roc_curve` function::

  >>> import numpy as np
@@ -823,6 +805,19 @@ Here a small example of how to use the :func:`roc_curve` function::
  >>> thresholds
  array([ 0.8 ,  0.4 ,  0.35,  0.1 ])

+ The :func:`roc_auc_score` function computes the area under the receiver
+ operating characteristic (ROC) curve, which is also denoted by
+ AUC or AUROC. By computing the
+ area under the ROC curve, the curve information is summarized in one number.
+ For more information see the `Wikipedia article on AUC
+ <http://en.wikipedia.org/wiki/Receiver_operating_characteristic#Area_under_curve>`_.
+
+   >>> import numpy as np
+   >>> from sklearn.metrics import roc_auc_score
+   >>> y_true = np.array([0, 0, 1, 1])
+   >>> y_scores = np.array([0.1, 0.4, 0.35, 0.8])
+   >>> roc_auc_score(y_true, y_scores)
+   0.75

The following figure shows an example of such ROC curve.
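The relationship the moved section asserts, that :func:`roc_auc_score` summarizes the :func:`roc_curve` output in one number, can be verified end to end. A sketch reusing the `y_true`/`y_scores` from the diff, with the trapezoidal area computed by hand (the manual summation is an illustration, not scikit-learn's internal implementation):

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Same data as in the documentation example above.
y_true = np.array([0, 0, 1, 1])
y_scores = np.array([0.1, 0.4, 0.35, 0.8])

# Points of the ROC curve: false positive rate vs. true positive rate.
fpr, tpr, thresholds = roc_curve(y_true, y_scores)

# Trapezoidal area under the piecewise-linear curve through those points.
area = sum((fpr[i + 1] - fpr[i]) * (tpr[i + 1] + tpr[i]) / 2
           for i in range(len(fpr) - 1))

# roc_auc_score computes the same quantity directly from the labels and scores.
auc = roc_auc_score(y_true, y_scores)
print(auc)  # 0.75, matching `area`
```

Because the AUC is exactly the area under the curve that :func:`roc_curve` traces, the two values agree regardless of which thresholds the curve function chooses to report.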