From c57617f3edd625d03172bacc4fb55715e33180fd Mon Sep 17 00:00:00 2001
From: Minghui Liu
Date: Mon, 26 Jun 2017 23:02:45 -0700
Subject: [PATCH 1/2] Update list of scorers

---
 doc/modules/model_evaluation.rst | 51 ++++++++++++++++++--------------
 1 file changed, 29 insertions(+), 22 deletions(-)

diff --git a/doc/modules/model_evaluation.rst b/doc/modules/model_evaluation.rst
index 078d106785e94..0f25179cf3ccb 100644
--- a/doc/modules/model_evaluation.rst
+++ b/doc/modules/model_evaluation.rst
@@ -54,33 +54,40 @@ the model and the data, like :func:`metrics.mean_squared_error`, are
 available as neg_mean_squared_error which return the negated value
 of the metric.
 
-
-============================ ========================================= ==================================
-Scoring                      Function                                  Comment
-============================ ========================================= ==================================
+============================== ============================================= ==================================
+Scoring                        Function                                      Comment
+============================== ============================================= ==================================
 **Classification**
-'accuracy'                   :func:`metrics.accuracy_score`
-'average_precision'          :func:`metrics.average_precision_score`
-'f1'                         :func:`metrics.f1_score`                  for binary targets
-'f1_micro'                   :func:`metrics.f1_score`                  micro-averaged
-'f1_macro'                   :func:`metrics.f1_score`                  macro-averaged
-'f1_weighted'                :func:`metrics.f1_score`                  weighted average
-'f1_samples'                 :func:`metrics.f1_score`                  by multilabel sample
-'neg_log_loss'               :func:`metrics.log_loss`                  requires ``predict_proba`` support
-'precision' etc.             :func:`metrics.precision_score`           suffixes apply as with 'f1'
-'recall' etc.                :func:`metrics.recall_score`              suffixes apply as with 'f1'
-'roc_auc'                    :func:`metrics.roc_auc_score`
+'accuracy'                     :func:`metrics.accuracy_score`
+'average_precision'            :func:`metrics.average_precision_score`
+'f1'                           :func:`metrics.f1_score`                      for binary targets
+'f1_micro'                     :func:`metrics.f1_score`                      micro-averaged
+'f1_macro'                     :func:`metrics.f1_score`                      macro-averaged
+'f1_weighted'                  :func:`metrics.f1_score`                      weighted average
+'f1_samples'                   :func:`metrics.f1_score`                      by multilabel sample
+'neg_log_loss'                 :func:`metrics.log_loss`                      requires ``predict_proba`` support
+'precision' etc.               :func:`metrics.precision_score`               suffixes apply as with 'f1'
+'recall' etc.                  :func:`metrics.recall_score`                  suffixes apply as with 'f1'
+'roc_auc'                      :func:`metrics.roc_auc_score`
 
 **Clustering**
-'adjusted_rand_score'        :func:`metrics.adjusted_rand_score`
+'adjusted_rand_score'          :func:`metrics.adjusted_rand_score`
+'homogeneity_score'            :func:`metrics.homogeneity_score`
+'completeness_score'           :func:`metrics.completeness_score`
+'v_measure_score'              :func:`metrics.v_measure_score`
+'mutual_info_score'            :func:`metrics.mutual_info_score`
+'adjusted_mutual_info_score'   :func:`metrics.adjusted_mutual_info_score`
+'normalized_mutual_info_score' :func:`metrics.normalized_mutual_info_score`
+'fowlkes_mallows_score'        :func:`metrics.fowlkes_mallows_score`
 
 **Regression**
-'neg_mean_absolute_error'    :func:`metrics.mean_absolute_error`
-'neg_mean_squared_error'     :func:`metrics.mean_squared_error`
-'neg_mean_squared_log_error' :func:`metrics.mean_squared_log_error`
-'neg_median_absolute_error'  :func:`metrics.median_absolute_error`
-'r2'                         :func:`metrics.r2_score`
-============================ ========================================= ==================================
+'neg_mean_absolute_error'      :func:`metrics.mean_absolute_error`
+'neg_mean_squared_error'       :func:`metrics.mean_squared_error`
+'neg_mean_squared_log_error'   :func:`metrics.mean_squared_log_error`
+'neg_median_absolute_error'    :func:`metrics.median_absolute_error`
+'r2'                           :func:`metrics.r2_score`
+============================== ============================================= ==================================
+
 Usage examples:

From 7f5cbf789ec414f0ea0bfc984600b33bc75b8d07 Mon Sep 17 00:00:00 2001
From: Minghui Liu
Date: Tue, 27 Jun 2017 15:41:36 -0700
Subject: [PATCH 2/2] change ordering of scorers to alphabetical

---
 doc/modules/model_evaluation.rst | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/doc/modules/model_evaluation.rst b/doc/modules/model_evaluation.rst
index 0f25179cf3ccb..d010256e94345 100644
--- a/doc/modules/model_evaluation.rst
+++ b/doc/modules/model_evaluation.rst
@@ -71,14 +71,14 @@ Scoring                        Function
 'roc_auc'                      :func:`metrics.roc_auc_score`
 
 **Clustering**
+'adjusted_mutual_info_score'   :func:`metrics.adjusted_mutual_info_score`
 'adjusted_rand_score'          :func:`metrics.adjusted_rand_score`
-'homogeneity_score'            :func:`metrics.homogeneity_score`
 'completeness_score'           :func:`metrics.completeness_score`
-'v_measure_score'              :func:`metrics.v_measure_score`
+'fowlkes_mallows_score'        :func:`metrics.fowlkes_mallows_score`
+'homogeneity_score'            :func:`metrics.homogeneity_score`
 'mutual_info_score'            :func:`metrics.mutual_info_score`
-'adjusted_mutual_info_score'   :func:`metrics.adjusted_mutual_info_score`
 'normalized_mutual_info_score' :func:`metrics.normalized_mutual_info_score`
-'fowlkes_mallows_score'        :func:`metrics.fowlkes_mallows_score`
+'v_measure_score'              :func:`metrics.v_measure_score`
 
 **Regression**
 'neg_mean_absolute_error'      :func:`metrics.mean_absolute_error`
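
A minimal usage sketch of the scorer strings documented above (not part of the patch series itself), assuming scikit-learn 0.19 or later where the clustering scorer names are registered; the estimator, dataset, and parameter choices below are illustrative only::

    # Pass a scorer name from the table as the ``scoring`` argument of
    # cross_val_score (or GridSearchCV).
    from sklearn.cluster import KMeans
    from sklearn.datasets import load_iris
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)

    # Classification scorer: 'f1_macro' maps to metrics.f1_score, macro-averaged.
    print(cross_val_score(SVC(), X, y, cv=5, scoring='f1_macro'))

    # Clustering scorer from the newly documented rows: the cluster labels
    # predicted by KMeans are compared against the ground-truth classes y.
    print(cross_val_score(KMeans(n_clusters=3), X, y, cv=5,
                          scoring='adjusted_mutual_info_score'))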