Describe the issue linked to the documentation
I am trying to leverage the classification metrics that rely on a posterior probability (i.e. P(Y | X=x)). This is commonly named y_pred_proba in the sklearn API.
However, I noticed a discrepancy in the naming of the argument for this in various metrics. For example:
- https://scikit-learn.org/stable/modules/generated/sklearn.metrics.precision_recall_curve.html#sklearn.metrics.precision_recall_curve names it probas_pred
- https://scikit-learn.org/stable/modules/generated/sklearn.metrics.roc_auc_score.html#sklearn.metrics.roc_auc_score names it y_score
- https://scikit-learn.org/stable/modules/generated/sklearn.metrics.brier_score_loss.html#sklearn.metrics.brier_score_loss names it y_prob
- https://scikit-learn.org/stable/modules/generated/sklearn.metrics.top_k_accuracy_score.html#sklearn.metrics.top_k_accuracy_score names it y_score
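
For concreteness, here is a minimal sketch of the discrepancy. The parameter names in the comments are taken from the linked docs at the time of writing; they may differ in newer releases:

```python
import numpy as np
from sklearn.metrics import (
    precision_recall_curve,
    roc_auc_score,
    brier_score_loss,
)

y_true = np.array([0, 0, 1, 1])
y_pred_proba = np.array([0.1, 0.4, 0.35, 0.8])  # estimates of P(Y=1 | X=x)

# The same quantity is passed as the second argument everywhere, but the
# documented parameter name differs per metric:
precision, recall, thresholds = precision_recall_curve(y_true, y_pred_proba)  # `probas_pred`
auc = roc_auc_score(y_true, y_pred_proba)                                     # `y_score`
brier = brier_score_loss(y_true, y_pred_proba)                                # `y_prob`
```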
Searching the glossary (Ctrl+F), only y_score has a related entry.
Suggest a potential alternative/fix
Perhaps we can name them all y_score to be consistent? For example, the following two metrics could adopt it (see the sketch below).
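
A hypothetical sketch of what unified signatures could look like for the two metrics that currently use a different name. This is the proposed naming, not the current API; apart from the rename, the parameter lists are copied from the current docs:

```python
# Hypothetical, unified parameter name under the proposal (not the current API):
#
#   precision_recall_curve(y_true, y_score, *, pos_label=None, sample_weight=None)
#   brier_score_loss(y_true, y_score, *, sample_weight=None, pos_label=None)
#
# roc_auc_score and top_k_accuracy_score already use y_score today.
```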