Suggestion: Remove prediction from plot_confusion_matrix and just pass predicted labels #15880

Closed
@jhennrich

Description

The signature of plot_confusion_matrix is currently:

sklearn.metrics.plot_confusion_matrix(estimator, X, y_true, labels=None, sample_weight=None, normalize=None, display_labels=None, include_values=True, xticks_rotation='horizontal', values_format=None, cmap='viridis', ax=None)

The function takes an estimator and raw data, so it cannot be used with already-predicted labels. This has some downsides:

  • If a confusion matrix is plotted but the predictions are also needed elsewhere (e.g. for calculating accuracy_score), the prediction has to be performed several times. That takes longer and can yield different values if the estimator is randomized.
  • If no estimator is available (e.g. predictions loaded from a file), the plot cannot be used at all.

Suggestion: allow passing predicted labels y_pred to plot_confusion_matrix, to be used instead of estimator and X. In my opinion the cleanest solution would be to remove the prediction step from the function and use a signature similar to that of accuracy_score, e.g. (y_true, y_pred, labels=None, sample_weight=None, ...). However, in order to maintain backwards compatibility, y_pred could instead be added as an optional keyword argument.
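A minimal sketch of the proposed workflow, assuming predictions are already available (the data here is made up for illustration). It reuses a single set of predictions for every metric, and uses ConfusionMatrixDisplay, which shipped in the same scikit-learn release as plot_confusion_matrix and already accepts a precomputed matrix, as a y_pred-based plotting path:

```python
from sklearn.metrics import confusion_matrix, accuracy_score

# Hypothetical data: predictions loaded from a file, no estimator available.
y_true = [0, 1, 1, 0, 1, 0]
y_pred = [0, 1, 0, 0, 1, 1]

# Predictions are computed (or loaded) once and reused for every metric,
# instead of re-running estimator.predict() inside plot_confusion_matrix.
cm = confusion_matrix(y_true, y_pred)
acc = accuracy_score(y_true, y_pred)

# ConfusionMatrixDisplay already takes a precomputed confusion matrix,
# so plotting does not require the estimator at all.
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, e.g. for CI
from sklearn.metrics import ConfusionMatrixDisplay
ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=[0, 1]).plot()
```

With a (y_true, y_pred, ...) signature, plot_confusion_matrix could delegate to exactly this path internally, keeping the estimator-based call as a thin convenience wrapper.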

TODO:
