Description
Currently `sklearn.metrics.ranking._binary_clf_curve` is (as I understand the leading underscore) an internal API method.
Whenever you need to work with a tradeoff other than precision/recall or ROC, or need custom metrics computed at all thresholds, this method is a perfect fit, and the underscore in front of it makes me wonder whether I can be confident it will not change in future versions :-)
For instance, I need to compute `(FP+TN)/(TN+FN+FP)` at different thresholds, and other use cases could be e.g. these.
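For context, here is a minimal sketch of the kind of thing this enables, assuming the function keeps its current private signature (it returns cumulative `fps`, `tps` and the corresponding decreasing `thresholds`); the import path below is the one available today and may move in future releases:

```python
import numpy as np
# Private import path at the time of writing; may change in later versions.
from sklearn.metrics.ranking import _binary_clf_curve

y_true = np.array([0, 0, 1, 1, 0, 1])
y_score = np.array([0.1, 0.4, 0.35, 0.8, 0.7, 0.2])

# fps/tps are cumulative false/true positive counts at each
# distinct score threshold (thresholds are in decreasing order).
fps, tps, thresholds = _binary_clf_curve(y_true, y_score)

# Derive the remaining confusion-matrix cells per threshold.
tns = fps[-1] - fps   # total negatives minus false positives
fns = tps[-1] - tps   # total positives minus true positives

# Example custom metric: (FP + TN) / (TN + FN + FP) at every threshold.
custom_metric = (fps + tns) / (tns + fns + fps)
```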
I think making this method part of the public API would be beneficial for the community.
p.s.
TensorFlow used to have e.g. `tensorflow.contrib.metrics.python.ops.metric_ops.precision_recall_at_equal_thresholds`;
now they have https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/ops/metrics_impl.py#L1792 etc.