Description
The `fit` method of `OneVsRestClassifier` and other meta-estimators (e.g. `OneVsOneClassifier`) accepts only `X` and `y` as parameters. Underlying estimators like `SGDClassifier` accept additional optional keyword arguments to `fit` that are essential for some tasks (e.g. the `sample_weight` parameter is the only way to add weights to training samples in multi-label classification problems).
Steps/Code to Reproduce
Here's how I solve a multi-label classification task with a linear SVM:

```python
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.multiclass import OneVsRestClassifier
from sklearn.linear_model import SGDClassifier

X = [[0.0, 0.0], [1.0, 1.0], [1.0, 0.0]]
y = [[0], [1], [0, 1]]
y_mlb = MultiLabelBinarizer().fit_transform(y)
sample_weight = [1.0, 0.5, 0.8]

clf = OneVsRestClassifier(SGDClassifier(loss="hinge"))
clf.fit(X, y_mlb)  # unable to pass `sample_weight`
```
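As an interim workaround, one can replicate the one-vs-rest scheme by hand: fit one binary `SGDClassifier` per label column and forward `sample_weight` directly. This is only a sketch (the loop and variable names are illustrative, not part of any scikit-learn API):

```python
import numpy as np
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.linear_model import SGDClassifier

X = np.array([[0.0, 0.0], [1.0, 1.0], [1.0, 0.0]])
y = [[0], [1], [0, 1]]
y_mlb = MultiLabelBinarizer().fit_transform(y)  # shape (3, 2)
sample_weight = [1.0, 0.5, 0.8]

# One binary classifier per label column, each trained with sample weights.
estimators = []
for col in range(y_mlb.shape[1]):
    est = SGDClassifier(loss="hinge", random_state=0)
    est.fit(X, y_mlb[:, col], sample_weight=sample_weight)
    estimators.append(est)

# Multi-label prediction: stack the per-label decisions column-wise.
pred = np.column_stack([est.predict(X) for est in estimators])
```

This loses the conveniences of the meta-estimator (`predict` aggregation, input validation, parallelism), which is why passing the kwarg through would be preferable.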
See also this related question on Stack Overflow.
Expected Results
For regular (single-label) classification tasks, I can pass the `sample_weight` kwarg directly to `SGDClassifier.fit`, but with `OneVsRestClassifier` this is not possible.
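For reference, this is the single-label case that does work, a minimal sketch with illustrative data:

```python
from sklearn.linear_model import SGDClassifier

X = [[0.0, 0.0], [1.0, 1.0], [1.0, 0.0]]
y = [0, 1, 1]  # single-label targets
sample_weight = [1.0, 0.5, 0.8]

clf = SGDClassifier(loss="hinge", random_state=0)
clf.fit(X, y, sample_weight=sample_weight)  # kwarg accepted directly
```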
Actual Results
I cannot add weights to my training samples when `OneVsRestClassifier` (or a similar meta-estimator) is used.
Feature Request
Please let the `fit` method of `OneVsRestClassifier` (and similar meta-estimators) accept arbitrary kwargs and pass them on to the `fit` method of the wrapped estimator. The same may be useful for `partial_fit` and `score`, though I'm not sure about that.
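Until something like this lands, the desired behaviour can be approximated from user code with a thin wrapper estimator that pre-binds the extra fit kwargs, so that a meta-estimator which only passes `X` and `y` still forwards them. This is a hypothetical sketch (`FitParamsWrapper` is not a scikit-learn class), and it silently applies the same weights to every per-label fit:

```python
from sklearn.base import BaseEstimator, ClassifierMixin, clone
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.multiclass import OneVsRestClassifier
from sklearn.linear_model import SGDClassifier

class FitParamsWrapper(ClassifierMixin, BaseEstimator):
    """Bind extra fit kwargs so meta-estimators that call fit(X, y) forward them."""

    def __init__(self, estimator, fit_params=None):
        self.estimator = estimator
        self.fit_params = fit_params

    def fit(self, X, y):
        # Clone so repeated fits (one per label column) stay independent.
        self.estimator_ = clone(self.estimator)
        self.estimator_.fit(X, y, **(self.fit_params or {}))
        self.classes_ = self.estimator_.classes_
        return self

    def predict(self, X):
        return self.estimator_.predict(X)

    def decision_function(self, X):
        return self.estimator_.decision_function(X)

X = [[0.0, 0.0], [1.0, 1.0], [1.0, 0.0]]
y_mlb = MultiLabelBinarizer().fit_transform([[0], [1], [0, 1]])
sample_weight = [1.0, 0.5, 0.8]

base = FitParamsWrapper(SGDClassifier(loss="hinge", random_state=0),
                        fit_params={"sample_weight": sample_weight})
clf = OneVsRestClassifier(base)
clf.fit(X, y_mlb)  # each per-label clone now receives sample_weight
pred = clf.predict(X)
```

Native kwarg forwarding would still be better, since a wrapper cannot express per-call parameters (e.g. different weights for `partial_fit` batches).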