[MRG] Adds support for sample weight in hinge loss #3788

Closed
SaurabhJha wants to merge 4 commits into scikit-learn:master from SaurabhJha:3450_hinge_loss

Conversation

SaurabhJha
Contributor

This is dependent on #3607. I will uncomment the added lines after it is merged into the main tree.

@coveralls

Coverage Status

Coverage remained the same when pulling 4c0b04b on SaurabhJha:3450_hinge_loss into 535d1f6 on scikit-learn:master.

@SaurabhJha changed the title from "Adds commented lines which will add support for sample weight in hinge loss" to "Adds support for sample weight in hinge loss" on Oct 21, 2014
@SaurabhJha changed the title from "Adds support for sample weight in hinge loss" to "[MRG] Adds support for sample weight in hinge loss" on Nov 4, 2014
@SaurabhJha
Contributor Author

Can anyone please review this? cc @MechCoder @jnothman @arjoly

@@ -1184,6 +1184,32 @@ def test_hinge_loss_multiclass_invariance_lists():
dummy_hinge_loss)


def test_hinge_loss_multiclass_sample_weight():
Member


Do you have reason to believe the invariance test is insufficient?

Contributor Author


You are right. The invariance test is sufficient here. I will remove this in the next commit.
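
For illustration, a minimal sketch of the invariance property the existing test covers: giving a sample an integer weight n should produce the same loss as repeating that sample n times with unit weight. The binary example and data values below are illustrative only, not taken from the PR.

import numpy as np
from sklearn.metrics import hinge_loss

y_true = np.array([0, 1, 1])
pred_decision = np.array([-1.2, 0.8, 0.3])

# Weighting the first sample by 2 ...
weighted = hinge_loss(y_true, pred_decision, sample_weight=[2, 1, 1])

# ... should give the same result as duplicating it with unit weights.
repeated = hinge_loss(np.array([0, 0, 1, 1]),
                      np.array([-1.2, -1.2, 0.8, 0.3]))

assert np.isclose(weighted, repeated)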

@jnothman
Member

jnothman commented Nov 4, 2014

The implementation looks good to me, but I'm not sure the specialised test adds anything.

@coveralls

Coverage Status

Coverage increased (+0.0%) when pulling 129733f on SaurabhJha:3450_hinge_loss into 8eee4bc on scikit-learn:master.

@@ -1492,4 +1495,7 @@ def hinge_loss(y_true, pred_decision, labels=None):
     losses = 1 - margin
     # The hinge_loss doesn't penalize good enough predictions.
     losses[losses <= 0] = 0
-    return np.mean(losses)
+    if sample_weight is None:
Member


This should not be necessary. np.average already handles the case where weights=None
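
For reference, a quick sketch of that behaviour (values illustrative): np.average with weights=None reduces to the plain unweighted mean, so the final line of hinge_loss can be a single np.average call with no special-casing.

import numpy as np

losses = np.array([0.0, 0.2, 0.7])

# weights=None falls back to the unweighted mean ...
assert np.average(losses, weights=None) == np.mean(losses)

# ... while explicit weights compute sum(w * x) / sum(w).
print(np.average(losses, weights=[2, 1, 1]))  # weighted mean: 0.225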

@SaurabhJha
Contributor Author

Referencing #3450

@SaurabhJha
Contributor Author

This needs one more reviewer. @MechCoder, can you please have a look at this?

@MechCoder
Member

This looks good, except that you should validate that the first dimensions of pred_decision and sample_weight are consistent. This can be done just by adding sample_weight to check_consistent_length.
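
A rough sketch of what that validation looks like (not the exact diff from this PR): check_consistent_length raises a ValueError when the first dimensions of its arguments disagree, and it skips None arguments, so the default sample_weight=None still passes.

import numpy as np
from sklearn.utils import check_consistent_length

y_true = np.array([0, 1, 1])
pred_decision = np.array([-1.2, 0.8, 0.3])

# Matching lengths (and None) pass silently.
check_consistent_length(y_true, pred_decision, None)

# A sample_weight with the wrong first dimension raises a ValueError.
try:
    check_consistent_length(y_true, pred_decision, np.array([2, 1]))
except ValueError as exc:
    print(exc)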

@amueller
Member

amueller commented Nov 5, 2014

+1 when addressing @MechCoder's comment.

@MechCoder
Member

Merged as d21d6dd. Thanks.

@MechCoder MechCoder closed this Nov 5, 2014
@SaurabhJha
Contributor Author

Thanks everyone for the reviews. :-)
