[MRG+1] Use the Gram Variant when precompute is True #3247


Merged
merged 2 commits into from
Jun 5, 2014

Conversation

MechCoder
Member

The function cd_fast.enet_coordinate_descent_gram is unused even when:

a) precompute is True
b) precompute is "auto" and n_samples >> n_features.

This PR fixes this issue.
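For context, the point of the Gram variant is that precomputing Q = XᵀX and q = Xᵀy once lets each coordinate update cost O(n_features) rather than O(n_samples), which pays off precisely when n_samples >> n_features. The sketch below is a minimal NumPy illustration of Gram-based Lasso coordinate descent, not scikit-learn's actual cd_fast Cython implementation; the function names here are hypothetical:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: shrink z toward zero by t."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_cd_gram(X, y, alpha, n_iter=100):
    """Lasso coordinate descent using precomputed Gram quantities.

    Minimizes (1 / (2 * n)) * ||y - X w||^2 + alpha * ||w||_1.
    """
    n, p = X.shape
    # Precompute once: after this, no update touches the n_samples axis.
    Q = X.T @ X / n   # Gram matrix, shape (p, p)
    q = X.T @ y / n   # feature/target correlations, shape (p,)
    w = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Correlation of feature j with the partial residual,
            # excluding feature j's own current contribution.
            rho = q[j] - Q[j] @ w + Q[j, j] * w[j]
            w[j] = soft_threshold(rho, alpha) / Q[j, j]
    return w
```

In the scikit-learn API this code path is what gets requested via `precompute=True` (or `precompute="auto"`) on estimators such as ElasticNet and Lasso.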

@MechCoder MechCoder changed the title Use the Gram Variant when precompute is True [MRG] Use the Gram Variant when precompute is True Jun 5, 2014
@ogrisel
Member

ogrisel commented Jun 5, 2014

Looks good to me, +1 for merge. @agramfort already gave a +1 for this in the parent PR so I think we can merge.

I think this bug was present in 0.14.1; can you confirm, @MechCoder? If so, I will add a what's new entry when I merge.

@ogrisel ogrisel changed the title [MRG] Use the Gram Variant when precompute is True [MRG+1] Use the Gram Variant when precompute is True Jun 5, 2014
@agramfort
Member

add what's new then merge

@ogrisel ogrisel merged commit 7b75ebb into scikit-learn:master Jun 5, 2014
@ogrisel
Member

ogrisel commented Jun 5, 2014

Done! Thanks @MechCoder!

@MechCoder MechCoder deleted the use_precompute branch June 5, 2014 12:26