
DOC Ensures that top_k_accuracy_score passes numpydoc validation #24259


Conversation

vitaliset (Contributor):

Reference Issues/PRs

This pull request addresses #21350.

What does this implement/fix? Explain your changes.

  1. Deleted an extra blank line at the end of the docstring.
  2. Renamed the section heading from "See also" to "See Also".
  3. Added a description for "accuracy_score" inside the "See Also" section.
  4. Removed the sklearn.metrics._ranking.top_k_accuracy_score string from FUNCTION_DOCSTRING_IGNORE_LIST in the docstring test file.
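Items 1 and 2 can be sketched as a tiny check. This is illustrative only, not scikit-learn's actual numpydoc validation code, and `check_docstring` is a hypothetical helper:

```python
def check_docstring(func):
    """Sketch of two numpydoc-style issues fixed in this PR: the section
    heading must be exactly "See Also", and the docstring must not end
    with an extra blank line."""
    doc = func.__doc__ or ""
    errors = []
    # numpydoc expects the exact heading "See Also" (capital A).
    if "See also\n" in doc:
        errors.append('use "See Also" instead of "See also"')
    # Strip the closing-quote indentation, then look for a trailing blank line.
    if doc.rstrip(" ").endswith("\n\n"):
        errors.append("remove the extra blank line at the end of the docstring")
    return errors


def example():
    """Compute something.

    See also
    --------
    accuracy_score

    """


print(check_docstring(example))  # reports both issues
```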

Any other comments?

When I was looking for what exactly to add as the accuracy_score description, I ran grep -r "accuracy_score : " sklearn/metrics and found these three instances where accuracy_score appears in a "See Also" section (for jaccard_score, zero_one_loss, and hamming_loss respectively):

See Also
--------
accuracy_score : Function for calculating the accuracy score.

See Also
--------
accuracy_score : Compute the accuracy score. By default, the function will
    return the fraction of correct predictions divided by the total number
    of predictions.

See Also
--------
accuracy_score : Compute the accuracy score. By default, the function will
    return the fraction of correct predictions divided by the total number
    of predictions.
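The behavior these descriptions refer to ("the fraction of correct predictions divided by the total number of predictions") is easy to illustrate. The helper below is a hypothetical sketch, not sklearn.metrics.accuracy_score itself:

```python
def accuracy(y_true, y_pred):
    # Fraction of correct predictions over the total number of
    # predictions, matching the "See Also" description quoted above.
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)


print(accuracy([0, 1, 2, 2], [0, 1, 1, 2]))  # 3 of 4 predictions correct -> 0.75
```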

As the zero_one_loss and hamming_loss descriptions of accuracy_score looked more complete, I decided to go with theirs for now. To be honest, I don't know which pattern you prefer. A lot of these "See Also" descriptions are very straightforward, such as:

jaccard_score : Compute the Jaccard similarity coefficient score.

Also, should I update jaccard_score's accuracy_score description to match? I'm raising this because I think it would make sense for all the "See Also" descriptions to be identical, but I don't know whether that's a concern for you.

Thanks in advance for the reviews! :D

@thomasjpfan (Member) left a comment:

Thank you for the PR! LGTM

Also, should I update the jaccard_score's accuracy_score description to match?

I would welcome this update in another PR. When merging, we squash PRs and the title becomes the commit message, so we prefer PRs to be as self-contained as possible.

@thomasjpfan thomasjpfan merged commit 997ede5 into scikit-learn:main Aug 26, 2022
@vitaliset vitaliset changed the title DOC Ensures that top_k_accuracy_score passes numpydoc DOC Ensures that top_k_accuracy_score passes numpydoc validation Aug 27, 2022
@vitaliset vitaliset deleted the numpy_doc_validation_top_k_accuracy_score branch August 29, 2022 12:14
glemaitre pushed a commit to glemaitre/scikit-learn that referenced this pull request Sep 12, 2022