
Remove redundant type aliases of _device_t for torch.Device (#152952) #153007


Open

wants to merge 2 commits into main

Conversation

sanjai-11
@sanjai-11 sanjai-11 commented May 7, 2025

Fixes #152952

This PR removes redundant type aliases for _device_t and replaces them with torch.types.Device where applicable, to make the typing system more consistent across PyTorch.

cc @jgong5 @mingfeima @XiaobingSuper @sanchitintel @ashokei @jingxu10 @jerryzh168
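The change described above can be sketched without importing torch. In the sketch below, `FakeDevice` is a hypothetical stand-in for `torch.device`, and the `Device` alias models `torch.types.Device` under the assumption that it accepts a device object, a string, an index, or `None` (the same forms the old `_device_t` alias covered):

```python
from typing import Optional, Union

class FakeDevice:
    """Hypothetical stand-in for torch.device (torch is not imported here)."""
    def __init__(self, spec: str) -> None:
        self.spec = spec

# Before: a module-private alias, duplicated across several files.
_device_t = Union[FakeDevice, str, int, None]

# After: one shared alias; in the PR this role is played by torch.types.Device.
Device = Optional[Union[FakeDevice, str, int]]

def reset_peak_memory_stats(device: Device = None) -> str:
    """Toy body: normalize the accepted device forms to a printable label."""
    if device is None:
        return "current"
    if isinstance(device, FakeDevice):
        return device.spec
    return str(device)
```

The function body is illustrative only; the real `reset_peak_memory_stats` resets XPU memory statistics. The point is that the annotation change is purely a typing cleanup: every call that type-checked against `_device_t` still type-checks against the shared alias.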

pytorch-bot bot commented May 7, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/153007

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure, 1 Unrelated Failure

As of commit 2173423 with merge base 5fa5017:

NEW FAILURE - The following job has failed:

FLAKY - The following job failed but was likely due to flakiness present on trunk:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot pytorch-bot bot added the module: cpu CPU specific problem (e.g., perf, algorithm) label May 7, 2025
@sanjai-11
Author

@pytorchbot label "topic: not user facing"

@pytorch-bot pytorch-bot bot added the topic: not user facing topic category label May 7, 2025
```diff
@@ -23,7 +23,7 @@ def empty_cache() -> None:
     torch._C._xpu_emptyCache()


-def reset_peak_memory_stats(device: _device_t = None) -> None:
+def reset_peak_memory_stats(device: torch.types.Device = None) -> None:
```
Contributor

Maybe line 10 is no longer needed.

_device_t = Union[Device, str, int, None]

Author

Thank you for the review. Will change

@sanjai-11 sanjai-11 requested a review from shink May 7, 2025 04:15
Contributor @shink left a comment:

LGTM

@janeyx99 janeyx99 added the triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module label May 7, 2025
Contributor @janeyx99 left a comment:

sure

@janeyx99 janeyx99 added the suppress-bc-linter Suppresses the failures of API backward-compatibility linter (Lint/bc_linter) label May 7, 2025