Unexpected result from torch.xpu.is_bf16_supported() when XPU is unavailable #152301


Closed

defaultd661 opened this issue Apr 28, 2025 · 1 comment
Labels
module: xpu (Intel XPU related issues), triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)

Comments


defaultd661 commented Apr 28, 2025

🐛 Describe the bug

When torch.xpu.is_available() returns False, calling torch.xpu.is_bf16_supported() still returns True, which is inconsistent with the expected behavior (should be False).

To Reproduce

import torch

def test_bug():
    print('torch.xpu.is_available() =', torch.xpu.is_available())
    if not torch.xpu.is_available():
        # Expected to be False when no XPU device is present
        result = torch.xpu.is_bf16_supported()
        print('result =', result)

if __name__ == '__main__':
    test_bug()

Output

torch.xpu.is_available() = False
result = True

Versions

PyTorch version: 2.7.0+cu126
Is debug build: False
CUDA used to build PyTorch: 12.6
ROCM used to build PyTorch: N/A

cc @gujinghui @EikanWang @fengyuan14 @guangyey

@guangyey (Collaborator)

@defaultd661 Thanks for your feedback. We will fix this issue in #152317
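
Until that fix lands, a caller-side workaround is to gate the capability query on device availability. A minimal sketch (the helper name xpu_bf16_supported is made up for illustration):

import torch

def xpu_bf16_supported() -> bool:
    # Hypothetical helper: only trust is_bf16_supported() when an XPU device
    # is actually present, so the result is False on machines without XPU.
    return torch.xpu.is_available() and torch.xpu.is_bf16_supported()

print(xpu_bf16_supported())  # prints False on a machine with no XPU device

This simply mirrors the expected behavior described in the report and can be dropped once the fix in #152317 is available.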

jerryzh168 added the triaged label (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) on Apr 29, 2025