Unexpected result from torch.xpu.is_bf16_supported() when XPU is unavailable #152301
Labels
module: xpu
Intel XPU related issues
triaged
This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
🐛 Describe the bug
When `torch.xpu.is_available()` returns `False`, calling `torch.xpu.is_bf16_supported()` still returns `True`, which is inconsistent with the expected behavior (it should be `False`).

To Reproduce
Output
Versions
PyTorch version: 2.7.0+cu126
Is debug build: False
CUDA used to build PyTorch: 12.6
ROCM used to build PyTorch: N/A
cc @gujinghui @EikanWang @fengyuan14 @guangyey