Conversation

namgyu-youn
Contributor

@namgyu-youn namgyu-youn commented Sep 3, 2025

Summary:
To expand quantization API test coverage, parameterize the quantization API configs in the AWQ unit test while preserving its existing structure.

Test plan: CI - test/prototype/test_awq.py
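The PR relies on `torch.testing._internal.common_utils.parametrize` to expand one test method into one test per config. As a rough, self-contained sketch of what that pattern does (a hypothetical minimal reimplementation for illustration, not torchao's or PyTorch's actual code):

```python
import unittest

def parametrize(arg_name, values):
    """Tag a test method with parameter values (sketch of the pattern)."""
    def decorator(fn):
        fn._params = (arg_name, values)
        return fn
    return decorator

def instantiate_parametrized_tests(cls):
    """Replace each tagged method with one concrete test per value."""
    for name, fn in list(vars(cls).items()):
        if not callable(fn) or not hasattr(fn, "_params"):
            continue
        arg_name, values = fn._params
        delattr(cls, name)  # remove the template method
        for i, value in enumerate(values):
            def make(fn=fn, arg_name=arg_name, value=value):
                def test(self):
                    fn(self, **{arg_name: value})
                return test
            setattr(cls, f"{name}_{i}", make())
    return cls

@instantiate_parametrized_tests
class TestConfigs(unittest.TestCase):
    # Stand-in for a config-parameterized AWQ test; group_size values are
    # illustrative, not taken from the PR.
    @parametrize("group_size", [64, 128])
    def test_group_size_is_positive(self, group_size):
        self.assertGreater(group_size, 0)
```

Loading `TestConfigs` with a `unittest.TestLoader` then yields two generated tests, `test_group_size_is_positive_0` and `test_group_size_is_positive_1`, so adding a new base config to the list automatically adds a test case.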

pytorch-bot bot commented Sep 3, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/2930

Note: Links to docs will display an error until the docs builds have been completed.

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Sep 3, 2025
@namgyu-youn namgyu-youn marked this pull request as draft September 3, 2025 18:14
@namgyu-youn namgyu-youn marked this pull request as ready for review September 7, 2025 16:01
@namgyu-youn
Copy link
Contributor Author

namgyu-youn commented Sep 13, 2025

@jerryzh168 @Xia-Weiwen Sorry for the interruption; could you review this PR?

```diff
-        base_config = Int4WeightOnlyConfig()
 class TestAWQ(common_utils.TestCase):
+    @common_utils.parametrize("base_config", [Int4WeightOnlyConfig()])
+    def test_awq_config(self, base_config):
```

this one is fine not to parametrize I think

```diff
@@ -61,19 +58,20 @@ def test_awq_config(self):
         with self.assertRaisesRegex(ValueError, "is not one of"):
             AWQConfig(base_config, step="not_supported")

-    def test_awq_functionality(self):
+    @common_utils.parametrize(
+        "base_config", [Int4WeightOnlyConfig(group_size=128, version=2)]
```

nit: version=2 can be removed now

```diff
@@ -104,12 +102,14 @@ def test_awq_functionality(self):
         loss_base = (ref_out - baseline_out).pow(2).mean().item()
         assert loss_awq < loss_base

-    def test_awq_loading(self):
+    @common_utils.parametrize(
+        "base_config", [Int4WeightOnlyConfig(group_size=128, version=2)]
```

same here, version=2 can be removed now

```diff
@@ -152,7 +151,10 @@ def test_awq_loading(self):
         assert awq_save_load_out is not None
         assert torch.allclose(awq_out, awq_save_load_out, atol=1e-2)

-    def test_awq_loading_vllm(self):
+    @common_utils.parametrize(
+        "base_config", [Int4WeightOnlyConfig(group_size=128, version=2)]
```

same here

@namgyu-youn
Contributor Author

Not sure how #2997 was merged into main; it restructured test_awq.py and removed the unittest entirely. Parametrizing AWQ is no longer useful, so closing this PR should be fine.

@namgyu-youn namgyu-youn deleted the awq-unittest branch September 17, 2025 16:04