TorchAO new observers (#2508)
Conversation

@rohansjoshi (Contributor)

Summary:

Added two observers: AffineQuantizedFixedQParamObserver, which allows manual range setting, and AffineQuantizedMSEObserver, which implements MSE-based range setting during the first forward pass.
Bug fix in quant_primitives.

Reviewed By: jerryzh168

Differential Revision: D77906174
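For context on what these two range-setting strategies do, here is a minimal standalone sketch (plain PyTorch, not the PR's actual observer classes or torchao APIs; all helper names below are made up for illustration). A fixed-qparam observer derives scale/zero_point from a user-supplied range, while an MSE observer sweeps shrunken candidate ranges over the calibration data and keeps the one with the lowest mean-squared quantization error.

```python
import torch


def affine_qparams_from_range(min_val, max_val, qmin=-128, qmax=127):
    # Derive an asymmetric affine scale/zero_point from a chosen float range.
    scale = max((max_val - min_val) / (qmax - qmin), 1e-8)
    zero_point = qmin - round(min_val / scale)
    return scale, int(zero_point)


def fake_quantize(x, scale, zero_point, qmin=-128, qmax=127):
    # Quantize then dequantize so quantization error can be measured in float space.
    q = torch.clamp(torch.round(x / scale) + zero_point, qmin, qmax)
    return (q - zero_point) * scale


def mse_range(x, num_steps=100):
    # Sweep shrunken versions of the observed min/max and keep the pair that
    # minimizes mean-squared reconstruction error ("MSE range setting").
    observed_min, observed_max = x.min().item(), x.max().item()
    best, best_err = (observed_min, observed_max), float("inf")
    for i in range(1, num_steps + 1):
        frac = i / num_steps
        cand_min, cand_max = observed_min * frac, observed_max * frac
        scale, zp = affine_qparams_from_range(cand_min, cand_max)
        err = torch.mean((x - fake_quantize(x, scale, zp)) ** 2).item()
        if err < best_err:
            best_err, best = err, (cand_min, cand_max)
    return best


x = torch.randn(1024)
# Fixed-qparam style: the caller supplies the range directly.
scale, zero_point = affine_qparams_from_range(-4.0, 4.0)
# MSE style: the range is chosen from the data seen during calibration.
mse_min, mse_max = mse_range(x)
```

The actual observers in torchao additionally handle granularity, target dtypes, and zero-point domains; the sketch above only shows the core idea behind the two range-setting strategies.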

pytorch-bot (bot) commented Jul 8, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/2508

Note: Links to docs will display an error until the docs builds have been completed.

This comment was automatically generated by Dr. CI and updates every 15 minutes.

facebook-github-bot added the "CLA Signed" label Jul 8, 2025

facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D77906174

@jerryzh168 (Contributor) left a comment

please fix CI failures before landing

rohansjoshi added the "topic: bug fix" and "topic: new feature" labels Jul 8, 2025
rohansjoshi added a commit to rohansjoshi/ao that referenced this pull request Jul 9, 2025 (same summary as above; Differential Revision: D77906174)

rohansjoshi added a commit to rohansjoshi/ao that referenced this pull request Jul 9, 2025 (same summary as above)
rohansjoshi added a commit to rohansjoshi/ao that referenced this pull request Jul 9, 2025 (same summary as above; Pull Request resolved: pytorch#2508)

rohansjoshi merged commit 19c009d into pytorch:main Jul 9, 2025
11 of 18 checks passed
liangel-02 pushed a commit that referenced this pull request Aug 25, 2025: "TorchAO new observers (#2508)" (same summary as above; Differential Revision: D77906174)

Labels

CLA Signed, fb-exported, topic: bug fix, topic: new feature

3 participants