
[WIP][dynamic shapes] mark backed size symbols as size-like #146335


Draft
pianpwk wants to merge 14 commits into main

Conversation

@pianpwk (Contributor) commented Feb 3, 2025


pytorch-bot bot commented Feb 3, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/146335

Note: Links to docs will display an error until the docs builds have been completed.

❌ 23 New Failures, 1 Unrelated Failure

As of commit 9b35b2e with merge base ed9624e:

NEW FAILURES - The following jobs have failed:

UNSTABLE - The following job is marked as unstable, possibly due to flakiness on trunk:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pianpwk changed the title from "init" to "[dynamic shapes][WIP] mark backed size symbols as size-like" on Feb 5, 2025
@facebook-github-bot (Contributor)

@pianpwk has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

pytorch-bot added the ciflow/trunk (Trigger trunk jobs on your pull request) label on Feb 5, 2025
@pianpwk changed the title from "[dynamic shapes][WIP] mark backed size symbols as size-like" to "[WIP][dynamic shapes] mark backed size symbols as size-like" on Mar 3, 2025

linux-foundation-easycla bot commented Mar 5, 2025

CLA Not Signed

@bobrenjc93 (Contributor)

A few thoughts:

  1. I think the flag should be something like backed_size_oblivious instead of specialize_zero_one. That flag should 1) turn off 0/1 specialization and 2) turn on size-oblivious reasoning for backed symbols.
  2. I don't think you should do all of this threading of the flag. Within guard_size_oblivious, just read the config directly, e.g. if torch._dynamo.config.backed_size_oblivious: (see the sketch after this list).
  3. I think this change can be quite surgical. It would involve:

a) what you already have: disabling self.val_to_var = {0: sympy.S.Zero, 1: sympy.S.One}
b) changing if lower is -int_oo or (unbacked_only and val is not None) or not vr.is_int: within _maybe_evaluate_static_worker to use this new flag as well

  4. Instead of setting it to True for all of export, which would likely cause a LOT of wobbling, only set it to True when testing Colin's model for now, e.g. in his user code he'd do something like:

torch._dynamo.config.backed_size_oblivious = True
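
A minimal sketch of point 2, assuming the backed_size_oblivious flag lands in torch._dynamo.config as proposed; guard_size_oblivious here stands in for the real helper in torch.fx.experimental.symbolic_shapes, and _evaluate_size_oblivious is a hypothetical helper used only for illustration:

```python
# Sketch only: read the proposed flag at call time instead of threading it
# through every caller. backed_size_oblivious does not exist on main yet;
# _evaluate_size_oblivious is a hypothetical stand-in for the
# size-oblivious evaluation path.
import torch._dynamo.config as dynamo_config

def guard_size_oblivious(expr):
    if dynamo_config.backed_size_oblivious:
        # Treat backed size symbols like unbacked ones: reason with the
        # size-like assumption (size >= 2) instead of guarding on 0/1.
        return _evaluate_size_oblivious(expr)
    return bool(expr)
```

Reading the config at the point of use keeps the change surgical: no caller of guard_size_oblivious needs a new parameter.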

You can also write some custom test cases where you patch this flag to True to verify the correctness of your implementation; a sketch follows.
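
For instance, a minimal test sketch, again assuming the flag is registered in torch._dynamo.config so that config.patch can toggle it:

```python
# Sketch of a test patching the proposed flag; backed_size_oblivious is not
# on main yet, so this assumes it lands in torch._dynamo.config.
import torch

@torch._dynamo.config.patch(backed_size_oblivious=True)
def test_backed_size_oblivious():
    def fn(x):
        return x * 2

    compiled = torch.compile(fn, dynamic=True)
    # Sizes 0 and 1 would ordinarily be specialized away; with backed size
    # symbols marked size-like, all three inputs should share one dynamic
    # path rather than triggering recompiles.
    for n in (3, 1, 0):
        x = torch.randn(n)
        torch.testing.assert_close(compiled(x), fn(x))
```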

github-actions bot commented May 5, 2025

Looks like this PR hasn't been updated in a while so we're going to go ahead and mark this as Stale.
Feel free to remove the Stale label if you feel this was a mistake.
If you are unable to remove the Stale label please contact a maintainer in order to do so.
If you want the bot to never mark this PR stale again, add the no-stale label.
Stale pull requests will automatically be closed after 30 days of inactivity.

github-actions bot added the Stale label on May 5, 2025