
[test_models_transformer_ltx.py] help us test torch.compile() for impactful models #11512


Open · wants to merge 4 commits into main

Conversation

@cjfghk5697 (Contributor) commented May 7, 2025

What does this PR do?

Part of #11430

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@sayakpaul (Member) left a comment

Thanks! One comment. Could you confirm if running the test successfully passes?

Comment on lines 92 to 106
    @require_torch_gpu
    @require_torch_2
    @is_torch_compile
    @slow
    def test_torch_compile_recompilation_and_graph_break(self):
        torch._dynamo.reset()
        init_dict, inputs_dict = self.prepare_init_args_and_inputs_for_common()

        model = self.model_class(**init_dict).to(torch_device)
        model.eval()
        model = torch.compile(model, fullgraph=True)

        with torch._dynamo.config.patch(error_on_recompile=True), torch.no_grad():
            _ = model(**inputs_dict)
            _ = model(**inputs_dict)
@sayakpaul (Member):

We just need to add: TorchCompileTesterMixin like this:

class FluxTransformerTests(ModelTesterMixin, TorchCompileTesterMixin, unittest.TestCase):
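The mixin approach works because unittest discovers any `test_*` methods contributed by base classes, so every model tester that lists the mixin as a base gets the shared compile tests for free. A minimal, torch-free sketch of the pattern (the names `CompileTesterMixin`, `DummyModel`, and `test_model_class_is_set` are illustrative, not the actual diffusers classes):

```python
import unittest


class CompileTesterMixin:
    """Illustrative mixin: any TestCase subclass inheriting it gains this test."""

    model_class = None  # concrete tester classes override this

    def test_model_class_is_set(self):
        # unittest discovers this method on every subclass automatically,
        # so each per-model tester runs the shared check without redefining it.
        self.assertIsNotNone(self.model_class)


class DummyModel:
    """Stand-in for a real model class."""


class DummyModelTests(CompileTesterMixin, unittest.TestCase):
    model_class = DummyModel
```

In diffusers, `TorchCompileTesterMixin` plays the role of `CompileTesterMixin` here, carrying the `torch.compile` recompilation and graph-break tests shared across model testers.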

@cjfghk5697 (Contributor, author) replied May 7, 2025:

@sayakpaul Thanks for the suggestion! I've added TorchCompileTesterMixin and confirmed that the test runs successfully via pytest.

@sayakpaul (Member):

Thanks. How did you run the tests?

@cjfghk5697 (Contributor, author):

[image: screenshot attachment]

I ran the tests using:

os.environ["RUN_COMPILE"] = "1"
os.environ["RUN_SLOW"] = "1"
!pytest tests/models/transformers/test_models_transformer_ltx.py -k "test_torch_compile_recompilation_and_graph_break"

Let me know if there's anything else I should verify.

@sayakpaul (Member):

We're already including TorchCompileTesterMixin in the tester class, so we can remove test_torch_compile_recompilation_and_graph_break() entirely.

@cjfghk5697 (Contributor, author):

You're right — since TorchCompileTesterMixin already covers the necessary torch.compile checks, I've removed the redundant test_torch_compile_recompilation_and_graph_break() method.

@sayakpaul (Member) commented May 7, 2025

Just ran the tests myself and they're passing. Thanks! Also, cc: @a-r-r-o-w for awareness on LTX.
