Update test_export_fx_passes.py #6972
Conversation
pytorch/pytorch#124898 makes some changes to how `args` and `dynamic_shapes` are matched in `torch.export`, which would cause some XLA tests to fail. This PR avoids those failures.
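For context, here is a minimal sketch (not taken from this PR) of what "matching" means here: the `dynamic_shapes` spec is zipped against the structure of the example `args`, with one entry per input and `Dim` objects marking symbolic dimensions. The toy module and dimension name below are illustrative assumptions.

```python
import torch
from torch.export import Dim, export


class MatMul(torch.nn.Module):
    """Toy module used only to illustrate dynamic_shapes matching."""

    def forward(self, x, w):
        return x @ w


args = (torch.rand(8, 16), torch.rand(16, 4))
# The spec mirrors `args` entry by entry: the first input gets a symbolic
# leading ("batch") dimension, the second input stays fully static.
dynamic_shapes = ({0: Dim("bs")}, None)
exported = export(MatMul(), args, dynamic_shapes=dynamic_shapes)
print(exported)
```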
@@ -18,7 +18,7 @@ class ExportFxPassTest(unittest.TestCase):

   def test_decompose_dynamic_shape_select(self):
     args = (torch.rand((10, 197, 768)), 1, 0)
-    dynamic_shapes = ([{0: Dim("bs")}, None, None],)
+    dynamic_shapes = (({0: Dim("bs")}, None, None),)
`wrap_func_as_nn_module` creates a module with input signature `args: tuple(...)`, and thus `dynamic_shapes` needs to match with a tuple as well.
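A rough illustration of the point above: the real `wrap_func_as_nn_module` lives in the pytorch/xla test utilities, so the helper body, wrapped op, and shapes below are assumptions. Because the wrapper's `forward` is variadic, `torch.export` sees a single tuple-typed input named `args`, and the spec for it must itself be a tuple.

```python
import torch
from torch.export import Dim, export


def wrap_func_as_nn_module(f):
    # Assumed shape of the helper: wrap a free function in a module whose
    # forward takes *args, i.e. one tuple-typed input from export's view.
    class Wrapped(torch.nn.Module):

        def forward(self, *args):
            return f(*args)

    return Wrapped()


m = wrap_func_as_nn_module(torch.select)
args = (torch.rand(10, 197, 768), 1, 0)
# The single spec entry describes the whole `args` tuple; after the stricter
# matching in pytorch/pytorch#124898 it must be a tuple, not a list.
dynamic_shapes = (({0: Dim("bs")}, None, None),)
exported = export(m, args, dynamic_shapes=dynamic_shapes)
```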
@@ -55,7 +55,7 @@ def forward(self, x):

   def test_embedding_indices_flatten(self):
     args = (torch.rand((20, 768)), torch.randint(0, 15,
                                                   (3, 10)).to(torch.int64))
-    dynamic_shapes = ([None, {0: Dim("bs")}],)
+    dynamic_shapes = ((None, {0: Dim("bs")}),)
same as above
@lsy323 for review?
@lsy323 Can you review this one?
Ping! This is blocking land of pytorch/pytorch#124898
CI failed at the build step, so it is hard for me to see whether we can merge this PR without the upstream one. From the change it seems like we do require the upstream change?
Ah, you are using a fork of pytorch/xla, so we cannot pin upstream. Let me first verify that your changes work. If so, we can just disable these two tests and merge the PR; I can re-enable the tests with your change afterward.
It shouldn't need the upstream change, I think. But the plan sounds good (confirm and disable temporarily). Thanks!
Ah, I confirmed that it works locally even without the upstream change. I will just approve and merge this PR.
Thanks @JackCaoG |