
Update test_export_fx_passes.py #6972

Merged: 1 commit into pytorch:master on Apr 29, 2024

Conversation

avikchaudhuri (Contributor)

pytorch/pytorch#124898 makes some changes to how `args` and `dynamic_shapes` are matched in `torch.export`, which will fail some XLA tests. This PR avoids those failures.
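For context, `torch.export` expects `dynamic_shapes` to mirror the structure of the positional inputs in `args`. A minimal sketch of that matching, using an illustrative module rather than code from this PR:

```python
import torch
from torch.export import Dim, export


class Select(torch.nn.Module):
    def forward(self, x, dim, index):
        return torch.select(x, dim, index)


args = (torch.rand(10, 197, 768), 1, 0)
# One spec per positional input: a dict of dynamic dims for the tensor,
# None for the plain Python ints (which torch.export specializes).
dynamic_shapes = ({0: Dim("bs")}, None, None)
exported = export(Select(), args, dynamic_shapes=dynamic_shapes)
```

Each entry is either a dict mapping a dimension index to a `Dim` (for a tensor dimension that may vary) or `None` (for inputs with nothing dynamic); the upstream change tightens how this structure is checked against `args`.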
@@ -18,7 +18,7 @@ class ExportFxPassTest(unittest.TestCase):

     def test_decompose_dynamic_shape_select(self):
         args = (torch.rand((10, 197, 768)), 1, 0)
-        dynamic_shapes = ([{0: Dim("bs")}, None, None],)
+        dynamic_shapes = (({0: Dim("bs")}, None, None),)
avikchaudhuri (Contributor Author)

`wrap_func_as_nn_module` creates a module with input signature `args: tuple(...)`, so `dynamic_shapes` needs to match with a tuple as well.
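To illustrate why the fix is needed, here is a minimal sketch using a hypothetical `Wrapper` class as a stand-in for what `wrap_func_as_nn_module` produces (a module whose single input is a tuple of the original arguments). With the stricter structure matching from pytorch/pytorch#124898, the spec for that tuple input must itself be a tuple, not a list:

```python
import torch
from torch.export import Dim, export


class Wrapper(torch.nn.Module):
    """Stand-in for wrap_func_as_nn_module: forwards one tuple of args."""

    def forward(self, args):  # single input: a (tensor, int, int) tuple
        x, dim, index = args
        return torch.select(x, dim, index)


sample = (torch.rand(10, 197, 768), 1, 0)
# The module has one positional input (the tuple), so dynamic_shapes is a
# 1-tuple whose element mirrors that tuple. Writing the inner spec as a
# list, ([{0: Dim("bs")}, None, None],), no longer lines up under the
# stricter container-type matching.
dynamic_shapes = (({0: Dim("bs")}, None, None),)
exported = export(Wrapper(), (sample,), dynamic_shapes=dynamic_shapes)
```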

@@ -55,7 +55,7 @@ def forward(self, x):
     def test_embedding_indices_flatten(self):
         args = (torch.rand((20, 768)), torch.randint(0, 15,
                                                      (3, 10)).to(torch.int64))
-        dynamic_shapes = ([None, {0: Dim("bs")}],)
+        dynamic_shapes = ((None, {0: Dim("bs")}),)
avikchaudhuri (Contributor Author)

Same as above.

@avikchaudhuri (Contributor Author)

@lsy323 for review?

JackCaoG requested a review from lsy323 on April 25, 2024 at 16:46.
@JackCaoG (Collaborator)

@lsy323 Can you review this one?

@avikchaudhuri (Contributor Author)

Ping! This is blocking the landing of pytorch/pytorch#124898.

@JackCaoG (Collaborator)

CI failed at the build step, so it is hard for me to tell whether we can merge this PR without the upstream one. From the change, it seems like we do require the upstream change?

@JackCaoG (Collaborator)

Ah, you are using a fork of pytorch/xla, so we cannot pin the upstream. Let me first verify that your changes work. If so, we can just disable these two tests and merge the PR; I can re-enable the tests with your change afterward.

@avikchaudhuri (Contributor Author)

I don't think it should need the upstream change, but the plan sounds good (confirm and disable temporarily). Thanks!

@JackCaoG (Collaborator)

Ah, I confirmed that it works locally even without the upstream change. I will just approve and merge this PR.

JackCaoG merged commit 7527816 into pytorch:master on Apr 29, 2024 (10 of 12 checks passed).
@avikchaudhuri (Contributor Author)

Thanks @JackCaoG
