
2.3 backport PR request list #6676

Closed
lsy323 opened this issue Mar 6, 2024 · 28 comments

@lsy323 (Collaborator) commented Mar 6, 2024

Similar to the backport request thread for the 2.2 release (#6036).

This issue tracks backports for the 2.3 release.

For any PRs you want to backport to 2.3, please reply with the following:

  • Original PR link
  • Reason to backport
  • 2.3 backport PR link

@zpcore (Collaborator) commented Mar 13, 2024

Original PR link:

Reason to backport: Minor bug fixes for the benchmarks.

2.3 backport PR link:

@lsy323 (Collaborator, Author) commented Mar 13, 2024

@lsy323 (Collaborator, Author) commented Mar 13, 2024

Original PR link:

2.3 backport PR link:

@alanwaketan (Collaborator) commented Mar 13, 2024

Original PR link:

Reason to backport: This is an important new feature for 2.3 that enables the use of custom kernels such as flash attention.

2.3 backport PR link:

@wonjoolee95 (Collaborator) commented Mar 13, 2024

Original PR link:

Reason to backport: This is a series of important fixes to lowerings to reduce unnecessary decomposition.

2.3 backport PR link:

  • TBD

@lsy323 (Collaborator, Author) commented Mar 13, 2024

The PRs above go into the first RC tag: https://github.com/pytorch/xla/tree/v2.3.0-rc2

@bhavya01 (Collaborator) commented Mar 14, 2024

Original PR link:

Reason to backport: Part of a series of important fixes to lowerings to reduce unnecessary decomposition.

2.3 backport PR link:

@yeounoh (Contributor) commented Mar 15, 2024

@JackCaoG (Collaborator) commented Mar 18, 2024

Original PR:

Reason to backport:
Route users to the openxla dynamo backend instead of openxla_eval.

2.3 backport PR link:

@jonb377 (Collaborator) commented Mar 20, 2024

Original PR:

Reason to backport:

  • Address deprecation warnings when using distributed checkpointing

2.3 backport PR link:

@JackCaoG (Collaborator)

Original PR

Reason to backport:
AWS folks need this change.

backport PR

@alanwaketan (Collaborator) commented Mar 21, 2024

Original PR

Reason to backport:
To support Pallas.

backport PR

@JackCaoG (Collaborator)

@ManfeiBai (Collaborator) commented Mar 22, 2024

Original PR
- #6807

Reason to backport: adds a more complex test case for the fori_loop feature.
Will not be backported since it only contains a test case.

@alanwaketan (Collaborator)

Original PR

Reason to backport:
To support Pallas.

backport PR

@yeounoh (Contributor) commented Mar 22, 2024

Original PR: #6797
Reason to backport: auto-sharding fix in XLA
Backport PR: #6811

@alanwaketan (Collaborator)

Original PR

Reason to backport:
To support Pallas.

backport PR

@alanwaketan (Collaborator)

Original PR

Reason to backport:
To support Pallas.

backport PR

@alanwaketan (Collaborator)

Original PR

Reason to backport:
To support Pallas.

backport PR

@wonjoolee95 (Collaborator)

Original PR

Reason to backport:
Fixes a bug by removing an unrelated warning message.

Backport PR

@yeounoh (Contributor) commented Mar 29, 2024

@alanwaketan (Collaborator)

@lsy323 (Collaborator, Author) commented Apr 3, 2024

@JackCaoG (Collaborator) commented Apr 4, 2024

Original PR

Reason to backport:
Makes support for the flash attention Pallas kernel complete. If users can only use flash attention with LTC, it won't be very useful for inference.

Backport PR

@JackCaoG (Collaborator) commented Apr 5, 2024

Original PR

Reason to backport:
Without this change, we always throw a warning message about dev_kind during multiprocess training, which is bad UX.

Backport PR

@ManfeiBai (Collaborator) commented Apr 19, 2024

Original PR:

Reason to backport: Adds documentation for the while_loop feature.

Backport PR:

@lsy323 (Collaborator, Author) commented Apr 23, 2024

lsy323 closed this as completed May 31, 2024

9 participants