
[Feature] Adan Optimizer #1729

Open
cvnad1 opened this issue Dec 21, 2024 · 0 comments
Comments

@cvnad1
Contributor

cvnad1 commented Dec 21, 2024

@angeloskath

I would like to add Adan (the ADAptive Nesterov momentum algorithm), a recent improvement on the Adam optimizer, following the reference implementation from the paper.

I believe this would be a great addition to the MLX framework, given its reported performance improvements over Adam and the fact that it is already available in Optax.
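For reference, here is a minimal NumPy sketch of the Adan update rule from the paper (Algorithm 1, using the paper's default momentum coefficients). This is not MLX code; an actual contribution would follow the `mlx.optimizers` API conventions, and the learning rate and smoke test below are illustrative assumptions.

```python
import numpy as np

def adan_step(theta, g, g_prev, m, v, n, step, lr=0.05,
              b1=0.02, b2=0.08, b3=0.01, eps=1e-8, wd=0.0):
    """One Adan update (sketch of Algorithm 1 from the Adan paper).

    b1/b2/b3 are the paper's default momenta; lr and wd are hypothetical.
    """
    diff = np.zeros_like(g) if step == 1 else g - g_prev
    m = (1 - b1) * m + b1 * g            # EMA of gradients
    v = (1 - b2) * v + b2 * diff         # EMA of gradient differences
    u = g + (1 - b2) * diff              # Nesterov-corrected gradient
    n = (1 - b3) * n + b3 * u * u        # EMA of squared corrected gradient
    eta = lr / (np.sqrt(n) + eps)        # per-coordinate step size
    theta = (theta - eta * (m + (1 - b2) * v)) / (1 + lr * wd)
    return theta, m, v, n

# Smoke test: minimize f(x) = ||x||^2 from x0 = [1, -2, 3] (f(x0) = 14).
x = np.array([1.0, -2.0, 3.0])
m = np.zeros_like(x); v = np.zeros_like(x); n = np.zeros_like(x)
g_prev = np.zeros_like(x)
best_f = float(x @ x)
for k in range(1, 501):
    g = 2.0 * x                          # gradient of ||x||^2
    x, m, v, n = adan_step(x, g, g_prev, m, v, n, k)
    g_prev = g
    best_f = min(best_f, float(x @ x))
print(best_f)                            # drops below the initial value of 14.0
```

The gradient-difference term `v` is what distinguishes Adan from Adam: it folds a Nesterov-style lookahead into the moment estimates without needing an extrapolated parameter point.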
