[moe] support low level zero optim #4429
Conversation
The code coverage for the changed files is 91%. Click me to view the complete report
By the way, could you create a MoE param interface like the dtensor one? https://github.com/hpcaitech/ColossalAI/blob/main/colossalai/tensor/d_tensor/api.py
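A minimal sketch of what such an interface might look like, mirroring the function-based style of `d_tensor/api.py`. All names below (`MoeParallelInfo`, `is_moe_tensor`, `set_moe_tensor_info`, `get_ep_size`) are hypothetical illustrations, not the actual ColossalAI API; a plain object stands in for a `torch.nn.Parameter` so the sketch is self-contained:

```python
# Hypothetical MoE param interface, modeled on the style of
# colossalai/tensor/d_tensor/api.py. Names are assumptions for
# illustration only.
from dataclasses import dataclass


@dataclass
class MoeParallelInfo:
    """Expert-parallel metadata attached to a MoE parameter."""
    ep_size: int  # number of ranks the experts are sharded across
    dp_size: int  # data-parallel size within each expert group


def is_moe_tensor(tensor) -> bool:
    """A tensor is a MoE tensor iff it carries MoE parallel metadata."""
    return hasattr(tensor, "moe_info")


def set_moe_tensor_info(tensor, ep_size: int, dp_size: int) -> None:
    """Attach expert-parallel metadata, marking the tensor as a MoE param."""
    tensor.moe_info = MoeParallelInfo(ep_size=ep_size, dp_size=dp_size)


def get_ep_size(tensor) -> int:
    """Return the expert-parallel size recorded on a MoE tensor."""
    assert is_moe_tensor(tensor), "not a MoE tensor"
    return tensor.moe_info.ep_size


class _FakeParam:
    """Stand-in for torch.nn.Parameter in this self-contained sketch."""
    pass


p = _FakeParam()
assert not is_moe_tensor(p)
set_moe_tensor_info(p, ep_size=4, dp_size=2)
```

With this shape, an optimizer or grad handler can test `is_moe_tensor(param)` to route MoE params onto their expert-parallel process group, the same way dtensor code queries `is_distributed_tensor`.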
The code coverage for the changed files is 71%. Click me to view the complete report
done
The code coverage for the changed files is 71%. Click me to view the complete report
* update optim
* update grad handler
* update moe param interface
* update doc
* move moe tensor