Immediate To-Dos: #5

Open
pharaouk opened this issue Sep 12, 2023 · 1 comment

@pharaouk (Contributor)

Immediate To-Dos:
- Improve the MultiLoRA PEFT class extension code (@sumo already has an implementation and will push it shortly); a sketch of the multi-adapter pattern it builds on follows this list.
- Standardize gating so that expert adapters can be switched flexibly from a larger DB of adapters (likely via centroid/similarity measures); see the gating sketch after this list.
- Build a UI to run MoE inference and base-model inference side by side (with streaming and a display of the experts selected during inference).
- Simplify the process of fine-tuning new experts and adding them to the MoE architecture.
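
For context on the first item, here is a minimal sketch of the multi-adapter pattern that peft already exposes and that a MultiLoRA class extension could build on. The base model and adapter repo ids are placeholders, not the project's actual checkpoints:

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Placeholder checkpoint ids -- substitute the project's own base model
# and expert LoRA adapters.
base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

# Wrap the base model with one adapter, then attach further experts by name.
model = PeftModel.from_pretrained(base, "org/code-lora", adapter_name="code")
model.load_adapter("org/math-lora", adapter_name="math")

# Switch the active expert per request; the MultiLoRA extension would
# automate this choice via the gating mechanism described above.
model.set_adapter("math")
```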
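For the gating item, here is a minimal sketch of centroid/similarity routing, assuming each expert adapter is summarized by the centroid of the embeddings it was trained on and queries are routed to the most similar centroids. All names and numbers are illustrative, not part of the repo:

```python
import numpy as np

def cosine_similarity(query: np.ndarray, centroids: np.ndarray) -> np.ndarray:
    """Cosine similarity between a query vector and a matrix of centroids."""
    q = query / np.linalg.norm(query)
    c = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
    return c @ q

def select_experts(query_emb: np.ndarray,
                   centroids: np.ndarray,
                   adapter_names: list[str],
                   top_k: int = 2) -> list[str]:
    """Return the names of the top_k adapters closest to the query."""
    scores = cosine_similarity(query_emb, centroids)
    top = np.argsort(scores)[::-1][:top_k]
    return [adapter_names[i] for i in top]

# Toy example: 3 adapters with 4-dim centroids.
centroids = np.array([
    [0.9, 0.1, 0.0, 0.0],   # "code" expert
    [0.0, 0.8, 0.2, 0.0],   # "math" expert
    [0.1, 0.0, 0.1, 0.9],   # "chat" expert
])
names = ["code", "math", "chat"]
query = np.array([0.85, 0.15, 0.0, 0.05])
print(select_experts(query, centroids, names, top_k=1))  # -> ['code']
```

The same lookup generalizes to a larger adapter DB: store one centroid per adapter and retrieve the top-k by similarity at inference time.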

@lpietrobon

I'll make separate issues for each of these, so that we can address them one by one

@nisten added this to moe-board on Sep 21, 2023
@nisten moved this to 😿Todo in moe-board on Sep 21, 2023