
Support using models from HuggingFace directly #140

Open
samos123 opened this issue Oct 12, 2024 · 2 comments
@samos123

I should be able to serve a model by simply providing the HuggingFace model ID. Requiring users to convert checkpoints manually is too troublesome.
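For reference, resolving a Hub model ID to a local checkpoint directory is already a one-liner with `huggingface_hub`; a rough sketch of what the download side could build on (the model ID and file patterns below are just examples, and the actual loading/conversion step would be project-specific):

```python
# Rough sketch: fetch a model by its Hub ID into the local cache so the
# serving stack can load it from disk. Model ID and patterns are examples only.
from huggingface_hub import snapshot_download

model_id = "meta-llama/Llama-2-7b-hf"  # example model ID

local_path = snapshot_download(
    repo_id=model_id,
    allow_patterns=["*.safetensors", "*.json", "*.model"],  # weights, config, tokenizer
)
print(f"Checkpoint available at: {local_path}")
```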

@vipannalla
Collaborator

Thanks for the feedback. This feature isn't currently supported, but we have added it to our roadmap to simplify the workflow. Note that some models (such as Llama variants) require explicit acknowledgement on Meta's site before they can be used.

vipannalla self-assigned this Oct 30, 2024
@samos123
Author

That can be handled by respecting the HF_TOKEN environment variable to automatically download auth-gated models. That's how vLLM and other OSS projects do it.
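A rough sketch of that behavior, assuming the download goes through `huggingface_hub` (recent versions also pick up `HF_TOKEN` on their own, but passing it explicitly keeps the behavior unambiguous):

```python
# Rough sketch: honor the HF_TOKEN environment variable when downloading
# auth-gated models. The model ID is only an example.
import os
from huggingface_hub import snapshot_download

token = os.environ.get("HF_TOKEN")  # None is fine for public models

local_path = snapshot_download(
    repo_id="meta-llama/Llama-2-7b-hf",  # example gated model
    token=token,
)
```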
