Gated models yield ValueError that states "model cannot be found" #5

Open
MinaAlmasi opened this issue Oct 30, 2024 · 0 comments
Labels: bug Something isn't working

@MinaAlmasi
I am trying to run "meta-llama/Llama-3.2-1B" but get an error message saying:

ValueError: Model meta-llama/Llama-3.2-1B cannot be found in HuggingFace repositories, nor could an OpenAI model be initialized.

When running the same model with the regular Hugging Face pipeline setup (from transformers import pipeline), it gives the correct response, namely that this is a gated model:

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B is restricted. You must have access to it and be authenticated to access it. Please log in.
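The mismatch between the two error messages can be sketched without network access. The helper names below are hypothetical and only mimic the fallback logic this issue describes, assuming stormtrooper first tries Hugging Face, then OpenAI, and discards the original exception along the way:

```python
# Hypothetical sketch of the fallback logic described in this issue
# (not stormtrooper's actual code): any exception from the Hugging Face
# loader -- including a gated-repo / authentication error -- is swallowed
# and replaced by a generic "cannot be found" ValueError.

class GatedRepoError(Exception):
    """Stand-in for huggingface_hub's gated-repo error."""

def load_hf_model(name: str):
    # Simulate a gated model: access is restricted until you authenticate.
    raise GatedRepoError(
        f"Access to model {name} is restricted. "
        "You must have access to it and be authenticated to access it."
    )

def load_openai_model(name: str):
    # Simulate the OpenAI fallback also failing for a non-OpenAI model name.
    raise ValueError(f"{name} is not a known OpenAI model.")

def load_model(name: str):
    try:
        return load_hf_model(name)
    except Exception:
        try:
            return load_openai_model(name)
        except Exception:
            # The informative gated-repo message is lost at this point,
            # which is why the user only sees "cannot be found".
            raise ValueError(
                f"Model {name} cannot be found in HuggingFace repositories, "
                "nor could an OpenAI model be initialized."
            )

try:
    load_model("meta-llama/Llama-3.2-1B")
except ValueError as e:
    print(e)
```

If the loader instead re-raised with `raise ValueError(...) from e`, or inspected the exception type before falling back, the gated-repo message would survive to the user.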

Versions:
python 3.10.12
transformers 4.40.2
stormtrooper 1.0.0

@MinaAlmasi added the bug label on Oct 30, 2024