GPT2 CausalLM Inference crashes when using transformers v4.39.0 #6991
Comments
I was able to repro it with
but this is also not supported in PyTorch.
Does the above code work in a native PyTorch GPU environment?
@JackCaoG Interesting... I saw these changes, huggingface/transformers#29334, were part of the v4.39.0 release. Do you think they could be related?
Yeah, I can repro this issue; let me look into it a bit.
Ah, OK, the issue is from
Let me fix it...
This should be fixed now, let me close the issue. |
🐛 Bug
When running LLM inference with the gpt2 model using HF transformers, upgrading to transformers v4.39.0 leads to the following error:
To Reproduce
Example test code:
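The reporter's original snippet isn't preserved in this copy. As a stand-in, here is a minimal sketch of the kind of repro described, assuming a torch_xla device and greedy decoding; the prompt and max_new_tokens are illustrative choices, not the reporter's exact values:

```python
# Hypothetical repro sketch, not the reporter's exact script.
# Assumes torch_xla is installed alongside transformers v4.39.0.
import torch
import torch_xla.core.xla_model as xm
from transformers import AutoModelForCausalLM, AutoTokenizer

device = xm.xla_device()  # XLA device (e.g. TPU)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").to(device)
model.eval()

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(device)
with torch.no_grad():
    # The crash reportedly occurs during generation under v4.39.0.
    output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```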
Expected behavior
Reverting to the previous version, transformers==4.38.0, fixes the error and the inference runs fine.

Environment
Additional context
Full error log is in the comments.