Replies: 1 comment
-
Never mind, I just failed to send the newly updated image to my cloud provider, plus some other cloud-specific issues. Fixed now. If you run into this error, just update and you shouldn't see it anymore.
-
Hello community!
This is the first time I'm trying to serve a Mixtral model, specifically
TheBloke/Nous-Hermes-2-Mixtral-8x7B-DPO-AWQ
with vLLM. The only issue: transformers raises the following error:
I haven't found much online about a similar issue. I did find https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1/discussions/30 and https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1/discussions/9, which both suggest updating to
transformers>=4.36.1
but I'm using: Also, I'm using vLLM v0.3.0. (I saw errors with Mixtral and vLLM before v0.2.1, but I still get the same error...)
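Since the linked discussions pin the fix to a minimum transformers version, a quick sanity check of the installed versions can rule that out first. A minimal sketch (the version floors are taken from the threads above; `meets_minimum` is a hypothetical helper, not part of vLLM or transformers):

```python
from importlib.metadata import PackageNotFoundError, version


def parse_version(v: str) -> tuple:
    """Parse a dotted version string like '4.36.1' into a comparable tuple of ints."""
    parts = []
    for piece in v.split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)


def meets_minimum(pkg: str, minimum: str) -> bool:
    """Return True if `pkg` is installed at version `minimum` or newer."""
    try:
        installed = version(pkg)
    except PackageNotFoundError:
        return False
    return parse_version(installed) >= parse_version(minimum)


# The Hugging Face discussions suggest transformers >= 4.36.1 for Mixtral support.
print("transformers OK:", meets_minimum("transformers", "4.36.1"))
print("vllm OK:", meets_minimum("vllm", "0.3.0"))
```

If either check prints `False`, upgrading that package (`pip install -U "transformers>=4.36.1"`) is the first thing to try before digging deeper.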
Has anyone experienced such an issue with Mixtral and vLLM? How did you solve it?
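For reference, this is roughly how I'm launching the server (a sketch of the invocation, not the exact command; AWQ checkpoints need the quantization mode passed explicitly, and vLLM v0.3.0 exposes it via `--quantization awq`):

```shell
# Serve the AWQ-quantized Mixtral with vLLM's OpenAI-compatible server.
# --quantization awq tells vLLM to load the AWQ checkpoint format.
python -m vllm.entrypoints.openai.api_server \
    --model TheBloke/Nous-Hermes-2-Mixtral-8x7B-DPO-AWQ \
    --quantization awq
```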