Replies: 5 comments 6 replies
- We're facing the same issue; any feedback here would be appreciated. Running vLLM online is currently not possible for us.
- Same issue. I'm offline 99% of the time and vLLM does not work offline. :(
- Using an absolute path to load the model, rather than the model name on Hugging Face, works for me. For example,
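  The original example did not survive extraction; a minimal sketch of what this approach looks like, with placeholder paths (the model directory here is hypothetical, not from the original comment):

  ```shell
  # Point vLLM at a local directory containing the model files
  # (config.json, tokenizer files, and weights) instead of a Hub name.
  # The path below is a placeholder -- use wherever you cloned the model.
  python -m vllm.entrypoints.openai.api_server \
      --model /opt/models/my-local-model
  ```

  Because the argument is a filesystem path rather than a `org/name` Hub identifier, vLLM loads the files directly and does not need to resolve anything against huggingface.co.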
- It works, but the generated responses are wrong most of the time.
- Setting the Hugging Face Transformers module to offline mode (via a global environment variable) worked for me:
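  The variable itself was cut off in the original comment; assuming the standard Hugging Face offline switches (`HF_HUB_OFFLINE` for `huggingface_hub`, `TRANSFORMERS_OFFLINE` for `transformers`), a minimal sketch:

  ```python
  import os

  # These must be set before transformers / huggingface_hub are imported,
  # otherwise the Hub client may already be configured for network access.
  os.environ["HF_HUB_OFFLINE"] = "1"        # huggingface_hub: no network calls
  os.environ["TRANSFORMERS_OFFLINE"] = "1"  # transformers: local cache only
  ```

  Equivalently, export them in the shell (`export HF_HUB_OFFLINE=1`) so every process in the session inherits them.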
- I cloned the model repository from Hugging Face to my local machine and used the `--download-dir` parameter to specify the directory. However, when running vLLM, it still tries to connect to Hugging Face, which doesn't work without an internet connection. Even after setting `export HF_HUB_OFFLINE=1`, offline mode doesn't seem to work. Is it possible to run vLLM offline, and if so, how can I achieve this?
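  For reference, the workflow described above looks roughly like this; the repository URL and directories are placeholders, not values from the original comment:

  ```shell
  # Clone the model repository locally (git-lfs is needed for the weights).
  git clone https://huggingface.co/<org>/<model> /opt/models/<model>

  # Force offline mode and point vLLM at the local copy.
  export HF_HUB_OFFLINE=1
  python -m vllm.entrypoints.openai.api_server \
      --model /opt/models/<model> \
      --download-dir /opt/models
  ```

  Note that `--download-dir` only controls where vLLM *stores* downloads; passing a local path to `--model` is what actually avoids contacting the Hub.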