Commit

[Examples] Specify version for vllm because vllm v0.6.4.post1 has an issue (#4391)

* [OCI] Specify the vllm version because the latest release, vllm v0.6.4.post1, has an issue

* Pin the version for vllm-flash-attn as well
HysunHe authored Nov 21, 2024
1 parent 627be72 commit ecf5b00
Showing 1 changed file with 2 additions and 2 deletions.
examples/oci/serve-qwen-7b.yaml: 2 additions & 2 deletions
@@ -13,8 +13,8 @@ resources:
 setup: |
   conda create -n vllm python=3.12 -y
   conda activate vllm
-  pip install vllm
-  pip install vllm-flash-attn
+  pip install vllm==0.6.3.post1
+  pip install vllm-flash-attn==2.6.2
 run: |
   conda activate vllm
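For readers applying the same pin elsewhere, here is a minimal sketch of the patched setup block, assuming the usual SkyPilot task layout; only the two pip pins come from this diff, and the run command shown is illustrative, since the actual serving command in serve-qwen-7b.yaml is not part of the hunk.

setup: |
  conda create -n vllm python=3.12 -y
  conda activate vllm
  # Pin both packages; these are the pins introduced by this commit.
  pip install vllm==0.6.3.post1
  pip install vllm-flash-attn==2.6.2
  # Optional sanity check (assumption, not in the diff): confirm the pinned version is installed.
  python -c "import vllm; print(vllm.__version__)"

run: |
  conda activate vllm
  # Illustrative serving command; the actual model path and flags in the example file are not shown in this diff.
  vllm serve Qwen/Qwen2-7B-Instruct --port 8000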
