
Commit

Update README.md (#152)
If you use flash-attn 2.3.6, you get an error:

ImportError: /root/miniconda3/envs/handbook/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops9_pad_enum4callERKNS_6TensorEN3c108ArrayRefINS5_6SymIntEEElNS5_8optionalIdEE

If we use the newest flash-attn version instead, the error does not occur.
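
A quick sanity check after installing, assuming the package exposes `flash_attn.__version__` (recent releases do), is to import it and print the version:

```shell
# Verify that flash-attn imports cleanly against the installed PyTorch build;
# an ABI mismatch surfaces here as the same undefined-symbol ImportError.
python -c "import flash_attn; print(flash_attn.__version__)"
```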
ZizhengYang authored Apr 25, 2024
1 parent cf1975a commit 84f8c92
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -78,11 +78,11 @@ python -m pip install .
You will also need Flash Attention 2 installed, which can be done by running:

```shell
-python -m pip install flash-attn==2.3.6 --no-build-isolation
+python -m pip install flash-attn --no-build-isolation
```

> **Note**
-> If your machine has less than 96GB of RAM and many CPU cores, reduce the `MAX_JOBS` arguments, e.g. `MAX_JOBS=4 pip install flash-attn==2.3.6 --no-build-isolation`
+> If your machine has less than 96GB of RAM and many CPU cores, reduce the `MAX_JOBS` arguments, e.g. `MAX_JOBS=4 pip install flash-attn --no-build-isolation`
Next, log into your Hugging Face account as follows:
