Commit: switch to build from source
mvpatel2000 committed Apr 25, 2024
1 parent e86d915 commit 8f0851e
5 changes: 4 additions & 1 deletion docker/Dockerfile
@@ -283,7 +283,10 @@ RUN if [ -n "$MOFED_VERSION" ] ; then \
 RUN if [ -n "$CUDA_VERSION" ] ; then \
     pip${PYTHON_VERSION} install --upgrade --no-cache-dir ninja==1.11.1 && \
     pip${PYTHON_VERSION} install --upgrade --no-cache-dir --force-reinstall packaging==22.0 && \
-    MAX_JOBS=1 pip${PYTHON_VERSION} install --no-cache-dir flash-attn==2.5.0; \
+    git clone --branch v2.5.7 https://github.com/Dao-AILab/flash-attention.git && \
+    cd flash-attention && \
+    MAX_JOBS=1 python${PYTHON_VERSION} setup.py install && \
+    cd .. ; \
     fi

 ###############
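The diff above swaps a prebuilt wheel (`flash-attn==2.5.0`) for a source build pinned to the `v2.5.7` tag, which keeps the compiled extension matched to the image's CUDA toolkit. A minimal standalone sketch of the same pattern follows; the version-suffixed `pip${PYTHON_VERSION}`/`python${PYTHON_VERSION}` commands are shortened here, and the final cleanup step is an illustrative assumption, not part of this commit:

```
# Hypothetical minimal fragment; the real docker/Dockerfile differs.
# ninja speeds up the CUDA extension build; packaging is required by setup.py.
RUN pip install --upgrade --no-cache-dir ninja==1.11.1 && \
    pip install --upgrade --no-cache-dir --force-reinstall packaging==22.0 && \
    git clone --branch v2.5.7 https://github.com/Dao-AILab/flash-attention.git && \
    cd flash-attention && \
    # MAX_JOBS=1 serializes the compile jobs to limit peak memory during the build
    MAX_JOBS=1 python setup.py install && \
    cd .. && \
    rm -rf flash-attention  # assumed cleanup to keep the image layer small
```

Pinning the clone with `--branch v2.5.7` makes the layer reproducible; an unpinned clone would silently track the default branch between builds.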
