
Commit

log when xentropy is not found
tmm1 committed Sep 4, 2023
1 parent cdeba07 commit abd4f9a
Showing 1 changed file with 3 additions and 1 deletion.
4 changes: 3 additions & 1 deletion src/axolotl/monkeypatch/llama_attn_hijack_flash.py
@@ -57,7 +57,9 @@ def replace_llama_attn_with_flash_attn(packed: Optional[bool] = False):
             CrossEntropyLoss, inplace_backward=True
         )
     except ImportError:
-        pass
+        LOG.info(
+            "optimized flash-attention CrossEntropyLoss not found (run `pip install git+https://github.com/Dao-AILab/flash-attention.git#egg=xentropy_cuda_lib&subdirectory=csrc/xentropy`)"
+        )
 
 
 # Disable the transformation of the attention mask in LlamaModel as the flash attention
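
For context, a minimal sketch of the optional-import pattern this hunk modifies: try to use flash-attention's fused CrossEntropyLoss and monkeypatch transformers with it, and on ImportError log a hint instead of silently passing. The surrounding function body, the flash_attn.losses.cross_entropy import path, and the patch_fused_cross_entropy wrapper name are assumptions for illustration and are not shown in this diff.

import logging
from functools import partial

import transformers

LOG = logging.getLogger(__name__)


def patch_fused_cross_entropy():
    # Hypothetical wrapper; in axolotl this logic lives inside
    # replace_llama_attn_with_flash_attn (see the hunk above).
    try:
        # flash-attention's fused loss; requires the xentropy CUDA extension.
        from flash_attn.losses.cross_entropy import CrossEntropyLoss

        # Swap the module-level CrossEntropyLoss used by the Llama modeling code.
        transformers.models.llama.modeling_llama.CrossEntropyLoss = partial(
            CrossEntropyLoss, inplace_backward=True
        )
    except ImportError:
        # Previously a bare `pass`; the commit makes the missing dependency visible.
        LOG.info(
            "optimized flash-attention CrossEntropyLoss not found (run "
            "`pip install git+https://github.com/Dao-AILab/flash-attention.git"
            "#egg=xentropy_cuda_lib&subdirectory=csrc/xentropy`)"
        )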
