Update llmfoundry/models/layers/attention.py
Co-authored-by: Daniel King <[email protected]>
ShashankMosaicML and dakinggg authored Dec 20, 2023
1 parent d2602b1 commit d94baa6
Showing 1 changed file with 1 addition and 1 deletion.
llmfoundry/models/layers/attention.py
@@ -35,7 +35,7 @@ def is_flash_v1_installed():
     return version.parse(flash_attn.__version__) < version.parse('2.0.0')
 
 
-def check_transformers_version(hf_version: str):
+def is_transformers_version_gte(hf_version: str) -> bool:
     return version.parse(transformers.__version__) >= version.parse(hf_version)
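
For reference, the version comparison the renamed helper performs can be sketched standalone with the `packaging` library. The two-argument helper name `is_version_gte` below is illustrative only (the commit's actual function reads `transformers.__version__` directly):

```python
from packaging import version


def is_version_gte(installed: str, required: str) -> bool:
    # Mirrors the comparison in is_transformers_version_gte:
    # parse both version strings and compare them semantically,
    # so e.g. '4.10.0' correctly sorts above '4.9.0'.
    return version.parse(installed) >= version.parse(required)


# Example: gate a code path on a minimum library release.
if is_version_gte('4.36.2', '4.28.0'):
    pass  # safe to use the newer API here
```

Parsing with `version.parse` rather than comparing raw strings is the point of the helper: lexicographic string comparison would mis-order versions like '4.9.0' and '4.10.0'.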
