
Commit

cache false
dakinggg committed Dec 15, 2023
1 parent 57d2aed commit fccf089
Showing 1 changed file with 1 addition and 0 deletions.
llmfoundry/models/hf/hf_causal_lm.py (1 change: 1 addition, 0 deletions)
@@ -108,6 +108,7 @@ def __init__(self, om_model_config: Union[DictConfig,
             trust_remote_code=trust_remote_code,
             use_auth_token=use_auth_token,
             attn_implementation=requested_attention_implementation,
+            use_cache=False,
         )
         # config._flash_attn_2_enabled = use_flash_attention_2

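For context, a minimal sketch of what this one-line change amounts to, not the repository's exact code: the Hugging Face config is built with the KV cache disabled, since use_cache only benefits incremental decoding at inference time and adds memory overhead during training. The model name, auth flag, and attention implementation below are hypothetical stand-ins for values llm-foundry derives from om_model_config.

```python
from transformers import AutoConfig, AutoModelForCausalLM

# Hypothetical stand-in values; llm-foundry resolves the real ones from its
# own om_model_config (pretrained name, auth token, requested attention impl,
# which would typically be "flash_attention_2" or "eager").
model_name = "mosaicml/mpt-7b"
trust_remote_code = True
use_auth_token = False
requested_attention_implementation = "eager"

# Build the config with the KV cache turned off, mirroring the diff above.
config = AutoConfig.from_pretrained(
    model_name,
    trust_remote_code=trust_remote_code,
    use_auth_token=use_auth_token,
    attn_implementation=requested_attention_implementation,
    use_cache=False,
)

# Instantiate the model from that config; generation code can still re-enable
# the cache at inference time if needed.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    config=config,
    trust_remote_code=trust_remote_code,
    use_auth_token=use_auth_token,
)
```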
