
Commit

..
ShashankMosaicML committed Nov 27, 2024
1 parent 2ae6027 commit 0c5150a
Showing 1 changed file with 0 additions and 6 deletions.
6 changes: 0 additions & 6 deletions llmfoundry/models/layers/attention.py
@@ -863,12 +863,6 @@ def __init__(
                     'flex_attn_config must be provided for flex attention.',
                 )
             self.flex_attn_config = flex_attn_config
-            self.compiled_flex_attention = self.flex_attn_config.pop(
-                'compiled_flex_attention',
-            )
-            self.compiled_create_block_mask = self.flex_attn_config.pop(
-                'compiled_create_block_mask',
-            )
 
     def forward(
         self,
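For context: the six deleted lines popped two precompiled callables out of flex_attn_config and stored them on the attention module. Below is a minimal sketch of how such callables are typically built with PyTorch's flex attention API; the dict keys mirror the ones removed by this commit, but the construction itself is an assumption for illustration, not code taken from the repository.

import torch
from torch.nn.attention.flex_attention import create_block_mask, flex_attention

# Compiling flex_attention and create_block_mask with torch.compile is the
# usual way to run them efficiently; objects like these are what the removed
# lines pulled out of the config.
compiled_flex_attention = torch.compile(flex_attention)
compiled_create_block_mask = torch.compile(create_block_mask)

# Hypothetical config dict: the two keys match those removed in this diff.
flex_attn_config = {
    'compiled_flex_attention': compiled_flex_attention,
    'compiled_create_block_mask': compiled_create_block_mask,
}

After this change, __init__ no longer extracts these entries, so they remain in self.flex_attn_config or are handled elsewhere; the diff alone does not show which.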
