
Commit

Update attention.py
DongHande authored Oct 9, 2023
1 parent 7376b42 commit 34b7ab1
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion optimum/bettertransformer/models/attention.py
@@ -747,7 +747,7 @@ def gpt_bigcode_wrapped_scaled_dot_product(
         is_causal = True

     sdpa_result = torch.nn.functional.scaled_dot_product_attention(
-        query, key, value, attn_mask=attention_mask, dropout_p=dropout_p, is_causal=is_causal
+        query, key, value, attn_mask=attn_mask, dropout_p=dropout_p, is_causal=is_causal
     )

     if self.multi_query:
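For context, the call being fixed is PyTorch's `torch.nn.functional.scaled_dot_product_attention`; the diff replaces the keyword value `attention_mask` (not defined in that scope) with the locally prepared `attn_mask`. As a rough illustration of what that function computes, here is a pure-Python sketch of scaled dot-product attention (an assumed simplification: additive float mask, no dropout, and `is_causal` treated as mutually exclusive with an explicit mask, as in PyTorch's API):

```python
import math

def scaled_dot_product_attention(query, key, value, attn_mask=None, is_causal=False):
    """Sketch of softmax(Q @ K^T / sqrt(d) + mask) @ V.

    query, key, value: lists of row vectors (L x d, S x d, S x dv).
    attn_mask: optional additive L x S float mask (like attn_mask=...).
    """
    assert not (is_causal and attn_mask is not None), "pass one or the other"
    d = len(query[0])
    L, S = len(query), len(key)
    # Attention scores: Q @ K^T, scaled by 1/sqrt(d)
    scores = [
        [sum(q[k] * kv[k] for k in range(d)) / math.sqrt(d) for kv in key]
        for q in query
    ]
    for i in range(L):
        for j in range(S):
            if is_causal and j > i:
                scores[i][j] = float("-inf")  # block attention to future positions
            if attn_mask is not None:
                scores[i][j] += attn_mask[i][j]  # additive mask semantics
    # Row-wise softmax over keys, then weighted sum of value rows
    out = []
    for row in scores:
        m = max(row)
        e = [math.exp(s - m) for s in row]
        z = sum(e)
        w = [x / z for x in e]
        out.append([sum(w[j] * value[j][k] for j in range(S)) for k in range(len(value[0]))])
    return out
```

With `is_causal=True`, the first query position can only attend to the first key, so the first output row equals the first value row, which is a quick way to sanity-check the masking.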
