Merge pull request #88 from casper-hansen/fix_padding_mask
Fix unexpected keyword
casper-hansen authored Oct 2, 2023
2 parents 3fa7400 + d016c51 commit a4ea423
Showing 1 changed file with 2 additions and 1 deletion: awq/modules/fused/attn.py
@@ -146,7 +146,8 @@ def _get_attention_shapes(self, attention_shapes, max_seq_len):
 
     def forward(
         self,
-        hidden_states:torch.Tensor, past_key_value=None, attention_mask=None, position_ids=None, output_attentions=False, use_cache=False
+        hidden_states:torch.Tensor, past_key_value=None, attention_mask=None, position_ids=None,
+        output_attentions=False, use_cache=False, *args, **kwargs
     ):
         bsz, seqlen, _ = hidden_states.shape
         if bsz != self.cache_batch_size:
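The mechanism behind the fix can be sketched in isolation: a Python method raises `TypeError` when a caller passes a keyword argument its signature does not name, and appending `*args, **kwargs` makes the signature absorb any extras instead. The class and tensor shapes below are hypothetical stand-ins, not the actual AutoAWQ module; the point is only the signature change.

```python
class FusedAttention:
    # Hypothetical minimal sketch of the patched signature. Before the fix,
    # a caller passing a keyword the signature does not name (for example a
    # padding-related mask argument) would raise:
    #   TypeError: forward() got an unexpected keyword argument ...
    # Adding *args, **kwargs lets the method accept and ignore such extras.
    def forward(self, hidden_states, past_key_value=None, attention_mask=None,
                position_ids=None, output_attentions=False, use_cache=False,
                *args, **kwargs):
        # Unrecognized keywords land in kwargs and are simply unused here.
        return hidden_states


attn = FusedAttention()
# An extra keyword no longer raises TypeError; it is silently absorbed.
out = attn.forward([[0.0] * 8], padding_mask=None)
```

This is a common compatibility pattern when a custom module must remain callable by framework code whose argument list can grow between releases.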
