
Commit

Fix unexpected keyword
casper-hansen committed Oct 2, 2023
1 parent 3fa7400 commit d016c51
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion awq/modules/fused/attn.py
@@ -146,7 +146,8 @@ def _get_attention_shapes(self, attention_shapes, max_seq_len):

     def forward(
         self,
-        hidden_states:torch.Tensor, past_key_value=None, attention_mask=None, position_ids=None, output_attentions=False, use_cache=False
+        hidden_states:torch.Tensor, past_key_value=None, attention_mask=None, position_ids=None,
+        output_attentions=False, use_cache=False, *args, **kwargs
     ):
         bsz, seqlen, _ = hidden_states.shape
         if bsz != self.cache_batch_size:
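The change adds `*args, **kwargs` to the fused attention `forward()` signature so that extra keyword arguments passed by a caller no longer raise `TypeError: ... got an unexpected keyword argument`. A minimal sketch of why this works, using hypothetical classes and a hypothetical `padding_mask` keyword (not the actual AWQ module or the specific argument that triggered the bug):

```python
class StrictForward:
    # Before the fix: only the declared keywords are accepted.
    def forward(self, hidden_states, attention_mask=None):
        return hidden_states


class LenientForward:
    # After the fix: unknown positional/keyword arguments are absorbed
    # by *args / **kwargs and simply ignored.
    def forward(self, hidden_states, attention_mask=None, *args, **kwargs):
        return hidden_states


strict = StrictForward()
lenient = LenientForward()

# A caller passing an extra keyword breaks the strict signature...
try:
    strict.forward([1, 2, 3], padding_mask=None)
except TypeError as exc:
    print("strict raised:", exc)

# ...but the lenient signature accepts and ignores it.
result = lenient.forward([1, 2, 3], padding_mask=None)
print("lenient returned:", result)
```

The trade-off of this pattern is that misspelled keyword arguments are silently swallowed instead of raising, so it is best reserved for interface boundaries where upstream callers may evolve their call signatures independently.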
