fix triton matmul
king-menin committed Nov 17, 2021
1 parent 50eee08 commit e186fcd
Showing 1 changed file with 3 additions and 3 deletions.
src/mpu/transformer.py (3 additions & 3 deletions)
```diff
@@ -125,9 +125,9 @@ def forward(self, hidden_states, ltor_mask):
 
         if self.use_deepspeed_sparse:
             context_layer = self.sparse_self_attention(
-                query_layer,
-                key_layer,
-                value_layer,
+                query_layer.float(),
+                key_layer.float(),
+                value_layer.float(),
                 attn_mask=ltor_mask)
         else:
             # Raw attention scores. [b, np, s, s]
```
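The change casts the query, key, and value tensors to fp32 before calling DeepSpeed's sparse self-attention, presumably because the Triton matmul kernels backing it rejected the incoming dtype (e.g. half precision) in this setup. A minimal sketch of the pattern, not the repository's exact code: `sparse_self_attention` is assumed to be an already-constructed `deepspeed.ops.sparse_attention.SparseSelfAttention` module, and the cast back to the original dtype is a hypothetical extension, since the commit itself only casts the inputs.

```python
import torch

def sparse_attention_fp32(sparse_self_attention, query_layer, key_layer,
                          value_layer, ltor_mask):
    # Remember the incoming precision (e.g. torch.float16 under mixed precision).
    orig_dtype = query_layer.dtype

    # Cast Q/K/V to fp32 so the underlying Triton matmul kernels
    # receive a dtype they accept, as the commit does.
    context_layer = sparse_self_attention(
        query_layer.float(),
        key_layer.float(),
        value_layer.float(),
        attn_mask=ltor_mask)

    # Hypothetical extra step: cast the result back so downstream
    # layers keep their original precision.
    return context_layer.to(orig_dtype)
```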
