audiolm-pytorch/audiolm_pytorch/attend.py, Line 112 in 879f3bd:

assert not exists(attn_bias), 'attention bias not supported for flash attention'
Why not add the bias and the mask together, create an attn_mask of type float, and supply it to scaled_dot_product_attention as attn_mask? Isn't that the same as what we do when not using flash attention?
amitaie changed the title to "Question about 'attention bias not supported for flash attention'" on Sep 14, 2023
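To make the question concrete, here is a minimal sketch (not the repository's actual code) of what it proposes: folding an additive float bias and a boolean mask into a single float attn_mask and passing it to F.scaled_dot_product_attention. The function name sdpa_with_bias_and_mask and the tensor shapes are assumptions chosen for illustration.

```python
# Minimal sketch of the question's proposal, assuming:
#   attn_bias: additive float bias of shape (batch, heads, q_len, k_len)
#   mask:      boolean tensor (True = attend), broadcastable to the same shape
import torch
import torch.nn.functional as F

def sdpa_with_bias_and_mask(q, k, v, attn_bias=None, mask=None):
    # Fold the boolean mask and the additive bias into one float attn_mask.
    attn_mask = attn_bias
    if mask is not None:
        # Use a large negative value rather than -inf to avoid NaNs on fully masked rows.
        neg_value = torch.finfo(q.dtype).min
        if attn_mask is None:
            attn_mask = torch.zeros_like(mask, dtype=q.dtype)
        attn_mask = attn_mask.masked_fill(~mask, neg_value)
    # A float attn_mask is added to the attention scores before the softmax.
    return F.scaled_dot_product_attention(q, k, v, attn_mask=attn_mask)

# usage example
b, h, n, d = 2, 8, 128, 64
q, k, v = (torch.randn(b, h, n, d) for _ in range(3))
bias = torch.randn(b, h, n, n)
mask = torch.ones(b, 1, 1, n, dtype=torch.bool)
out = sdpa_with_bias_and_mask(q, k, v, attn_bias=bias, mask=mask)
```

Note that this is mathematically the same as the non-flash path, but supplying an arbitrary attn_mask typically prevents PyTorch's SDPA from dispatching to the flash-attention backend, so it falls back to the memory-efficient or math kernel; that dispatch limitation is likely the reason for the assertion.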