I'm trying to understand what's going on with thread groups in the CUDA attention kernel: why they are used, and so on. I've been studying the code for a bit, but I still don't understand how `THREAD_GROUP_SIZE` works. Why is it set the way it is?
Reference: `THREAD_GROUP_SIZE` in vllm/csrc/attention/attention_kernels.cu, line 116 (commit 1b290ac).
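
For context, here is a small, self-contained sketch of how I currently read the logic around that line. The constant values and names below are illustrative (my own, not copied from the file), so please correct me if my mental model is off:

```cpp
// Sketch of how I understand THREAD_GROUP_SIZE, using made-up example values.
#include <cstdio>

constexpr int WARP_SIZE = 32;

// BLOCK_SIZE = number of tokens stored per KV-cache block (a template
// parameter in the real kernel); 16 is just an example value here.
constexpr int BLOCK_SIZE = 16;

// My reading: one thread group cooperates on a single key token, so that a
// full warp covers one KV block. That gives WARP_SIZE / BLOCK_SIZE threads
// per token, clamped to at least 1 when BLOCK_SIZE >= WARP_SIZE.
constexpr int THREAD_GROUP_SIZE =
    (WARP_SIZE / BLOCK_SIZE > 1) ? (WARP_SIZE / BLOCK_SIZE) : 1;

// Each thread in the group then loads a vector of elements sized so that the
// group as a whole fetches 16 bytes per access (a coalesced 16-byte load).
// Example: fp16 (2 bytes) with THREAD_GROUP_SIZE = 2 gives 4 elements/thread.
constexpr int ELEM_BYTES = 2;  // assuming half precision for this example
constexpr int VEC_SIZE =
    (16 / (THREAD_GROUP_SIZE * ELEM_BYTES) > 1)
        ? (16 / (THREAD_GROUP_SIZE * ELEM_BYTES))
        : 1;

int main() {
  printf("THREAD_GROUP_SIZE = %d, VEC_SIZE = %d elements per thread\n",
         THREAD_GROUP_SIZE, VEC_SIZE);
  return 0;
}
```

Is that roughly the intent, i.e. `THREAD_GROUP_SIZE` exists so that each group issues full 16-byte loads per key token regardless of the block size and data type?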