Hi, in Equation 4 of the paper there is a multiplication of `MLP(C_q)` and `PE(x_q, y_q)`. But in the actual implementation, the `query_pos` (i.e. `PE(x_q, y_q)`) is fed directly into the cross-attention and is not multiplied by `MLP(C_q)`. Instead, the `pos_transformation` is multiplied by `MLP(C_q)` twice, which seems inconsistent with the paper. Is there something wrong with my understanding?
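To make sure we are talking about the same thing, here is a minimal sketch of how I read Equation 4, i.e. the element-wise modulation of `PE(x_q, y_q)` by `MLP(C_q)`. The tensor shapes and the two-layer MLP below are only placeholders for illustration, not the repository's actual modules:

```python
import torch
import torch.nn as nn

# Placeholder sizes, not taken from the repository.
num_queries, batch, d_model = 300, 2, 256

content_query = torch.randn(num_queries, batch, d_model)  # C_q (decoder content embedding)
pos_embed = torch.randn(num_queries, batch, d_model)      # PE(x_q, y_q) (sinusoidal embedding)

# MLP(C_q): a small feed-forward network over the content query (hypothetical stand-in).
mlp = nn.Sequential(
    nn.Linear(d_model, d_model),
    nn.ReLU(),
    nn.Linear(d_model, d_model),
)

# Equation 4 as I read it: element-wise modulation of the positional encoding,
# which would then serve as the positional part of the cross-attention query.
query_pos_modulated = mlp(content_query) * pos_embed
```

In the code, however, `query_pos` appears to bypass this modulation and go straight into the cross-attention, which is the inconsistency I am asking about.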