Hello authors @houqb, thanks for your hard work! After reading the paper and looking at the diagrams, I wonder whether it would be a good idea to combine CA and SE, turning the attention module into a 3-branch one. Has the team run any experiments on this idea? Thanks!
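For concreteness, here is a minimal PyTorch sketch of what such a 3-branch module could look like: the two coordinate branches from CA (pooling along H and along W) plus an SE-style global channel branch, fused multiplicatively. The class name `ThreeBranchAttention`, the multiplicative fusion, and the `reduction=32` default are my own assumptions for illustration, not anything taken from the paper or this repo.

```python
import torch
import torch.nn as nn

class ThreeBranchAttention(nn.Module):
    """Hypothetical 3-branch attention: CA's two coordinate branches
    plus an SE-style global channel branch. An illustration of the
    question above, not the authors' method."""

    def __init__(self, channels, reduction=32):
        super().__init__()
        mid = max(8, channels // reduction)
        # Shared 1x1 transform after coordinate pooling, as in CA
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn1 = nn.BatchNorm2d(mid)
        self.act = nn.Hardswish()
        # Per-direction attention maps (the two CA branches)
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)
        # SE-style global channel branch
        self.fc = nn.Sequential(
            nn.Linear(channels, mid),
            nn.ReLU(inplace=True),
            nn.Linear(mid, channels),
        )

    def forward(self, x):
        n, c, h, w = x.shape
        # Coordinate pooling: (n, c, h, 1) and (n, c, w, 1)
        x_h = x.mean(dim=3, keepdim=True)
        x_w = x.mean(dim=2, keepdim=True).permute(0, 1, 3, 2)
        # Concatenate, transform, then split back per direction
        y = self.act(self.bn1(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                      # (n, c, h, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (n, c, 1, w)
        # SE branch: global average pool -> channel weights
        a_c = torch.sigmoid(self.fc(x.mean(dim=(2, 3)))).view(n, c, 1, 1)
        # Fuse the three branches multiplicatively (one choice among several)
        return x * a_h * a_w * a_c
```

Multiplicative fusion keeps each branch a plain reweighting of the input; an additive or learned-gate fusion would be another reasonable variant to try.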