Attention modules and pretrained networks #1004


@eovallemagallanes if you pass different args that essentially change the model architecture, there won't be pretrained weights for that... only defined model configs that have URLs set for their weights can be used with the pretrained flag.
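For instance, a minimal sketch with the public timm API (the model name here is just an example): only registered configs with weight URLs can load pretrained weights, and `timm.list_models(pretrained=True)` shows which those are.

```python
import timm

# A registered config with a weight URL: pretrained weights load fine.
model = timm.create_model('resnet50', pretrained=True)

# List the ResNet-family configs that actually ship pretrained weights.
print(timm.list_models('resnet*', pretrained=True))
```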

Taking ECA attention as an example, the ResNet class lets you specify attention modules: https://github.com/rwightman/pytorch-image-models/blob/f7d210d759beb00a3d0834a3ce2d93f6e17f3d38/timm/models/resnet.py#L1243
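As a hedged sketch (assuming `create_model` forwards the `block_args` kwarg through to the ResNet constructor, as the linked code suggests), swapping in ECA changes the architecture, so there are no matching pretrained weights and the model starts from random init:

```python
import timm

# Custom attention via block_args changes the architecture, so no
# pretrained weights exist for it; train from scratch (pretrained=False).
model = timm.create_model(
    'resnet50',
    pretrained=False,
    block_args=dict(attn_layer='eca'),
)
```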

You can use anything in https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/layers/create_attn.py, although the ResNet setup is designed for channel-attention-style modules such as SE / ECA / etc. that aren't too large.
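A small sketch of `create_attn` used standalone (module names per the linked file; exact constructor kwargs may differ across timm versions):

```python
import torch
from timm.models.layers import create_attn

# Build a channel attention module by name; the first positional arg
# is the number of input channels, per the factory in create_attn.py.
attn = create_attn('eca', 64)

x = torch.randn(2, 64, 32, 32)
out = attn(x)
print(out.shape)  # channel attention preserves shape: [2, 64, 32, 32]
```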

Byob/ByoaNet are …

Answer selected by eovallemagallanes