Push models to hub with custom embed_layer
#2013
Hi! Let's say I have the following ViT model, which uses a custom embed layer:

```python
model = timm.create_model('vit_tiny_patch16_224', embed_layer=MyEmbedLayer)
```

How is it possible to push it to the hub (using `timm.models.push_to_hf_hub`)? For example, doing the following:

```python
timm.models.push_to_hf_hub(
    model,
    repo_id="vit_tiny_patch16_224.embed_layer_custom_MyEmbedLayer",
    private=True,
)

downloaded_model = timm.create_model(
    'hf-hub:1aurent/vit_tiny_patch16_224.embed_layer_custom_MyEmbedLayer',
    pretrained=True,
)
```

results in an error because the weights cannot be mapped properly.
This can obviously be fixed by doing:

```python
downloaded_model = timm.create_model(
    'hf-hub:1aurent/vit_tiny_patch16_224.embed_layer_custom_MyEmbedLayer',
    pretrained=True,
    embed_layer=MyEmbedLayer,
)
```

But this requires users to copy-paste the `MyEmbedLayer` definition themselves before they can load the model. I know that it's possible to push models with custom code when using transformers; is there a way to do something similar with timm? Thanks!
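The thread doesn't show `MyEmbedLayer` itself. As a minimal stand-in to make the example self-contained, a custom embed layer could subclass timm's `PatchEmbed` (a sketch, assuming a recent timm where `PatchEmbed` is exported from `timm.layers`; the extra normalization is purely illustrative):

```python
import timm
import torch.nn as nn
from timm.layers import PatchEmbed


class MyEmbedLayer(PatchEmbed):
    """Hypothetical custom patch embedding: standard PatchEmbed plus an extra norm."""

    def __init__(self, *args, embed_dim=768, **kwargs):
        # VisionTransformer passes img_size, patch_size, in_chans, embed_dim, etc.
        # as keyword arguments, so forwarding them to PatchEmbed keeps shapes intact.
        super().__init__(*args, embed_dim=embed_dim, **kwargs)
        self.extra_norm = nn.LayerNorm(embed_dim)

    def forward(self, x):
        # (B, N, C) token sequence from the parent class, then the extra norm.
        return self.extra_norm(super().forward(x))


model = timm.create_model('vit_tiny_patch16_224', embed_layer=MyEmbedLayer)
```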
@Laurent2916 yes, I don't currently have a means of supporting custom code via the hub like transformers does. I actually haven't looked into it super closely. If you've dug in and have any ideas / feel it's possible, I'd be open to working on a PR. It'd be useful for this case, and also for supporting 'adapters' that add non-trivial heads to the models, other custom layers that can be configured, etc.
I noticed you were uploading some neat timm (& transformers) models to the HF hub, nice!
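For reference, the transformers mechanism referred to above ships the model's source code alongside the weights and config, and users opt in to running it with `trust_remote_code=True`. A rough sketch of that flow, with hypothetical class names and repo id:

```python
import torch.nn as nn
from transformers import PretrainedConfig, PreTrainedModel


class MyConfig(PretrainedConfig):
    model_type = "my_custom_model"

    def __init__(self, hidden_size=192, **kwargs):
        super().__init__(**kwargs)
        self.hidden_size = hidden_size


class MyModel(PreTrainedModel):
    config_class = MyConfig

    def __init__(self, config):
        super().__init__(config)
        self.proj = nn.Linear(config.hidden_size, config.hidden_size)

    def forward(self, x):
        return self.proj(x)


# Registering for auto classes records an auto_map entry in config.json so the
# hub repo knows which custom classes to load. The classes must live in a
# regular .py file (not a notebook) for the source to be uploaded with the model.
MyConfig.register_for_auto_class()
MyModel.register_for_auto_class("AutoModel")

model = MyModel(MyConfig())
model.push_to_hub("username/my-custom-model")  # hypothetical repo id

# Downstream users then load it with:
# AutoModel.from_pretrained("username/my-custom-model", trust_remote_code=True)
```

As the reply notes, timm has no equivalent hook today; a PR in this direction would need to add one.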