How to change all ReLU to TLU in pretrained models? #375
Replies: 2 comments
-
@mobassir94 For most models here, with any standard activation you can just …
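The visible part of the reply points at swapping the activation when the model is created. A minimal sketch of what that could look like with timm, assuming the skresnet34 entrypoint forwards an `act_layer` argument to the model constructor (as the ResNet-family models in timm generally do); the model name and activation choice here are only examples:

```python
import torch.nn as nn
import timm

# Assumption: this entrypoint accepts act_layer and uses it in place of nn.ReLU
# throughout the network. Any parameter-free standard activation can be swapped
# in this way.
model = timm.create_model('skresnet34', pretrained=True, act_layer=nn.SiLU)
```

Note that TLU cannot be dropped in via `act_layer` like this, because it needs a per-layer `num_features`, which is exactly the problem raised in the question further down.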
-
@rwightman any plan on releasing some pretrained models using FRN+TLU? Not everyone has fancy hardware, and with small batch sizes the error rate suffers. If you could release some pretrained models that use FRN+TLU, a lot of deep learning practitioners like me could do well with them. Some of your pretrained models are very nice, and I would like to use them on my hardware (a 3060 Ti with 8 GB VRAM). At the very least, a step-by-step tutorial on how to apply FRN and TLU to your pretrained weights would be very helpful.
-
I was trying to replace all the ReLU activation functions in the skresnet34 model with TLU, as shown here: https://github.com/yukkyo/PyTorch-FilterResponseNormalizationLayer/blob/master/frn.py
and in the Filter Response Normalization Layer paper.
I was trying this:
It replaces all ReLU with TLU, but I can't pass the exact value of num_features to TLU while making the replacement. How do I modify this line of code:
to change all ReLU to TLU?
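A minimal sketch of one way to do this, assuming a TLU defined as in the FRN paper (max(x, tau) with a learnable per-channel tau, matching the linked frn.py): walk the module tree, remember the channel count of the most recent Conv2d or BatchNorm2d, and use it as num_features when swapping each nn.ReLU. This relies on the activation appearing after the layer that determines its channel count in the module's child ordering, which holds for ResNet-style timm models but is not guaranteed for arbitrary architectures:

```python
import torch
import torch.nn as nn
import timm


class TLU(nn.Module):
    """Thresholded Linear Unit from the FRN paper: max(x, tau) with a
    learnable per-channel threshold tau."""
    def __init__(self, num_features):
        super().__init__()
        self.tau = nn.Parameter(torch.zeros(1, num_features, 1, 1))

    def forward(self, x):
        return torch.max(x, self.tau)


def replace_relu_with_tlu(model, in_chs=3):
    """Recursively swap every nn.ReLU for a TLU sized from the most
    recently seen Conv2d / BatchNorm2d channel count."""
    def _replace(module, num_features):
        for name, child in module.named_children():
            if isinstance(child, nn.Conv2d):
                num_features = child.out_channels
            elif isinstance(child, nn.BatchNorm2d):
                num_features = child.num_features
            if isinstance(child, nn.ReLU):
                setattr(module, name, TLU(num_features))
            else:
                num_features = _replace(child, num_features)
        return num_features

    _replace(model, in_chs)
    return model


model = timm.create_model('skresnet34', pretrained=True)
model = replace_relu_with_tlu(model)
# Sanity check: no nn.ReLU modules should remain.
assert not any(isinstance(m, nn.ReLU) for m in model.modules())
```

Keep in mind that a faithful FRN+TLU setup would also replace the BatchNorm2d layers with FRN, and the pretrained weights will need fine-tuning after either swap.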