How to judge which evonorm sample2d or evonorm batch2d to use

Replies: 1 comment
@cymdhx It depends on your needs and constraints. Unfortunately, both are pretty slow as implemented at the Python API level. Now that I'm working with PyTorch XLA, I plan to try them again, since the compiled XLA + TPU combination might remove some of the speed penalty relative to BatchNorm and GroupNorm with their optimized kernels. The sample variant is closer to GroupNorm in terms of what the normalizing statistics are computed over, while the batch variant is closer to BatchNorm and has similar concerns regarding batch size and the train/eval mode switch (to running stats).
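
For context on the distinction above, here is a minimal, illustrative sketch of the two statistic choices, roughly following the EvoNorm-S0 (sample/group stats) and EvoNorm-B0 (batch stats) formulations from the "Evolving Normalization-Activation Layers" paper. This is not the timm implementation; the class names, default hyperparameters, and simplifications are assumptions made for illustration only.

```python
import torch
import torch.nn as nn


class EvoNorm2dS0Sketch(nn.Module):
    """Sample-based EvoNorm (S0-style): statistics are computed per sample
    over channel groups, like GroupNorm, so behaviour does not depend on
    batch size or train/eval mode."""

    def __init__(self, num_channels, groups=32, eps=1e-5):
        super().__init__()
        assert num_channels % groups == 0
        self.groups = groups
        self.eps = eps
        self.gamma = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, num_channels, 1, 1))
        self.v = nn.Parameter(torch.ones(1, num_channels, 1, 1))

    def forward(self, x):
        n, c, h, w = x.shape
        # Group variance per sample (GroupNorm-style statistics)
        x_g = x.reshape(n, self.groups, -1)
        std = x_g.var(dim=-1, keepdim=True, unbiased=False).add(self.eps).sqrt()
        std = std.reshape(n, self.groups, 1, 1).repeat_interleave(c // self.groups, dim=1)
        return x * torch.sigmoid(self.v * x) / std * self.gamma + self.beta


class EvoNorm2dB0Sketch(nn.Module):
    """Batch-based EvoNorm (B0-style): per-channel variance over the batch
    with running stats, like BatchNorm, so it inherits BatchNorm's
    batch-size sensitivity and its train/eval behaviour switch."""

    def __init__(self, num_channels, eps=1e-5, momentum=0.1):
        super().__init__()
        self.eps = eps
        self.momentum = momentum
        self.gamma = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, num_channels, 1, 1))
        self.v = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        self.register_buffer('running_var', torch.ones(1, num_channels, 1, 1))

    def forward(self, x):
        if self.training:
            # Variance over batch + spatial dims, per channel (BatchNorm-style)
            var = x.var(dim=(0, 2, 3), keepdim=True, unbiased=False)
            self.running_var.mul_(1 - self.momentum).add_(var.detach() * self.momentum)
        else:
            # Eval mode falls back to running statistics
            var = self.running_var
        # Per-sample, per-channel (instance) variance for the denominator term
        inst_std = x.var(dim=(2, 3), keepdim=True, unbiased=False).add(self.eps).sqrt()
        denom = torch.maximum((var + self.eps).sqrt(), self.v * x + inst_std)
        return x / denom * self.gamma + self.beta


if __name__ == '__main__':
    x = torch.randn(8, 64, 16, 16)
    print(EvoNorm2dS0Sketch(64)(x).shape, EvoNorm2dB0Sketch(64)(x).shape)
```

The practical takeaway matches the reply above: the sample (S0) variant computes its statistics per sample, so small or varying batch sizes and eval mode don't change its behaviour, while the batch (B0) variant relies on batch statistics and running stats and therefore behaves differently between train and eval.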