accumulate_grad warning when training convnext_tiny with pytorch_lightning and gradient accumulation #1104
Unanswered
ray-lee-94 asked this question in Q&A
Replies: 1 comment · 2 replies
-
@VCBE123 you'll have to provide a lot more detail about your environment, hardware, model, etc. I do have an idea of what could cause it, but I can't reproduce it on PyTorch 1.10+ on a 3090, and there were some incorrect triggers for this warning in older versions of PyTorch. It is also, as it says, a warning, and may not be impacting anything.
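For anyone hitting the same message, a minimal diagnostic sketch along the lines of the detail being asked for: it runs one backward pass and lists every parameter whose gradient strides disagree with the parameter's own strides. The model name and the use of channels_last (one common way to end up with mixed layouts) are assumptions for illustration, not taken from the thread.

```python
# Minimal sketch, not from the thread. Assumptions: timm is installed,
# convnext_tiny is the model in question, and channels_last training is in use.
import torch
import timm

model = timm.create_model('convnext_tiny').to(memory_format=torch.channels_last)
x = torch.randn(2, 3, 224, 224).to(memory_format=torch.channels_last)
model(x).sum().backward()

# Print every parameter whose gradient strides disagree with the parameter's
# own strides -- the condition the gradient layout contract warns about.
for name, p in model.named_parameters():
    if p.grad is not None and p.grad.stride() != p.stride():
        print(f'{name}: param strides {p.stride()}, grad strides {p.grad.stride()}')
```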
-
```
[W accumulate_grad.h:170] Warning: grad and param do not obey the gradient layout contract. This is not an error, but may impair performance.
grad.sizes() = [96, 1, 7, 7], strides() = [49, 49, 7, 1]
param.sizes() = [96, 1, 7, 7], strides() = [49, 1, 7, 1] (function operator())
```
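Reading the strides above: for a [96, 1, 7, 7] depthwise convolution weight (convnext_tiny uses 7x7 depthwise convs), [49, 49, 7, 1] is the ordinary contiguous (NCHW) layout while [49, 1, 7, 1] is the channels_last (NHWC) layout, so the gradient is contiguous but the parameter is channels_last. A small sketch (not from the thread) that materializes both layouts explicitly:

```python
# Sketch: the two stride patterns named in the warning, built explicitly.
import torch

sizes = (96, 1, 7, 7)
grad = torch.empty(sizes)                           # contiguous NCHW
param = torch.empty_strided(sizes, (49, 1, 7, 1))   # channels_last NHWC

print(grad.stride())   # (49, 49, 7, 1) -- matches grad strides() in the warning
print(param.stride())  # (49, 1, 7, 1)  -- matches param strides() in the warning

# With a singleton channel dim, the two layouts address memory identically:
# element (n, 0, h, w) sits at offset 49*n + 7*h + w under either set of
# strides, so only the stride metadata differs, not the memory traffic.
```

That the two layouts coincide in memory when the channel dim is 1 is consistent with the reply above: the warning may be a spurious trigger on depthwise weights rather than a real performance problem.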