Hi, in our final code we calculate the regularization loss and then sum it with the data loss. The problem, as I see it, is that we never make this total loss the final output of the last layer and multiply the partial gradients by it; instead, we backpropagate the loss exactly as we would without any regularization loss. (In short, the effect of the regularization loss never appears in the forward pass.)
I therefore think we should use
loss_activation.backward(Loss, y)
instead of
loss_activation.backward(loss_activation.output, y).
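For reference, here is a compact, self-contained sketch of the pattern I am describing. This is my own re-implementation, not the book's exact code: the class names (Dense, SoftmaxCrossentropy) and the regularization_loss helper are made up for illustration, though the call sequence is meant to mirror the book's.

```python
import numpy as np

class Dense:
    def __init__(self, n_in, n_out, l2=0.0):
        self.weights = 0.01 * np.random.randn(n_in, n_out)
        self.biases = np.zeros((1, n_out))
        self.l2 = l2  # L2 regularization strength

    def forward(self, inputs):
        self.inputs = inputs
        self.output = inputs @ self.weights + self.biases

    def backward(self, dvalues):
        self.dweights = self.inputs.T @ dvalues
        # As I understand the book's Dense layer, the L2 term enters the
        # backward pass here, on the weight gradients, not through the
        # loss activation's backward call.
        self.dweights += 2 * self.l2 * self.weights
        self.dbiases = dvalues.sum(axis=0, keepdims=True)
        self.dinputs = dvalues @ self.weights.T

class SoftmaxCrossentropy:
    def forward(self, inputs, y_true):
        exp = np.exp(inputs - inputs.max(axis=1, keepdims=True))
        self.output = exp / exp.sum(axis=1, keepdims=True)  # softmax probs
        correct = self.output[range(len(y_true)), y_true]
        return -np.log(correct).mean()  # data loss only

    def backward(self, dvalues, y_true):
        # Gradient of combined softmax + cross-entropy: probs minus one-hot
        self.dinputs = dvalues.copy()
        self.dinputs[range(len(y_true)), y_true] -= 1
        self.dinputs /= len(y_true)

def regularization_loss(layer):
    return layer.l2 * np.sum(layer.weights ** 2)

# Forward pass
X = np.random.randn(5, 4)
y = np.array([0, 1, 2, 1, 0])
dense1 = Dense(4, 3, l2=5e-4)
loss_activation = SoftmaxCrossentropy()
dense1.forward(X)
data_loss = loss_activation.forward(dense1.output, y)
# The scalar regularization term is added only to the reported loss value
loss = data_loss + regularization_loss(dense1)

# Backward pass as in the book: the softmax output, not the scalar loss,
# is passed in
loss_activation.backward(loss_activation.output, y)
dense1.backward(loss_activation.dinputs)
```

As the comments note, in this sketch the scalar regularization term only changes the reported loss value, while the weight gradients pick up the L2 term inside Dense.backward; whether that fully accounts for the regularization effect is exactly what I want to confirm.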
I would be glad to hear your opinion.
Best Regards