
chapter14 -Problem in Regularization_loss #14

Open
Ali-Mohammadi65748 opened this issue Jan 17, 2022 · 0 comments
Ali-Mohammadi65748 commented Jan 17, 2022

Hi, in the final code we calculate the regularization loss and then sum it with the data loss. The problem, I think, is that we never use this total loss as the final output of the last layer and multiply the partial gradients by it; instead we just backpropagate the loss exactly as we would without regularization.
(In summary, the effect of the regularization loss is missing from the backward pass.)
I think we should use
loss_activation.backward(Loss, y)
instead of
loss_activation.backward(loss_activation.output, y).
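
For context, here is a minimal sketch (with hypothetical variable names, not the book's exact code) of the pattern being discussed: a single dense layer with an L2 penalty. In the usual formulation the regularization term contributes to the gradient through dweights (since d(λ·w²)/dw = 2·λ·w), rather than through the value passed to the loss activation's backward call:

```python
import numpy as np

np.random.seed(0)

# Hypothetical toy setup: one dense layer, batch of 4 samples
weights = np.random.randn(3, 2) * 0.1
X = np.random.randn(4, 3)
lambda_l2 = 5e-4  # L2 regularization strength

# Forward pass: layer output, plus the L2 penalty that is
# summed with the data loss to report the total loss
output = X @ weights
regularization_loss = lambda_l2 * np.sum(weights ** 2)

# Backward pass: dvalues stands in for the gradient of the
# data loss with respect to the layer's output
dvalues = np.ones_like(output)
dweights = X.T @ dvalues

# The regularization gradient 2 * lambda * w is added directly
# to the weight gradient; the scalar total loss itself is not
# fed back through the activation/loss backward call
dweights += 2 * lambda_l2 * weights
```

Under this formulation, backpropagating `loss_activation.output` handles the data-loss part, and the regularization part enters only where the weights themselves appear.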

I would be glad to hear your opinion.

Best Regards

@Ali-Mohammadi65748 Ali-Mohammadi65748 changed the title chapter14 chapter14 -Problem in Regularization_loss Jan 17, 2022