Memory requirement in TF #22

Open · olegarch opened this issue Apr 4, 2019 · 2 comments

@olegarch commented Apr 4, 2019

I was using a batch size of 32, and once I switched to the Lovász loss I had to decrease the batch size to 8 to be able to train the same model; otherwise an OOM error is thrown.
Is this expected?

@bermanmaxim (Owner) commented

Hello,
Normally the memory usage should be about the same, but I have only experimented with PyTorch. Maybe the sort algorithm is taking more memory than necessary. I may investigate in the future if I find the time.
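For context, here is a minimal sketch of the sort step in question, assuming the TF port follows the paper's flat hinge formulation (function names are illustrative, not necessarily the repository's exact code). The descending sort runs over every pixel in the batch at once, so `tf.nn.top_k` materialises the sorted values plus an int32 permutation of size B·H·W, and autodiff keeps those intermediates alive for the backward pass, which could plausibly explain the extra memory:

```python
import tensorflow as tf

def lovasz_grad(gt_sorted):
    """Gradient of the Lovasz extension w.r.t. sorted errors."""
    gts = tf.reduce_sum(gt_sorted)
    intersection = gts - tf.cumsum(gt_sorted)
    union = gts + tf.cumsum(1.0 - gt_sorted)
    jaccard = 1.0 - intersection / union
    # consecutive differences give the per-position weights
    return tf.concat([jaccard[0:1], jaccard[1:] - jaccard[:-1]], axis=0)

def lovasz_hinge_flat(logits, labels):
    """logits, labels: flat 1-D float tensors over all pixels in the
    batch; labels in {0, 1}."""
    signs = 2.0 * labels - 1.0
    errors = 1.0 - logits * signs
    # full descending sort over B*H*W elements: top_k keeps both the
    # sorted values and an int32 permutation of the same size alive
    # for backprop
    errors_sorted, perm = tf.nn.top_k(errors, k=tf.shape(errors)[0])
    gt_sorted = tf.gather(labels, perm)
    grad = lovasz_grad(gt_sorted)
    return tf.tensordot(tf.nn.relu(errors_sorted), tf.stop_gradient(grad), 1)
```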

@marcocaccin commented

Can confirm that the Lovász loss eats up a lot more memory than, say, cross-entropy. This is on TF 2.1. I'll see if there's any low-hanging fruit for this contributions-welcome tag ;-)
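One mitigation worth testing: computing the loss per image shrinks each sort from B·H·W to H·W elements, at the cost of a loop over the batch. A sketch, assuming a flat hinge like the `lovasz_hinge_flat` above is available (the PyTorch version of the repo exposes a `per_image` flag for exactly this; I haven't verified how the TF port handles it):

```python
import tensorflow as tf

def lovasz_hinge_per_image(logits, labels):
    """logits: [B, H, W] float; labels: [B, H, W] in {0, 1}.
    Sorting each image separately caps the top_k intermediates at H*W."""
    def one_image(args):
        log, lab = args
        return lovasz_hinge_flat(tf.reshape(log, [-1]),
                                 tf.reshape(tf.cast(lab, tf.float32), [-1]))
    # dtype= works on TF 2.1 (fn_output_signature= only arrived later)
    losses = tf.map_fn(one_image, (logits, labels), dtype=tf.float32)
    return tf.reduce_mean(losses)
```

If the activations rather than the sort dominate peak memory, wrapping the loss computation in `tf.recompute_grad` is another option.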
