question about batch size and learning rate #10

Open
tejas-gokhale opened this issue Oct 22, 2020 · 1 comment

@tejas-gokhale
Hello authors,

I have access to a GPU server that can handle larger batch sizes, say around 128 (or more). I believe this would reduce training time by roughly 4x. What learning rate would you recommend for these larger batch sizes? In your experience, is there a good heuristic for adjusting the learning rate along with the batch size when training GANs?

On a slightly unrelated note -- have you tried using Distributed Data Parallel to speed up training? We've been trying to use it but are running into strange errors; maybe you have some insights? If we figure it out, I'd be happy to share the code and contribute it here.

Thanks!

@tohinz (Owner) commented Oct 26, 2020

Hi, I have no direct experience with scaling learning rates for larger batch sizes in GANs, but related work suggests that scaling the learning rate with the batch size is a good starting point (see e.g. Section 4.7 in https://www.jmlr.org/papers/volume20/18-789/18-789.pdf). The two popular rules are to scale the learning rate linearly with the batch-size ratio, or with its square root. So if you increase the batch size you should likely also increase the learning rate, but by exactly how much is hard to predict in advance.
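For concreteness, here is a minimal sketch of those two rules (not code from this repo; the base learning rate and batch size below are hypothetical placeholders):

```python
import math

def scale_lr(base_lr: float, base_bs: int, new_bs: int, rule: str = "linear") -> float:
    """Scale a learning rate when the batch size changes.

    rule="linear": lr grows proportionally to the batch size
                   (linear scaling rule, Goyal et al., 2017).
    rule="sqrt":   lr grows with the square root of the batch-size ratio
                   (Krizhevsky, 2014).
    """
    ratio = new_bs / base_bs
    if rule == "linear":
        return base_lr * ratio
    if rule == "sqrt":
        return base_lr * math.sqrt(ratio)
    raise ValueError(f"unknown rule: {rule}")

# Hypothetical example: going from batch size 32 at lr 2e-4 to batch size 128.
print(scale_lr(2e-4, 32, 128, "linear"))  # 8e-4
print(scale_lr(2e-4, 32, 128, "sqrt"))    # ~4e-4
```

In practice I'd treat both values as starting points for a small sweep rather than final settings.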

Regarding distributed data parallel training I have not experimented with that but I'd be happy about your contributions and experiences.
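In case it helps with debugging, here is a minimal single-node PyTorch DistributedDataParallel sketch, assuming it is launched with `torchrun --nproc_per_node=NUM_GPUS train.py`. The model, dataset, and loss are toy stand-ins, not this repository's training code:

```python
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset

def main():
    # torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE in the environment.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Toy stand-ins for the real generator/dataset.
    model = nn.Linear(64, 64).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    dataset = TensorDataset(torch.randn(1024, 64))
    sampler = DistributedSampler(dataset)  # shards the data across processes
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)

    opt = torch.optim.Adam(model.parameters(), lr=2e-4)
    for epoch in range(2):
        sampler.set_epoch(epoch)  # ensures a different shuffle each epoch
        for (x,) in loader:
            x = x.cuda(local_rank, non_blocking=True)
            loss = model(x).pow(2).mean()  # dummy loss
            opt.zero_grad()
            loss.backward()  # DDP all-reduces gradients across processes here
            opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

A common source of "weird errors" with DDP is forgetting the `DistributedSampler` (so every process trains on the same data) or forgetting `sampler.set_epoch`, so that may be worth checking first.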
