
Is the default learning rate (0.0001) good, or is 0.00001 better? #28

Closed
man0007 opened this issue Apr 8, 2020 · 2 comments

@man0007

man0007 commented Apr 8, 2020

Hi,

I was trying to tweak the learning_rate and dropout parameters for the handwriting_line_recognition.py model.

Since there is not much change in the loss when varying the dropout parameter (20%, 35%, 50%), I'm keeping the default value.

But changing the learning rate from 0.0001 to 0.00001 gives a large increase in the stability of the model, as plotted below (the training loss stays close to the test loss).

plotted graph image: https://prnt.sc/rv6lzm

graph_label notations:

lr-e5 => learning_rate = 0.00001
lr-e4 => learning_rate = 0.0001

-> The bottom two lines are the train and test loss curves for learning_rate = 0.0001, and all the lines above them are plotted for 0.00001. We can see the bottom two lines are not stable, whereas the other lines are very stable (the training loss stays close to the test loss).

Since lr 0.00001 looks better than 0.0001, can we make 0.00001 the default, or would we face any other problem with this new learning rate?
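
For reference, this is roughly how I'm setting these two hyperparameters, assuming a Gluon setup like the one in handwriting_line_recognition.py (the network and argument names below are simplified stand-ins, not the script's actual code):

```python
import mxnet as mx
from mxnet import gluon

# Stand-in network; the real recognition model lives in handwriting_line_recognition.py
net = gluon.nn.HybridSequential()
net.add(gluon.nn.Dense(128, activation='relu'),
        gluon.nn.Dropout(0.5),             # tried 0.20 / 0.35 / 0.50 with little change in loss
        gluon.nn.Dense(10))
net.initialize(mx.init.Xavier())

learning_rate = 1e-5                       # lr-e5; the current default is 1e-4 (lr-e4)
trainer = gluon.Trainer(net.collect_params(), 'adam',
                        {'learning_rate': learning_rate})
```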

Please advise.

Thanks,
Anand.

@jonomon
Contributor

jonomon commented Apr 8, 2020

From the plot it seems like the loss is smaller for the default LR compared to the LR you suggested.

@man0007
Author

man0007 commented Apr 11, 2020

Yeah, that's true. My assumption is that if I increase the number of epochs, it may converge to the same loss as lr=0.0001; that way we can get both stability and a reduced loss.
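
Roughly the experiment I have in mind, as a sketch (train_data, the network, and the loss function are placeholders for whatever the script actually uses):

```python
import mxnet as mx
from mxnet import autograd, gluon

def train(net, train_data, lr, epochs, ctx=mx.cpu()):
    # Same loop in both runs, just a lower learning rate and more epochs for lr-e5.
    trainer = gluon.Trainer(net.collect_params(), 'adam', {'learning_rate': lr})
    loss_fn = gluon.loss.SoftmaxCrossEntropyLoss()   # stand-in; the script defines its own loss
    for epoch in range(epochs):
        total, batches = 0.0, 0
        for data, label in train_data:
            data, label = data.as_in_context(ctx), label.as_in_context(ctx)
            with autograd.record():
                loss = loss_fn(net(data), label)
            loss.backward()
            trainer.step(data.shape[0])
            total += loss.mean().asscalar()
            batches += 1
        print(f'epoch {epoch}: mean loss {total / batches:.4f}')

# e.g. compare train(net, train_data, lr=1e-4, epochs=N)
#      with    train(net, train_data, lr=1e-5, epochs=3 * N)
```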

@jonomon jonomon closed this as completed May 5, 2020