
setting pretraining learning rate in command line interface #1905

Open
2533245542 opened this issue Jan 10, 2025 · 1 comment
Labels
question Further information is requested

Comments

@2533245542

I tried to set it by

litgpt pretrain --config pythia14m.yaml --optimizer.class_path torch.optim.AdamW --optimizer.init_args.lr 1e-5

but it is not working.

2533245542 added the question label Jan 10, 2025

Kiwy3 commented Jan 17, 2025

Hello,

To define optimizer parameters, the --optimizer argument takes a dictionary. I had the same problem; you can use, for instance:

--optimizer '{"class_path": "torch.optim.AdamW", "init_args": {"lr": 0.001, "weight_decay": 1e-05, "betas": [0.9, 0.999]}}'
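
Applied to the original command, the full invocation would look roughly like this (a sketch assuming the same pythia14m.yaml config and the 1e-5 learning rate from the question):

litgpt pretrain --config pythia14m.yaml --optimizer '{"class_path": "torch.optim.AdamW", "init_args": {"lr": 1e-5}}'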
