Training hyperparameters #26
See the README (https://github.com/openai/supervised-reptile/blob/master/README.md#reproducing-training-runs), which describes how to run the experiments with the correct arguments.
In the README, `--train-shots` is 10; however, this is supposed to be the 5-way 1-shot setting. Am I missing something? Transductive 1-shot 5-way Omniglot:

```
python -u run_omniglot.py --shots 1 --inner-batch 10 --inner-iters 5 --meta-step 1 --meta-batch 5 --meta-iters 100000 --eval-batch 5 --eval-iters 50 --learning-rate 0.001 --meta-step-final 0 --train-shots 10 --checkpoint ckpt_o15t --transductive
```
Does this mean that during meta-learning we train with 5-way 10-shot tasks, but at test time we evaluate on 5-way 1-shot tasks?
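The distinction the question raises can be sketched as episode sampling with separate shot counts for meta-training and evaluation. This is a hypothetical illustration, not the repo's actual sampling code: `sample_episode`, `train_shots`, and `test_shots` are made-up names standing in for what `--train-shots` and `--shots` control.

```python
import random

def sample_episode(dataset, num_classes=5, train_shots=10, test_shots=1):
    """Sample one few-shot episode (hypothetical sketch, not the repo's API).

    The idea under discussion: meta-training may draw more support
    examples per class (--train-shots 10) than the 1-shot setting
    used at evaluation (--shots 1); the two flags govern different phases.
    """
    classes = random.sample(sorted(dataset), num_classes)
    support, query = [], []
    for label, cls in enumerate(classes):
        examples = random.sample(dataset[cls], train_shots + test_shots)
        support += [(x, label) for x in examples[:train_shots]]
        query += [(x, label) for x in examples[train_shots:]]
    return support, query

# Toy dataset: 8 classes with 20 string "examples" each.
data = {f"c{i}": [f"c{i}_ex{j}" for j in range(20)] for i in range(8)}

# Meta-training episode: 5-way, 10 support shots per class.
s, q = sample_episode(data, num_classes=5, train_shots=10, test_shots=1)
assert len(s) == 5 * 10 and len(q) == 5 * 1

# Evaluation episode: 5-way, 1 support shot per class.
s, q = sample_episode(data, num_classes=5, train_shots=1, test_shots=1)
assert len(s) == 5 * 1 and len(q) == 5 * 1
```

Under this reading the answer to the question would be yes: the support-set size differs between meta-training and evaluation, while the evaluation task itself stays 5-way 1-shot.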
@siavash-khodadadeh I have the same question. I also notice the paper's description of evaluation: it is a different setting from MAML's, which uses k_qry = 15 (that is, 15 query examples per class) to evaluate itself. Doesn't that make the experimental comparison unfair?
Hello,
I'm hoping to confirm that the hyperparameters specified in your paper are correct. Specifically, for miniImageNet, were 100k meta steps taken during training? I ask because some of the default values in the code appear to differ.