How do I fill in the loss of a hyperparam causing divergence? #57
Comments
You should mark those as "constraint violations" (take a look at this paper). Hope that helps!

On Wed, Jul 20, 2016 at 4:10 PM, Henggang Cui [email protected]

Hi, thank you for pointing out the constrained chooser. I am now using the GPConstrainedEIChooser. However, I get the following exception. What could be wrong?

Traceback (most recent call last):

Thank you very much!
Cui

I find that 'constraint_gain' is not dumped to the state file:

Thank you,
Hello,
I'm using spearmint-lite to search for a good step size and regularization term for my machine learning application. If I understand correctly, after running with each proposed set of step size and regularization term, I should fill in the training loss (if I'm not interested in the validation loss) in the result.dat file.
However, I find that my application sometimes diverges (the loss becomes nan or inf) with the step size and regularization term that Spearmint proposes, and Spearmint does not accept nan or inf values as the result. In that case, how do I tell Spearmint that this set of step size and regularization term leads to divergence?
Thank you!
Cui
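For what it's worth, one common workaround when a chooser rejects nan/inf results is to record a large finite penalty for the diverged run instead. The sketch below does this when appending a line to spearmint-lite's results file; the results-file column layout and the penalty value are assumptions on my part, and with GPConstrainedEIChooser the intended mechanism is to mark the point as a constraint violation, as suggested earlier in the thread.

```python
import math

def record_result(loss, params, results_path="results.dat"):
    """Append one experiment line to a spearmint-lite-style results file.

    Assumed line layout (check your spearmint-lite setup):
        <result> <duration> <param1> <param2> ...

    A diverged run (nan or inf loss) is replaced with a large finite
    penalty so the optimizer is steered away from that region.  The
    penalty value 1e6 is a hypothetical choice, not part of Spearmint.
    """
    if not math.isfinite(loss):
        loss = 1e6  # hypothetical penalty for a diverged run
    line = "%f 0 %s\n" % (loss, " ".join(str(p) for p in params))
    with open(results_path, "a") as f:
        f.write(line)
```

A diverged run reported as `float('nan')` would then be written with the finite penalty, while a normal loss is written unchanged.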