
How do I fill in the loss of a hyperparam causing divergence? #57

Open
cuihenggang opened this issue Jul 20, 2016 · 3 comments

Comments

@cuihenggang

Hello,

I'm using spearmint-lite to search for a good step size and regularization term for my machine learning application. If I understand correctly, I should fill in the training loss (if I'm not interested in the validation loss) in the result.dat file after running each proposed combination of step size and regularization term.

However, I find that my application sometimes diverges (with nan or inf loss) under the step size and regularization term proposed by Spearmint, and Spearmint does not accept nan or inf values as the result. In that case, how do I tell Spearmint that this combination of step size and regularization term leads to divergence?

Thank you!
Cui
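
(For reference, spearmint-lite records one experiment per line in its results file, roughly as <result> <time-taken> <parameter values...>, with proposals that have not yet been run marked by P P; see the repository README for the exact file name and column order. A completed line and a pending line might look like:

    0.4712 138.2 0.01 0.0001
    P P 0.1 0.001

where the trailing columns here would be the step size and regularization term. A diverged run is exactly the case where there is no finite <result> to fill in.)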

@JasperSnoek
Owner

You should mark those as "constraint violations" (take a look at this paper for details: http://arxiv.org/abs/1403.5607). This is implemented in GPConstrainedEIChooser.py:
https://github.com/JasperSnoek/spearmint/blob/master/spearmint/spearmint/chooser/GPConstrainedEIChooser.py

Hope that helps!
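
(In other words: run spearmint-lite with GPConstrainedEIChooser as the chooser, and record diverged runs as NaN so the chooser can model them as constraint violations rather than ordinary losses. A minimal sketch of the reporting side, assuming the chooser flags violations via NaN results — I believe that is the convention in GPConstrainedEIChooser, but check the source; the helper name below is hypothetical:

    import math

    def loss_to_result(loss):
        # Hypothetical helper: map a training loss to the value recorded
        # for Spearmint. A diverged run (nan or inf loss) is reported as
        # NaN so that GPConstrainedEIChooser can treat the point as a
        # constraint violation instead of an objective value.
        if math.isnan(loss) or math.isinf(loss):
            return float('nan')
        return loss

    # e.g. loss_to_result(float('inf')) -> nan, loss_to_result(0.37) -> 0.37
)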


@cuihenggang
Author

cuihenggang commented Jul 22, 2016

Hi,

Thank you for pointing out the constrained chooser. I am now using the GPConstrainedEIChooser. However, I get the following exception. What could be wrong?

Traceback (most recent call last):
  File "spearmint/spearmint-lite/spearmint-lite.py", line 219, in <module>
    main()
  File "spearmint/spearmint-lite/spearmint-lite.py", line 84, in main
    main_controller(options, args)
  File "spearmint/spearmint-lite/spearmint-lite.py", line 190, in main_controller
    np.nonzero(grid_idx == 0)[0])
  File "/usr/local/lib/python2.7/dist-packages/chooser/GPConstrainedEIChooser.py", line 240, in next
    durations[complete])
  File "/usr/local/lib/python2.7/dist-packages/chooser/GPConstrainedEIChooser.py", line 161, in _real_init
    self.constraint_gain = state['constraint_gain']
KeyError: 'constraint_gain'

Thank you very much!

Cui

@cuihenggang
Author

I found that 'constraint_gain' is not included in the state dict that gets dumped to the state file:

cPickle.dump({ 'dims' : self.D,

Thank you,
Cui
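
(A minimal sketch of the two ways this mismatch could be patched, assuming only the 'dims' and 'constraint_gain' keys seen above; the file name and default value here are made up:

    import cPickle  # Python 2, matching the traceback above

    # Reproduce the mismatch: the chooser state is pickled without
    # 'constraint_gain', but _real_init later indexes the loaded dict
    # with state['constraint_gain'], raising KeyError.
    state = {'dims': 3}

    # Fix (a): include the missing key when dumping the chooser state.
    state['constraint_gain'] = 1.0  # hypothetical default
    with open('chooser_state.pkl', 'wb') as fh:
        cPickle.dump(state, fh)

    # Fix (b): read the key defensively when restoring the state.
    with open('chooser_state.pkl', 'rb') as fh:
        loaded = cPickle.load(fh)
    constraint_gain = loaded.get('constraint_gain', 1.0)
)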
