Conditional Search Space #1835
zhenlan0426 started this conversation in Ideas
Replies: 1 comment · 6 replies
-
There are a few ways to potentially do this; I haven't come across a great general solution in the context of BO. http://proceedings.mlr.press/v108/ma20a/ma20a.pdf proposes tree-structured covariance functions and ways of optimizing them. @dme65 tried this approach in the past, but it's a bit iffy and didn't work all that well on the real-world problems we tried it out on. @saitcakmak has had some success with custom kernels for less general problems where the "conditionality" is only a single layer deep.
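To illustrate the single-layer case: encode the architecture choice and all branch parameters in one flat tensor, and let each branch's sub-kernel contribute covariance only when both inputs actually lie on that branch. Here is a minimal GPyTorch sketch, assuming the architecture index sits in column 0; `SingleLayerConditionalKernel` and `branch_dims` are hypothetical names, not an existing BoTorch/GPyTorch API:

```python
import torch
from gpytorch.kernels import Kernel, MaternKernel


class SingleLayerConditionalKernel(Kernel):
    """Sum of per-branch kernels, each masked so it only contributes
    when both inputs use that branch's architecture."""

    def __init__(self, branch_dims, **kwargs):
        super().__init__(**kwargs)
        # branch_dims[i]: column indices of the parameters that are
        # active under architecture i
        self.branch_dims = branch_dims
        self.branch_kernels = torch.nn.ModuleList(
            [MaternKernel(ard_num_dims=len(dims)) for dims in branch_dims]
        )

    def forward(self, x1, x2, diag=False, **params):
        # Column 0 holds the integer-coded architecture choice.
        arch1 = x1[..., 0].round().long()
        arch2 = x2[..., 0].round().long()
        out = None
        for i, dims in enumerate(self.branch_dims):
            # Evaluate the branch kernel on that branch's columns only.
            k = self.branch_kernels[i].forward(
                x1[..., dims], x2[..., dims], diag=diag, **params
            )
            m1 = (arch1 == i).to(x1.dtype)
            m2 = (arch2 == i).to(x2.dtype)
            # Keep the contribution only where both points use branch i.
            mask = m1 * m2 if diag else m1.unsqueeze(-1) * m2.unsqueeze(-2)
            out = k * mask if out is None else out + k * mask
        return out
```

Under this sketch, points on different branches have zero cross-covariance, so observations on one architecture don't inform another; sharing information across branches (e.g. via an extra kernel on parameters common to all architectures) is where tree-structured constructions like the one in the paper above come in.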
-
I am looking for ways to define a conditional search space. For example, a parameter that chooses an architecture, where each architecture then has its own parameters. The conditions can be nested, so the most general case is a tree-like search space.

Conceptually, I can think of ways to define a kernel over such a space, so the GP should be feasible. But I am not sure how optimization would work in that space.

I found this issue in Ax. It seems that they flatten the search space. I don't think that is a good idea: once flattened, the optimizer no longer knows the structure of the search space, it will evaluate parts of the space that are meaningless (e.g. the parameters of a different architecture than the chosen one), and the flattened space is much bigger and hence harder to optimize.
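For concreteness, here is a hypothetical sketch (names and bounds made up) of such a tree-structured space next to the flattened view:

```python
# Hypothetical tree-structured search space: a top-level "arch" choice,
# each branch with its own parameters. One level deep here; nesting
# would attach further sub-trees under individual parameters.
tree_space = {
    "arch": {
        "mlp": {"num_layers": (1, 8), "hidden_dim": (16, 512)},
        "cnn": {"num_filters": (8, 128), "kernel_size": (3, 7)},
    }
}

# Flattened view: every branch parameter becomes an always-present
# top-level dimension. A candidate now carries values for *all*
# branches, so the model also has to fit, e.g., cnn.kernel_size on
# points where arch == "mlp", even though it had no effect there.
flat_space = {
    "arch": ["mlp", "cnn"],
    "mlp.num_layers": (1, 8),
    "mlp.hidden_dim": (16, 512),
    "cnn.num_filters": (8, 128),
    "cnn.kernel_size": (3, 7),
}
```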
Any reference to papers/code would be helpful. Cheers!