Replies: 1 comment
-
Would it be as simple as creating candidates for Thompson sampling across the global search space, sampling batch_size plus some extra margin of points, discarding those that lie within the trust regions, and then selecting the best of the remainder up to batch_size?
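A minimal sketch of that idea, assuming a fitted BoTorch `model`, global `bounds` as a `2 x d` tensor, and the trust regions given as a list of `(lb, ub)` box tensors. The names `batch_size`, `margin`, and `n_candidates` are illustrative, not part of the BoTorch API:

```python
import torch
from torch.quasirandom import SobolEngine
from botorch.generation import MaxPosteriorSampling


def ts_outside_trust_regions(model, bounds, trust_regions,
                             batch_size=4, margin=4, n_candidates=2048):
    """Thompson-sample over the global space, then drop points inside trust regions."""
    d = bounds.shape[-1]
    # Candidate pool spanning the *global* search space
    sobol = SobolEngine(dimension=d, scramble=True)
    pool = bounds[0] + (bounds[1] - bounds[0]) * sobol.draw(n_candidates).to(bounds)

    # Draw batch_size plus some extra margin so discarding still leaves enough points
    ts = MaxPosteriorSampling(model=model, replacement=False)
    X = ts(pool, num_samples=batch_size + margin)

    # Discard points that fall inside any existing trust-region box
    keep = torch.ones(X.shape[0], dtype=torch.bool, device=X.device)
    for lb, ub in trust_regions:
        keep &= ~((X >= lb) & (X <= ub)).all(dim=-1)
    return X[keep][:batch_size]
```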
-
Hi Botorch community,
As mentioned in previous Q&As (sorry for the spam), I am testing some setups that use SEBO for sparsity optimization with the L1_norm penalty, and experimenting with adapting search space constraints between iterations.
A concrete example would be a 10-parameter range search space. At, say, iteration 3, a candidate is found that evaluates below a preset threshold, indicating it is a promising candidate to explore further. This triggers an upper-constraint adaptation on all parameters in the search space, down to the values of that candidate (perhaps plus some margin). This would effectively drive SEBO's exploration downwards and explore this candidate more narrowly.
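For concreteness, a hypothetical version of that adaptation step (minimization assumed, non-negative parameter ranges assumed; `rel_margin` and all other names are illustrative, not Ax/BoTorch API):

```python
import torch


def adapt_upper_bounds(bounds, x_cand, y_cand, threshold, rel_margin=0.1):
    """Shrink the global upper bounds towards a promising candidate, plus a margin."""
    lb, ub = bounds[0], bounds[1]
    if y_cand >= threshold:  # not promising: leave the search space untouched
        return bounds
    # Pull the upper constraints down to the candidate values (with a relative margin);
    # assumes non-negative parameters, as in the sparsity setting described above
    new_ub = torch.minimum(ub, x_cand * (1.0 + rel_margin))
    new_ub = torch.maximum(new_ub, lb)  # keep the box valid (ub >= lb)
    return torch.stack([lb, new_ub])
```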
While this approach appears useful in simulation, it does cause the optimization as a whole to narrow towards that candidate. An alternative would be to split the candidate out of the search space to further optimize its sparsity and performance, while still keeping options open to discover other candidates. I suppose this could effectively be done with trust regions, one per candidate and one comprising the remaining search space, where the candidate regions are further subjected to the adapted constraints.
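A rough sketch of such per-candidate trust regions, using a fixed TuRBO-style side length (`length` is a fraction of the global range; all names are illustrative). The resulting `(lb, ub)` boxes could feed directly into the global-space filter sketched in the reply above:

```python
import torch


def trust_region_boxes(candidates, bounds, length=0.2):
    """One axis-aligned box per promising candidate, clipped to the global bounds."""
    half = 0.5 * length * (bounds[1] - bounds[0])
    boxes = []
    for x in candidates:
        lb = torch.maximum(x - half, bounds[0])
        ub = torch.minimum(x + half, bounds[1])
        boxes.append((lb, ub))
    return boxes
```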
What I am struggling with, though, is this: suppose a promising candidate is composed of parameter 0 and parameter 1, out of 10. It may well be that parameter 0 or 1 could be used for another candidate in combination with parameters 2-9. A simple (if crude) solution would be to split out parameters 2-9 and search them for candidates, but that would also mean parameters 0 and 1 could not be included in those candidates in any way. Is there an approachable way to generate candidates in the global search space using all parameters, while avoiding those that include or rely heavily on both parameter 0 and parameter 1? (I realize there is ambiguity in this last part.)
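One approachable reading of that last part: before the acquisition step, drop pool candidates in which parameters 0 and 1 are *both* active, i.e. both far from the sparsity target (taken as 0 here). This is an illustrative sketch, not a BoTorch feature; a softer alternative would be to penalize rather than hard-filter such candidates, which keeps parameters 0 and 1 individually available to other candidates:

```python
import torch


def drop_jointly_active(pool, active_dims=(0, 1), tol=1e-3):
    """Remove candidates where all of `active_dims` are simultaneously non-zero."""
    active = pool[:, list(active_dims)].abs() > tol
    return pool[~active.all(dim=-1)]
```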
I have yet to explore the low-level workings of the BoTorch acquisition API; this would be an opportunity to do so, but a pointer in the right direction would be nice :)
Cheers