Parameter Constraint at ALEBO #424
Hey,
I'm trying to implement some parameter constraints in my ALEBO optimization. When adding the constraint, the following ValueError occurs before the first trial evaluation starts; without the constraint, the optimization runs without any problems. The error occurs in optimize() of the managed loop.
Is there any possibility to implement parameter constraints in ALEBO?
Comments
Hi, @pzmijewski, I get a sense that this might be a bug actually. Before the error, do you see this log by any chance: https://github.com/facebook/Ax/blob/master/ax/service/managed_loop.py#L186? If so, what exception does it log? cc @bletham to respond re: support for constraints for ALEBO in general.
Hi @lena-kashtelyan
@pzmijewski as you have discovered, the ALEBO implementation does not currently support parameter constraints. In principle it can: the acquisition function optimization is already being done with a whole bunch of constraints to represent the high-dimensional box bounds, so adding a few more parameter constraints really wouldn't be a big deal.

The challenge comes with generating the embedding. With parameter constraints, a totally random embedding might be a really bad choice; in fact, there is a chance of generating an embedding that is entirely infeasible with respect to the parameter constraint (something that won't happen with box bounds alone, since the embedding is centered with respect to the box bounds). For example, consider a 2-d problem with a 1-d embedding, where the box bounds are the usual (for embedding BO) [-1, 1] and there is a parameter constraint x_1 + x_2 >= 0.5. A random 1-d embedding along the direction (1, -1) has x_1 + x_2 = 0 everywhere, so it contains no feasible points at all. So if we have parameter constraints, we need to generate the embedding in a way that leaves a high proportion of feasible volume. This is something I've thought about a little, but not enough to be able to say what the best approach would be, or to have implemented anything, which is why parameter constraints are simply disallowed in the current ALEBO implementation.

One natural approach would be to generate a whole bunch of random embeddings, estimate the feasible volume of each of them (for instance using Monte Carlo sampling), and then choose the embedding with the highest feasible volume. But this is a pretty big change in how the embedding is generated, so I think we'd want to check empirically that things still work. An alternative approach would be to find an interior point of the high-dimensional feasible set (something like [0.75, 0.75] in the example above; in general an approximate centroid seems ideal) and then generate an embedding that is guaranteed to contain that point but is otherwise random: there are d degrees of freedom in the d-dimensional embedding, so we would use one of them to force the embedding through a point interior to the feasible set, and the other d-1 would be set randomly as usual. In any case, there is a bit of work to be done to figure out which approach actually works well, and that work hasn't been done yet.
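To make that first idea a bit more concrete, here is a rough sketch of how candidate embeddings could be scored (illustration only, not Ax code; the helper names are made up and the parameter constraints are assumed to be given as A @ x <= b in the centered [-1, 1]^D space):

```python
import numpy as np

def feasible_proportion(B, A, b, n_samples=10000, seed=0):
    """Monte Carlo estimate of the proportion of the embedding's box-feasible
    region that also satisfies the parameter constraints A @ x <= b.

    B is a (d x D) projection; a point y in the embedding maps back to the
    high-dimensional space as x = pinv(B) @ y.
    """
    d, D = B.shape
    B_pinv = np.linalg.pinv(B)  # (D, d)
    # Any y whose back-projection lies in [-1, 1]^D satisfies
    # ||y||_2 = ||B @ x||_2 <= ||B||_2 * sqrt(D), so this box covers that region.
    R = np.linalg.norm(B, 2) * np.sqrt(D)
    rng = np.random.default_rng(seed)
    y = rng.uniform(-R, R, size=(n_samples, d))
    x = y @ B_pinv.T  # back-projected samples, shape (n_samples, D)
    in_box = np.all(np.abs(x) <= 1.0, axis=1)
    if not in_box.any():
        return 0.0
    return float(np.mean(np.all(x[in_box] @ A.T <= b, axis=1)))

def pick_embedding(D, d, A, b, n_candidates=50, seed=0):
    """Draw random embeddings and keep the one with the largest feasible proportion."""
    rng = np.random.default_rng(seed)
    candidates = [rng.standard_normal((d, D)) for _ in range(n_candidates)]
    return max(candidates, key=lambda B: feasible_proportion(B, A, b, seed=seed))
```

Whether selecting the embedding this way changes ALEBO's behavior relative to a purely random embedding is exactly the kind of thing that would need to be checked empirically.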
From the point of view of the code, not much needs to be done to add support for parameter constraints. The CenteredUnitX transform needs an added operation to transform the constraints to the [-1, 1] hypercube (note that the constraints stay linear after this transformation). That would happen here: Ax/ax/modelbridge/transforms/centered_unit_x.py, lines 65 to 68 in d5b7de1, and would be very similar to the code we already have for transforming constraints to the [0, 1] hypercube, just with minor algebraic differences: Ax/ax/modelbridge/transforms/unit_x.py, lines 68 to 78 in d5b7de1.
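Concretely, a linear constraint sum_i w_i * x_i <= b stays linear under the map x_i' = 2 * (x_i - lb_i) / (ub_i - lb_i) - 1; only the weights and the bound change. A minimal sketch of that algebra, using plain dicts rather than Ax's actual constraint objects:

```python
def constraint_to_centered_unit_x(weights, bound, param_bounds):
    """Re-express sum_i w_i * x_i <= bound in the [-1, 1] hypercube.

    Substituting x_i = lb_i + (x_i' + 1) * (ub_i - lb_i) / 2 keeps the
    constraint linear: the weights are rescaled and the bound is shifted.
    """
    new_weights = {}
    new_bound = bound
    for name, w in weights.items():
        lb, ub = param_bounds[name]
        new_weights[name] = w * (ub - lb) / 2.0
        new_bound -= w * (ub + lb) / 2.0
    return new_weights, new_bound
```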
After that, the parameter constraints would successfully be passed along to the ALEBO model, which would then need minor modifications to use them. Obviously this line would have to go: line 640 in d5b7de1,
and in its place we would need to convert the constraint from being a constraint in the high-dimensional space (on x) to being a constraint in the embedding (on y); since it's a linear embedding, the constraint stays linear in the embedding. We would then add the linear constraint(s) to the set of constraints here: lines 643 to 646 in d5b7de1.
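For illustration, if the (already transformed) parameter constraints are collected into matrix form A @ x <= b, and points in the embedding are up-projected with the same linear map used for the box bounds (x = pinv(B) @ y), the conversion is just a matrix product (a sketch, not the actual ALEBO code):

```python
import numpy as np

def constraint_in_embedding(A, b, B):
    """Re-express A @ x <= b as a constraint on the embedding variable y,
    assuming points are up-projected as x = pinv(B) @ y."""
    A_y = A @ np.linalg.pinv(B)  # (m, D) @ (D, d) -> (m, d); still linear in y
    return A_y, b
```

The resulting rows would simply be appended to the inequality constraints that already encode the box bounds.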
That is all that would be required on the ALEBO model side. The initializer would also need to handle the constraint correctly; this line would be removed: Ax/ax/models/random/alebo_initializer.py, line 63 in d5b7de1.
The (high-dimensional) parameter constraint would be passed along here when generating Sobol points in the high-dimensional space: Ax/ax/models/random/alebo_initializer.py Lines 65 to 70 in d5b7de1
and then (a couple steps later) when we filter to points that respect the box bounds, we would also add the (high-dimensional) parameter constraint: Ax/ax/models/random/alebo_initializer.py Lines 78 to 79 in d5b7de1
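That extra filtering step would amount to something like this (again a sketch, with the constraints assumed to be in A @ x <= b form; in practice it could share a mask with the existing box-bound check):

```python
import numpy as np

def filter_feasible(X, A, b, tol=1e-8):
    """Keep the candidate points (rows of X, in the high-dimensional space)
    that satisfy every linear parameter constraint A @ x <= b."""
    feasible = np.all(X @ A.T <= b + tol, axis=1)
    return X[feasible]
```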
But given the open questions around generating the embedding in the presence of parameter constraints, we haven't yet pushed on trying to add support for this (we also haven't had a need yet on the application side).
@bletham Thank you! Your comment helped a lot! Maybe I'll implement it on my own later, but currently the constraint is not essential for my optimization, so I will focus on other things first.
We will now be tracking wishlist items / feature requests in a master issue for improved visibility: #566. Of course, please feel free to still open new feature request issues; we'll take care of thinking them through and adding them to the master issue.