
Parameter Constraint at ALEBO #424

Closed
pehzet opened this issue Nov 10, 2020 · 5 comments
Labels: enhancement (New feature or request), wishlist (Long-term wishlist feature requests)

Comments

@pehzet

pehzet commented Nov 10, 2020

Hey,

I'm trying to implement some parameter constraints in my ALEBO optimization.
When I add a constraint, the following ValueError occurs before the first trial evaluation starts. Without the constraint, the optimization runs without any problems.
The error occurs in optimize() of the managed loop.

  File "C:\code\black-box-opt\black-box-opt-server\experiment_runner.py", line 365, in run
    generation_strategy=alebo_strategy)
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\service\managed_loop.py", line 246, in optimize
    parameterization, values = loop.get_best_point()
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\service\managed_loop.py", line 200, in get_best_point
    experiment=self.experiment
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\service\utils\best_point.py", line 52, in get_best_raw_objective_point
    raise ValueError("Cannot identify best point if experiment contains no data.")

Is there any possibility to implement parameter constraints in ALEBO?

@lena-kashtelyan
Contributor

Hi @pzmijewski, I get the sense that this might actually be a bug. Before the error, do you see this log by any chance: https://github.com/facebook/Ax/blob/master/ax/service/managed_loop.py#L186? If so, what exception does it log?

cc @bletham to respond re: support for constraints for ALEBO in general.

@pehzet
Author

pehzet commented Nov 12, 2020

Hi @lena-kashtelyan,
this is the message I get:
[INFO 11-12 10:46:16] ax.service.managed_loop: Started full optimization with 15 steps.
[INFO 11-12 10:46:16] ax.service.managed_loop: Running optimization trial 1...
[ERROR 11-12 10:46:16] ax.service.managed_loop: Encountered exception during optimization:
Traceback (most recent call last):
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\service\managed_loop.py", line 178, in full_run
    self.run_trial()
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\utils\common\executils.py", line 98, in actual_wrapper
    return func(*args, **kwargs)
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\service\managed_loop.py", line 156, in run_trial
    experiment=self.experiment
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\modelbridge\generation_strategy.py", line 403, in gen
    experiment=experiment, num_generator_runs=1, data=data, n=n, **kwargs
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\modelbridge\generation_strategy.py", line 446, in _gen_multiple
    self._set_or_update_model(data=data)
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\modelbridge\generation_strategy.py", line 529, in _set_or_update_model
    self._set_or_update_current_model(data=data)
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\modelbridge\generation_strategy.py", line 535, in _set_or_update_current_model
    self._set_current_model(data=data)
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\modelbridge\generation_strategy.py", line 594, in _set_current_model
    self._set_current_model_from_factory_function(data=data, **model_kwargs)
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\modelbridge\generation_strategy.py", line 649, in _set_current_model_from_factory_function
    **kwargs,
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\modelbridge\strategies\alebo.py", line 36, in get_ALEBOInitializer
    transforms=ALEBO_X_trans,  # pyre-ignore
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\modelbridge\base.py", line 153, in __init__
    transform_configs=transform_configs,
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\modelbridge\base.py", line 193, in _transform_data
    search_space = t_instance.transform_search_space(search_space)
  File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\modelbridge\transforms\centered_unit_x.py", line 68, in transform_search_space
    raise ValueError("Does not support parameter constraints")
ValueError: Does not support parameter constraints

@bletham
Contributor

bletham commented Nov 12, 2020

@pzmijewski as you have discovered, the ALEBO implementation does not currently support parameter constraints.

In principle it can: the acquisition function optimization is already being done with a whole bunch of constraints to represent the high-dimensional box bounds, so adding a few more parameter constraints really wouldn't be a big deal.

The challenge comes with generating the embedding. With parameter constraints, a totally random embedding might be a really bad choice; in fact, there is a chance of generating an embedding that is entirely infeasible with respect to the parameter constraint (something that can't happen with box bounds alone, since the embedding is centered with respect to the box bounds).

For example, consider a 2-d problem with a 1-d embedding. The box bounds are the usual (for embedding BO) [-1, 1], and there is a parameter constraint x_1 + x_2 >= 0.5.
Suppose I randomly generate the embedding B = [-1, 1]. Every point in this embedding satisfies x_1 = -x_2, so every point violates the constraint x_1 + x_2 >= 0.5. If you sketch this out, you'll see that with this parameter constraint there is actually a relatively high chance of generating an embedding that is entirely infeasible, which would obviously be bad: in the language of the paper, we know that P_opt is 0.
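A quick numerical check of this example (a minimal numpy sketch, not Ax code; the pseudo-inverse of B maps an embedding point y up to x):

import numpy as np

B = np.array([[-1.0, 1.0]])      # the randomly generated 1 x 2 embedding
Binv = np.linalg.pinv(B)         # maps embedding points y up to x = Binv @ y

y = np.linspace(-1.0, 1.0, 101).reshape(-1, 1)
X = y @ Binv.T                   # points on the embedding; each row has x_1 = -x_2

print((X[:, 0] + X[:, 1]).max())  # 0.0, so x_1 + x_2 >= 0.5 is never satisfied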

So if we have parameter constraints, we need to ensure that the embedding is generated in a way that produces a high proportion of feasible volume. This is something I've thought about a little, but not enough to be able to say what the best approach would be, or to have implemented anything, which is why parameter constraints are just disallowed in the current ALEBO implementation.

One natural thing that could be done would be to generate a whole bunch of random embeddings, estimate the feasible volume of each of them (for instance using Monte Carlo sampling), and then choose the embedding with the highest feasible volume; see the sketch below. But this is a pretty big change in how the embedding is generated, so I think we'd want to check empirically that things still work.

An alternative approach would be to find an interior point of the high-dimensional feasible set (e.g. something like [0.75, 0.75] in the example above; in general an approximate centroid seems ideal), and then generate an embedding that is guaranteed to contain that point but is otherwise random. There are d degrees of freedom in the d-dimensional embedding; we would use one of them to force the embedding to contain a point interior to the feasible set, and the other d-1 would be set randomly as usual. In any case, there is a bit of work to be done to figure out which approach actually works well, and that work hasn't been done yet.
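For concreteness, a minimal sketch of the first idea (Monte Carlo selection of the embedding). None of this is Ax code; the function names and the (A, b) representation of the constraints, meaning A @ x <= b in the [-1, 1]^D coordinates, are assumptions for illustration:

import numpy as np

def feasible_fraction(Binv, A, b, nsamp=10000, rng=None):
    # Binv: (D, d) matrix mapping an embedding point y up to x = Binv @ y.
    # A, b: parameter constraints A @ x <= b in the [-1, 1]^D coordinates.
    rng = rng or np.random.default_rng()
    # Sample embedding points from a box that (heuristically) covers the
    # region where Binv @ y can still land inside [-1, 1]^D; a careful
    # implementation would bound that region explicitly.
    Y = rng.uniform(-2.0, 2.0, size=(nsamp, Binv.shape[1]))
    X = Y @ Binv.T
    in_box = (np.abs(X) <= 1.0).all(axis=1)
    if not in_box.any():
        return 0.0
    X = X[in_box]
    # Fraction of in-box embedding points that also satisfy the constraints.
    return float(((X @ A.T) <= b).all(axis=1).mean())

def best_random_embedding(D, d, A, b, n_embeddings=100, rng=None):
    # Draw random embeddings, keep the one with the largest estimated
    # feasible fraction.
    rng = rng or np.random.default_rng()
    best_Binv, best_frac = None, -1.0
    for _ in range(n_embeddings):
        B = rng.standard_normal((d, D))
        Binv = np.linalg.pinv(B)
        frac = feasible_fraction(Binv, A, b, rng=rng)
        if frac > best_frac:
            best_Binv, best_frac = Binv, frac
    return best_Binv, best_frac

# The example above: D=2, d=1, constraint x_1 + x_2 >= 0.5, i.e. -x_1 - x_2 <= -0.5.
Binv, frac = best_random_embedding(2, 1, np.array([[-1.0, -1.0]]), np.array([-0.5]))
print(frac)  # estimated feasible fraction of the chosen embedding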

From the point of view of the code, not much needs to be done to add support for parameter constraints. The CenteredUnitX transform needs to add an operation to transform the constraints to the [-1, 1] hypercube (note that the constraints stay linear after this transformation). That would happen here:

for c in search_space.parameter_constraints:
    for p_name in c.constraint_dict:
        if p_name in self.bounds:
            raise ValueError("Does not support parameter constraints")

and would be very similar to the code that we already have for transforming constraints to the [0, 1] hypercube here, just with minor algebraic differences:
for c in search_space.parameter_constraints:
    constraint_dict: Dict[str, float] = {}
    bound = float(c.bound)
    for p_name, w in c.constraint_dict.items():
        # p is RangeParameter, but may not be transformed (Int or log)
        if p_name in self.bounds:
            l, u = self.bounds[p_name]
            constraint_dict[p_name] = w * (u - l)
            bound -= w * l
        else:
            constraint_dict[p_name] = w
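For CenteredUnitX, a parameter p in [l, u] maps to p' = 2 * (p - l) / (u - l) - 1, i.e. p = (l + u) / 2 + p' * (u - l) / 2. A sketch of the analogous loop (replacing the raise above, inside transform_search_space; the wrapping into new ParameterConstraint objects is assumed to follow the same pattern as the existing UnitX transform):

new_constraints: List[ParameterConstraint] = []
for c in search_space.parameter_constraints:
    constraint_dict: Dict[str, float] = {}
    bound = float(c.bound)
    for p_name, w in c.constraint_dict.items():
        if p_name in self.bounds:
            l, u = self.bounds[p_name]
            # p = (l + u) / 2 + p' * (u - l) / 2, so the weight on p' is
            # w * (u - l) / 2 and the constant w * (l + u) / 2 moves into the bound.
            constraint_dict[p_name] = w * (u - l) / 2
            bound -= w * (l + u) / 2
        else:
            constraint_dict[p_name] = w
    new_constraints.append(
        ParameterConstraint(constraint_dict=constraint_dict, bound=bound)
    )
search_space.set_parameter_constraints(new_constraints)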

After that, the parameter constraints would successfully be passed along to the ALEBO model, which then would need minor modification to use them. Obviously this would have to go:
assert linear_constraints is None

and in its place we would need to convert the constraint from the high-dimensional space (on x) to a constraint in the embedding (on y); since the embedding is linear, the constraint stays linear in the embedding. We would then add the linear constraint(s) to the set of constraints here:

In ax/models/torch/alebo.py, lines 643 to 646 (at d5b7de1):

# Setup constraints
A = torch.cat((self.Binv, -self.Binv))
b = torch.ones(2 * self.Binv.shape[0], 1, dtype=self.dtype, device=self.device)
linear_constraints = (A, b)
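A minimal sketch of that addition, assuming the (already CenteredUnitX-transformed) parameter constraints reach the model as tensors A_c (m x D) and b_c (m x 1) with A_c @ x <= b_c; those names and the plumbing to get them here are assumptions, not existing Ax arguments:

# Setup constraints: the box bounds of [-1, 1]^D, as in the existing code.
A = torch.cat((self.Binv, -self.Binv))
b = torch.ones(2 * self.Binv.shape[0], 1, dtype=self.dtype, device=self.device)
# Since x = Binv @ y, a high-dimensional constraint A_c @ x <= b_c becomes
# (A_c @ Binv) @ y <= b_c in the embedding; stack it onto the box constraints.
if A_c is not None:
    A = torch.cat((A, A_c @ self.Binv))
    b = torch.cat((b, b_c))
linear_constraints = (A, b)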

and that's all that would be required on the ALEBO model side.

The initializer would also need to correctly handle the constraint; this line would be removed:

assert linear_constraints is None

The (high-dimensional) parameter constraint would be passed along here when generating Sobol points in the high-dimensional space:
# Do gen in the high-dimensional space.
X01, w = super().gen(
    n=self.nsamp,
    bounds=[(0.0, 1.0)] * self.Q.shape[0],
    model_gen_options={"max_rs_draws": self.nsamp},
)
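A sketch of what passing them along might look like, assuming the random model's gen accepts a linear_constraints argument here (as the Sobol generator does for rejection sampling). Note the points at this step live in [0, 1]^D, so the constraints would first need to be re-expressed in those coordinates; A01 and b01 are hypothetical names for that re-expressed form:

# Do gen in the high-dimensional space, with the parameter constraints
# expressed in the [0, 1]^D coordinates used here (A01 @ x01 <= b01).
X01, w = super().gen(
    n=self.nsamp,
    bounds=[(0.0, 1.0)] * self.Q.shape[0],
    linear_constraints=(A01, b01),
    model_gen_options={"max_rs_draws": self.nsamp},
)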

and then (a couple steps later) when we filter to points that respect the box bounds, we would also add the (high-dimensional) parameter constraint:
# Filter out to points in [-1, 1]^D
X = X[(X >= -1.0).all(axis=1) & (X <= 1.0).all(axis=1)]
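That extra filtering step could be a one-liner, again assuming the transformed constraints are available as numpy arrays A_c (m x D) and b_c (length m) with A_c @ x <= b_c (hypothetical names):

# Hypothetical addition: after the box filter, keep only points that also
# satisfy the (high-dimensional) parameter constraints A_c @ x <= b_c.
X = X[(X @ A_c.T <= b_c).all(axis=1)]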

But given the open questions around generating the embedding in the presence of parameter constraints, we haven't yet pushed on trying to add support for this (we also haven't had a need yet on the application side).

@pehzet
Author

pehzet commented Nov 13, 2020

@bletham Thank you! Your comment helped a lot! Maybe I'll implement it on my own later, but currently the constraint is not essential for my optimization, so I will focus on other things first.

@lena-kashtelyan added the wishlist (Long-term wishlist feature requests) and enhancement (New feature or request) labels on Nov 16, 2020
@lena-kashtelyan
Contributor

We will now be tracking wishlist items / feature requests in a master issue for improved visibility: #566. Of course, please feel free to still open new feature request issues; we'll take care of thinking them through and adding them to the master issue.
