
Optimization notes


Common challenges

The following require a fair amount of domain knowledge and planning:

  • Choosing the parameter exploration space so that it contains the salient parameters but is not too high-dimensional
  • Defining the fitness function so that it usefully captures our requirements (mainly a problem when there are multiple fitness measures to evaluate, which is often the case)

Limitations of our optimization algorithms

  • Proper multi-objective optimization is not supported
  • They don't make efficient use of simulations that have already been run
  • They don't do well if the variability of fitness is too high *

* This can be mitigated significantly. See below.

The only way to address the first two limitations is to add appropriate algorithms that support those features.

Some solutions/work-arounds

  • Variability can largely be mitigated by fixing the random seed(s) for the Optimizee simulations and making sure that each logical section of the code uses its own random number generator. This limits the phenomenon of small changes in parameters causing large changes in fitness simply because the number of random samples drawn has changed. Random number generation should basically be siloed (see the sketch after this list).
    • When using NEST, everything related to generating the network and the input noise should be driven by Python random number generators rather than the NEST random number generator (over which we have little control)
  • Decreasing the resolution/precision of the variable space (discretizing it), even during search, reduces the search space and allows re-use of fitness results: at full precision no two sampled parameter vectors will ever be exactly equal, but at limited precision repeats become possible (see the caching sketch below)
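
A minimal sketch of siloed random number generation, assuming the Optimizee boils down to a simulate-style function taking a parameter dict. The section names, the w_sigma parameter, and the simulate function itself are illustrative, not part of any existing API; the point is one independently seeded generator per logical section, so that drawing more samples in one section does not shift the random stream of another:

    import numpy as np

    def make_rngs(base_seed):
        """One independently seeded generator per logical section of the simulation.

        The section names are illustrative; the key property is that e.g. drawing
        more connectivity samples does not perturb the input-noise stream.
        """
        return {
            'connectivity': np.random.RandomState(base_seed + 1),
            'input_noise': np.random.RandomState(base_seed + 2),
            'initial_state': np.random.RandomState(base_seed + 3),
        }

    def simulate(parameters, base_seed=42):
        """Hypothetical Optimizee-style simulation using siloed RNGs."""
        rngs = make_rngs(base_seed)

        # Network generation draws only from its own generator ...
        weights = rngs['connectivity'].normal(0.0, parameters['w_sigma'], size=(100, 100))

        # ... so the input-noise stream is unaffected by how many weights were drawn.
        noise = rngs['input_noise'].normal(0.0, 1.0, size=1000)

        # (run the actual simulation here and derive a fitness from its output)
        return float(np.mean(weights) + np.mean(noise))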
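
One possible way to implement the discretization idea, assuming parameters arrive as a dict of floats and that some evaluate function (e.g. the simulate sketch above) computes the fitness. Rounding each parameter to a fixed number of decimals turns the parameter vector into a hashable key, so a previously computed fitness can be looked up instead of re-running the simulation:

    fitness_cache = {}

    def discretize(parameters, decimals=3):
        """Round each parameter so that nearby points map to the same key."""
        return tuple(sorted((name, round(value, decimals))
                            for name, value in parameters.items()))

    def cached_fitness(parameters, evaluate, decimals=3):
        """Re-use a stored fitness if this (discretized) point was already simulated."""
        key = discretize(parameters, decimals)
        if key not in fitness_cache:
            fitness_cache[key] = evaluate(dict(key))
        return fitness_cache[key]

For example, cached_fitness({'w_sigma': 0.1234}, simulate) and cached_fitness({'w_sigma': 0.1236}, simulate) would hit the same cache entry at three decimals of precision.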