Simulated annealing is a probabilistic technique for approximating the global optimum of a given function. Specifically, it is a metaheuristic to approximate global optimization in a large search space for an optimization problem. It is often used when the search space is discrete (e.g., all tours that visit a given set of cities). For problems where finding an approximate global optimum is more important than finding a precise local optimum in a fixed amount of time, simulated annealing may be preferable to alternatives such as gradient descent.
- At the beginning, SA accepts both better and worse solutions because the temperature is high;
- At the end of the optimization process, the temperature is very low and SA only accepts better solutions, so it behaves like hill climbing;
- The probability of accepting a worse solution decreases exponentially with the energy difference $\Delta E = f(s') - f(s)$ and increases with the temperature (a code sketch of this rule follows this list):
$P(\Delta E, T) = e^{-\frac{f(s') - f(s)}{T}}$ .
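A minimal Python sketch of this acceptance rule, assuming minimization of the fitness $f$; the function name `accept` is chosen here for illustration:

```python
import math
import random

def accept(f_current, f_neighbor, temperature):
    """Metropolis acceptance rule: always accept improvements,
    accept worse solutions with probability exp(-delta / T)."""
    delta = f_neighbor - f_current  # f(s') - f(s); positive means worse (minimization)
    if delta < 0:
        return True
    return random.random() < math.exp(-delta / temperature)
```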
1. Generate a random initial solution $s$;
2. Set $T = T_{max}$;
3. Select a neighbor $s'$ of $s$;
4. If $f(s') < f(s)$, then $s = s'$;
5. Else, set $s = s'$ with probability $e^{-\frac{f(s') - f(s)}{T}}$;
6. Repeat steps 3 to 5 $k$ times;
7. If the current solution is optimal, stop;
8. Update the temperature: $T = T \cdot \alpha$;
9. If $T < T_{min}$, then stop; otherwise, go to step 3 (the full loop is sketched below).
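A compact Python sketch of the loop above, assuming minimization; `getInitialSolution`, `getNeighbor`, `evaluate`, and `isOptimal` are the problem-specific helpers described at the end of this section, and the parameter values are only illustrative defaults:

```python
import math
import random

def simulatedAnnealing(getInitialSolution, getNeighbor, evaluate, isOptimal,
                       Tmax=100.0, Tmin=1e-3, alpha=0.95, k=50):
    """Simulated annealing loop following the numbered steps above.
    Default parameter values are illustrative assumptions."""
    s = getInitialSolution()          # step 1: random initial solution
    T = Tmax                          # step 2: start at the maximum temperature
    while True:
        for _ in range(k):            # steps 3-6: k iterations at this temperature
            s_prime = getNeighbor(s)  # step 3: random neighbor of s
            delta = evaluate(s_prime) - evaluate(s)
            if delta < 0:             # step 4: always accept improvements
                s = s_prime
            elif random.random() < math.exp(-delta / T):
                s = s_prime           # step 5: sometimes accept worse solutions
        if isOptimal(s):              # step 7: stop if a known optimum is reached
            return s
        T *= alpha                    # step 8: exponential cooling
        if T < Tmin:                  # step 9: stop when the temperature is too low
            return s
```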
A well-known approach to varying the temperature is to use an exponentially decreasing function, which is what the update $T = T \cdot \alpha$ produces after $k$ updates:
$T_k = T_{max} \cdot \alpha^k$ , with $0 < \alpha < 1$.
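A quick check of this schedule in Python, with illustrative values for $T_{max}$ and $\alpha$:

```python
Tmax, alpha = 100.0, 0.9  # illustrative values
temperatures = [Tmax * alpha ** k for k in range(5)]
print(temperatures)  # shrinks geometrically: roughly 100.0, 90.0, 81.0, 72.9, 65.61
```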
The algorithm relies on four problem-specific functions:

- `getInitialSolution()`: returns a random solution;
- `getNeighbor(solution)`: returns a random neighbor of the given solution;
- `evaluate(solution)`: returns the fitness of the given solution;
- `isOptimal(solution)`: returns true if the given solution is optimal.
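As a concrete illustration tied to the tour example from the introduction, these helpers could be implemented for a small travelling-salesman instance as sketched below; the city coordinates are made up, and `isOptimal` simply returns `False` because the true optimum is not known in advance:

```python
import math
import random

cities = [(0, 0), (1, 5), (4, 4), (6, 1), (3, 0)]  # made-up coordinates

def getInitialSolution():
    """Random tour: a random permutation of the city indices."""
    tour = list(range(len(cities)))
    random.shuffle(tour)
    return tour

def getNeighbor(solution):
    """Random neighbor: swap two positions in the tour."""
    i, j = random.sample(range(len(solution)), 2)
    neighbor = solution[:]
    neighbor[i], neighbor[j] = neighbor[j], neighbor[i]
    return neighbor

def evaluate(solution):
    """Fitness: total length of the closed tour (lower is better)."""
    total = 0.0
    for a, b in zip(solution, solution[1:] + solution[:1]):
        (x1, y1), (x2, y2) = cities[a], cities[b]
        total += math.hypot(x1 - x2, y1 - y2)
    return total

def isOptimal(solution):
    """The true optimum is unknown here, so never stop early."""
    return False
```

With these helpers, calling `simulatedAnnealing(getInitialSolution, getNeighbor, evaluate, isOptimal)` from the sketch above runs until the temperature drops below `Tmin` and returns the best tour accepted at that point.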