sequence of temporary solutions, until an approximate equilibrium is achieved in which
the solution space occupancy is described by the Gibbs distribution (Eq. 2.1-2). Once this
approximate equilibrium is achieved, the control parameter value is reduced and the solu-
tion sequence is extended until equilibrium is achieved at the new control parameter
value. This process is repeated until some termination condition (e.g., a minimum control
parameter value is reached) is satisfied. The current solution at termination is then accepted as the
solution to the optimization problem.
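The annealing loop just described can be sketched as follows. This is a minimal illustration, not the dissertation's algorithm: the geometric cooling factor, the fixed sweep count used to approximate equilibrium, and the quadratic example problem are all assumptions made for the sake of the sketch.

```python
import math
import random

def simulated_annealing(cost, neighbor, initial,
                        t0=10.0, alpha=0.9, t_min=1e-3, sweeps=100, seed=0):
    """Outer annealing loop: hold the control parameter (temperature) fixed
    while a sequence of temporary solutions approaches equilibrium, then
    reduce it and repeat until the termination condition t <= t_min."""
    rng = random.Random(seed)
    current = initial
    t = t0
    while t > t_min:                       # termination condition
        for _ in range(sweeps):            # approximate equilibrium at this t
            candidate = neighbor(current, rng)
            delta = cost(candidate) - cost(current)
            # Metropolis criterion: accept improvements outright, and
            # accept uphill moves with probability exp(-delta / t)
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                current = candidate
        t *= alpha                         # reduce the control parameter
    return current

# Illustrative problem (an assumption): minimize (x - 3)^2 over the integers.
best = simulated_annealing(cost=lambda x: (x - 3) ** 2,
                           neighbor=lambda x, rng: x + rng.choice([-1, 1]),
                           initial=50)
```

With a fixed random seed the sketch settles near the global minimum at x = 3; in practice the cooling schedule and the per-temperature move count govern how closely the search tracks the Gibbs distribution at each control parameter value.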
It is noted in passing that simulated annealing always involves minimizing a cost
functional, never maximizing a reward. However, this causes no loss of generality
because any combinatorial optimization problem can be translated into an equivalent
minimization problem.
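The reduction is simply negation of the objective: maximizing a reward R over S is equivalent to minimizing the cost C = -R over the same space. A minimal sketch, with an illustrative reward function assumed for the example:

```python
# Maximizing a reward R is the same problem as minimizing the cost C = -R:
# both are optimized by the same solution.
def reward(x):
    return 10 - (x - 4) ** 2      # maximized at x = 4 (illustrative choice)

def cost(x):
    return -reward(x)             # minimized at the same x = 4

candidates = range(10)
best_by_reward = max(candidates, key=reward)
best_by_cost = min(candidates, key=cost)
```

Both searches return the same solution, which is why restricting attention to cost minimization loses no generality.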
2.4 Theoretical Foundations of Simulated Annealing
The evolution of the search sequence of a simulated annealing algorithm as out-
lined, in which each succeeding solution in the sequence is determined stochastically
based upon the current solution, suggests that the algorithm behavior can be described as
a Markov chain. Indeed it can, and all of the known convergence results for simulated
annealing algorithms are derived from analysis of Markov chain models [LaAa87,
GeGe84, LuMe86, MiRo85, Rior58]. This subsection establishes a Markov chain model
to represent the simulated annealing algorithm and then employs it in reviewing the
development of the published convergence bounds. This development essentially follows
[LaAa87].
2.4.1 A Markov Chain Model of Simulated Annealing
Let a combinatorial optimization problem be represented by the pair (S, C) where S
is the problem's solution space and C is its cost function, and assume without loss of gen-
erality that the optimization problem requires minimization of C. Also, assume that S is
finite. Then, a simulated annealing algorithm for solving this problem can be
characterized by the quadruple (S, i0, PT, T) where S is as defined above and where i0 ∈ S