evaluations. N can now be used as a convergence criterion. The iteration process is
terminated when a total of N function evaluations have been performed. Note that N may
take on fractional values and should be rounded up to the next higher integer.
Although the golden section algorithm has been developed to determine the minimum
of a one-variable function, the maximum of a function can be determined simply by
minimizing the negative of the function.
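The procedure above can be sketched as a short routine that stops after N function evaluations and finds a maximum by minimizing the negated function. The bracketing interval, test function, and evaluation budget below are illustrative assumptions, not values from the text:

```python
import math

def golden_section_min(f, a, b, n_evals):
    """Golden-section search for the minimum of f on [a, b].

    Terminates after roughly n_evals function evaluations, used here as
    the convergence criterion; a fractional N is rounded up.
    """
    n_evals = math.ceil(n_evals)           # N may be fractional
    tau = (math.sqrt(5) - 1) / 2           # golden-section ratio ~ 0.618
    x1 = b - tau * (b - a)                 # two interior trial points
    x2 = a + tau * (b - a)
    f1, f2 = f(x1), f(x2)
    evals = 2
    while evals < n_evals:
        if f1 > f2:                        # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + tau * (b - a)
            f2 = f(x2)
        else:                              # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - tau * (b - a)
            f1 = f(x1)
        evals += 1
    return (a + b) / 2

# A maximum is found by minimizing the negative of the function:
# the maximum of 4 - (x - 1)^2 on [0, 3] lies at x = 1.
x_max = golden_section_min(lambda x: -(4 - (x - 1) ** 2), 0.0, 3.0, 20)
```

Only the sign of the function changes when switching from minimization to maximization; the search logic itself is untouched.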
Gradient Techniques
The rate of convergence of an optimization technique for a particular problem depends
on the initial parameters selected. If poor initial parameters are chosen, the
optimization technique converges very slowly and is rendered useless. Depending on the
technique chosen, the initial parameters dictate both the speed at which a solution is
determined and the validity of the final solution.
A combination of optimization techniques may be applied: a slow but robust algorithm
may be used initially, and once the search approaches the minimum, a much faster
technique can take over. At each iteration, a decision is first made as to which
direction to proceed, and then how far to proceed in that direction. The latter is
determined by a set of rules; steepest descent is an example of this.
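The two decisions described above, choosing a direction and then a distance, can be sketched as a minimal steepest-descent loop. The fixed step length alpha, the iteration count, and the quadratic test function below are illustrative assumptions:

```python
def steepest_descent(grad, x0, alpha=0.1, n_steps=100):
    """Minimal steepest-descent sketch.

    At each step the direction is the "downhill" direction -grad(x),
    and the rule for how far to proceed is simply a fixed step alpha.
    """
    x = list(x0)
    for _ in range(n_steps):
        g = grad(x)
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
    return x

# Minimize F(x) = (x0 - 2)^2 + (x1 + 1)^2, whose gradient is
# (2(x0 - 2), 2(x1 + 1)); the minimum lies at (2, -1).
x_min = steepest_descent(lambda x: [2 * (x[0] - 2), 2 * (x[1] + 1)],
                         [0.0, 0.0])
```

In practice the fixed step alpha would be replaced by a line search along the descent direction, which is the kind of rule the text alludes to.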
Steepest Descent
For any optimization procedure an initial guess x must be made for the parameter
values. This guess can then be modified to x + δx in order to reduce the objective
function F(x). Here x is a vector that contains the n design parameters, and the
objective is to find the values of these parameters that minimize F(x). The steepest
descent technique chooses δx so as to travel in a "downhill" direction (i.e. direction