User Guide

Understanding Optimization Principles and Options
The optimizer uses gradient-based optimization algorithms that
approximate the gradients by a finite difference method (the
gradients are not known analytically). To implement finite
differencing, the optimizer:
1 Perturbs each parameter in turn from its current value by an
amount h.
2 Evaluates the function at the perturbed value.
3 Subtracts the old function value from the new.
4 Divides the result by h.
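The four steps above can be sketched as a forward-difference gradient routine. This is an illustrative sketch, not the optimizer's actual implementation; the function and parameter names are hypothetical.

```python
def fd_gradient(f, x, h=1e-6):
    """Approximate the gradient of f at x by forward differences."""
    fx = f(x)                       # function value at the current point
    grad = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h                  # 1. perturb one parameter by h
        fp = f(xp)                  # 2. evaluate at the perturbed value
        grad.append((fp - fx) / h)  # 3-4. subtract and divide by h
    return grad
```

For example, with f(x) = x0^2 + 3*x1 at the point [1.0, 2.0], the routine returns approximately [2.0, 3.0], matching the analytic gradient.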
Note
There is a tradeoff. If h is too small, the difference
in function values is unreliable due to numerical
inaccuracies. However, if h is too large, the result is
a poor approximation to the true gradient.
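The tradeoff in the note can be seen numerically. The sketch below (an illustration, not part of this guide) differentiates sin at x = 1.0, where the exact derivative is cos(1.0), for a large, a moderate, and a very small h: the moderate value gives the best estimate, while the extremes degrade it through truncation error and floating-point rounding respectively.

```python
import math

def forward_diff(f, x, h):
    # One-sided difference quotient: (f(x + h) - f(x)) / h
    return (f(x + h) - f(x)) / h

x, exact = 1.0, math.cos(1.0)   # true derivative of sin at x = 1.0
for h in (1e-1, 1e-8, 1e-15):
    err = abs(forward_diff(math.sin, x, h) - exact)
    print(f"h = {h:.0e}  error = {err:.2e}")
```

The large h suffers from truncation error (the secant is a poor tangent), while the tiny h suffers from cancellation: the two function values agree to nearly all their digits, so the subtraction leaves mostly rounding noise.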