Non-linear, non-constrained minimization of a function defined on an n-dimensional Euclidean space, using the Nelder-Mead method, also known as the downhill simplex method. The basic idea of the method is described in the Nelder-Mead method article.
Note that this method, although deterministic, is rather a heuristic and therefore may converge to a local minimum, not necessarily the global one. It is an iterative optimization technique which, at each step, uses only the values of the function evaluated at n+1 points arranged as a simplex in n-dimensional space (hence the second name of the method). At each step a new point is chosen at which to evaluate the function; the obtained value is compared with the previous ones, and based on this information the simplex changes its shape, slowly moving toward a local minimum. Thus the method makes its decisions using only function values, in contrast to, say, the Nonlinear Conjugate Gradient method (which is also implemented in OpenCV).
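The following is a minimal sketch of driving the solver through the underlying OpenCV C++ interface (cv::DownhillSolver); the quadratic objective, starting point, and step sizes are illustrative only. It shows the point made above: the user supplies nothing but function values (no gradient).

    #include <opencv2/core.hpp>
    #include <opencv2/core/optim.hpp>
    #include <iostream>

    // Illustrative objective: f(x, y) = (x - 1)^2 + (y + 2)^2, minimum at (1, -2).
    // Only function values are provided; no derivative information is needed.
    class QuadraticFunction : public cv::MinProblemSolver::Function
    {
    public:
        int getDims() const override { return 2; }
        double calc(const double* x) const override
        {
            return (x[0] - 1.0) * (x[0] - 1.0) + (x[1] + 2.0) * (x[1] + 2.0);
        }
    };

    int main()
    {
        cv::Ptr<cv::DownhillSolver> solver = cv::DownhillSolver::create();
        solver->setFunction(cv::makePtr<QuadraticFunction>());

        // The initial step defines the extent of the starting simplex along each axis.
        cv::Mat step = (cv::Mat_<double>(1, 2) << 0.5, 0.5);
        solver->setInitStep(step);

        // Starting point; on return it holds the minimizer found by the solver.
        cv::Mat x = (cv::Mat_<double>(1, 2) << 0.0, 0.0);
        double fmin = solver->minimize(x);

        std::cout << "argmin = " << x << ", f(argmin) = " << fmin << std::endl;
        return 0;
    }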
The algorithm stops when the number of function evaluations exceeds Termcrit.maxCount, when the function values at the vertices of the simplex are within Termcrit.epsilon of each other, or when the simplex becomes so small that it can be enclosed in a box with sides of length Termcrit.epsilon, whichever comes first, for a user-defined positive integer Termcrit.maxCount and positive real Termcrit.epsilon.
NOTE: term criteria should meet the following conditions:
type == 'Count+EPS' && epsilon > 0 && maxCount > 0
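As a sketch of how these conditions look in the underlying OpenCV C++ interface (where 'Count+EPS' corresponds to cv::TermCriteria::COUNT + cv::TermCriteria::EPS), assuming illustrative values for maxCount and epsilon:

    #include <opencv2/core.hpp>
    #include <opencv2/core/optim.hpp>

    int main()
    {
        cv::Ptr<cv::DownhillSolver> solver = cv::DownhillSolver::create();

        // Both conditions from the note are satisfied: the type combines the
        // evaluation-count and epsilon criteria, and both bounds are positive.
        cv::TermCriteria criteria(cv::TermCriteria::COUNT + cv::TermCriteria::EPS,
                                  5000,   // maxCount > 0: at most 5000 function evaluations
                                  1e-6);  // epsilon  > 0: tolerance on values / simplex size
        solver->setTermCriteria(criteria);
        return 0;
    }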