: fminunc (fcn, x0)
: fminunc (fcn, x0, options)
: [x, fval, info, output, grad, hess] = fminunc (fcn, …)

Solve an unconstrained optimization problem defined by the function fcn.

fcn should accept a vector (array) defining the unknown variables, and return the objective function value, optionally with gradient. fminunc attempts to determine a vector x such that fcn (x) is a local minimum.

x0 determines a starting guess. The shape of x0 is preserved in all calls to fcn, but otherwise is treated as a column vector.
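
For example, a minimal call might look like the following sketch; the Rosenbrock objective and starting point here are purely illustrative and not part of the fminunc interface:

    ## illustrative objective: the Rosenbrock function
    fcn = @(x) 100 * (x(2) - x(1)^2)^2 + (1 - x(1))^2;
    x0  = [-1; 2];
    [x, fval] = fminunc (fcn, x0)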

options is a structure specifying additional options. Currently, fminunc recognizes these options: "FunValCheck", "OutputFcn", "TolX", "TolFun", "MaxIter", "MaxFunEvals", "GradObj", "FinDiffType", "TypicalX", "AutoScaling".

If "GradObj" is "on", it specifies that fcn, when called with two output arguments, also returns the Jacobian matrix of partial first derivatives at the requested point. TolX specifies the termination tolerance for the unknown variables x, while TolFun is a tolerance for the objective function value fval. The default is 1e-7 for both options.

For a description of the other options, see optimset.
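
The following sketch shows one way to supply an analytic gradient: the objective returns the gradient as a second output and "GradObj" is enabled via optimset. The particular objective, gradient, and tolerance values are illustrative only:

    function [f, g] = rosenbrock (x)
      ## objective value
      f = 100 * (x(2) - x(1)^2)^2 + (1 - x(1))^2;
      ## analytic gradient (second output)
      g = [-400 * (x(2) - x(1)^2) * x(1) - 2 * (1 - x(1));
            200 * (x(2) - x(1)^2)];
    endfunction

    opts = optimset ("GradObj", "on", "TolX", 1e-8, "TolFun", 1e-8);
    [x, fval] = fminunc (@rosenbrock, [-1; 2], opts);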

On return, x is the location of the minimum and fval contains the value of the objective function at x.

info may be one of the following values:

 1 : Converged to a solution point. Relative gradient error is less than specified by TolFun.
 2 : Last relative step size was less than TolX.
 3 : Last relative change in function value was less than TolFun.
 0 : Iteration limit exceeded, either the maximum number of algorithm iterations (MaxIter) or the maximum number of function evaluations (MaxFunEvals).
-1 : Algorithm terminated by OutputFcn.
-3 : The trust region radius became excessively small.

Optionally, fminunc can also return a structure with convergence statistics (output), the gradient (grad) at the solution x, and an approximation of the Hessian (hess) at the solution x.
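
A sketch of requesting the extra outputs and checking info for convergence (again with an illustrative objective):

    fcn = @(x) 100 * (x(2) - x(1)^2)^2 + (1 - x(1))^2;
    [x, fval, info, output, grad, hess] = fminunc (fcn, [-1; 2]);
    if (info > 0)
      printf ("converged: fval = %g, |grad| = %g\n", fval, norm (grad));
    endif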

Application Notes: If the objective is a nonlinear function of a single variable, then using fminbnd is usually a better choice.

The algorithm used by fminunc is a gradient search which depends on the objective function being differentiable. If the function has discontinuities, it may be better to use a derivative-free algorithm such as fminsearch.

See also: fminbnd, fminsearch, optimset.

Package: octave