
 Description:
minFunc is a Matlab function for unconstrained optimization of differentiable real-valued multivariate functions using line-search methods. It uses an interface very similar to that of the Matlab Optimization Toolbox function fminunc, and can be called as a drop-in replacement for it. On many problems, minFunc requires fewer function evaluations to converge than fminunc (or minimize.m). Further, it can optimize problems with a much larger number of variables (fminunc is restricted to several thousand variables), and it uses a line search that is robust to several common function pathologies.
The default parameters of minFunc invoke a quasi-Newton strategy: limited-memory BFGS updates with Shanno-Phua scaling are used to compute the step direction, and a bracketing line search for a point satisfying the strong Wolfe conditions is used to compute the step length. In the line search, (safeguarded) cubic interpolation is used to generate trial values, and the method switches to an Armijo backtracking line search on iterations where the objective function enters a region where the parameters do not produce a real-valued output (i.e. complex, NaN, or Inf).
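As a minimal sketch of this default usage, the objective function returns both the function value and the gradient; the quadratic objective below is purely illustrative and is not part of minFunc itself:

```matlab
% Hedged sketch of the default call: myQuadratic is an illustrative
% objective, not part of minFunc. Extra arguments after the options
% argument are passed through to the objective function.
A = [3 1; 1 2]; b = [1; 1]; x0 = zeros(2,1);
x = minFunc(@myQuadratic, x0, [], A, b);  % default: L-BFGS with a strong-Wolfe line search

% By default, minFunc expects the objective to return both the value and the gradient:
function [f,g] = myQuadratic(x, A, b)
    f = 0.5*(x'*A*x) - b'*x;   % function value
    g = A*x - b;               % gradient
end
```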
Some highlights of the non-default features present in minFunc:
- Step directions can be computed based on: exact Newton (requires a user-supplied Hessian), full quasi-Newton approximation (uses a dense Hessian approximation), limited-memory BFGS (uses a low-rank Hessian approximation; the default), (preconditioned) Hessian-free Newton (uses Hessian-vector products), (preconditioned) conjugate gradient (uses only the previous step and a vector beta), Barzilai-Borwein (uses only the previous step), or (cyclic) steepest descent.
- Step lengths can be computed based on either the (non-monotone) Armijo or Wolfe conditions, and trial values can be generated by either backtracking/bisection or polynomial interpolation. Several strategies are available for selecting the initial trial value.
- Numerical differentiation and derivative checking are available, including an option for automatic differentiation using complex-step differentials (if the objective function code handles complex inputs).
- Most methods have user-modifiable parameters, such as: the number of corrections to store for L-BFGS, modification options for Hessian matrices that are not positive definite in the pure Newton method, the choice of preconditioner and Hessian-vector product function for the Hessian-free Newton method, the choice of update method, scaling, and preconditioning for the nonlinear conjugate gradient method, the type of Hessian approximation to use in the quasi-Newton iteration, the number of steps to look back for the non-monotone Armijo condition, the parameters of the line-search algorithm, the parameters of the termination criteria, etc.
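Many of these features are selected through fields of an options struct. A hedged sketch follows; the exact option names are version-dependent and should be confirmed with 'help minFunc', and @myObjective is a placeholder for an objective returning [f,g]:

```matlab
% Assumed option names (check 'help minFunc' for your version):
options = [];
options.Method  = 'lbfgs';  % step direction: 'sd', 'cg', 'pcg', 'newton', 'lbfgs', ...
options.Corr    = 100;      % number of corrections stored for the L-BFGS approximation
options.MaxIter = 500;      % iteration limit (one of the termination criteria)
options.Display = 'iter';   % print progress at every iteration

% @myObjective is a placeholder for a user-supplied [f,g]-returning function.
x = minFunc(@myObjective, x0, options);
```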
 Usage:
minFunc uses an interface very similar to Matlab's fminunc. If you currently call 'fminunc(@myFunc,x0,options,myFuncArg1,myFuncArg2)', you can use minFunc instead by simply replacing 'fminunc' with 'minFunc'. Note that by default minFunc assumes that the gradient is supplied, unless the 'numDiff' option is set to 1 (for forward differencing) or 2 (for central differencing). minFunc supports many of the same parameters as fminunc (but not all), with some differences in naming, and it also has many parameters that are not available in fminunc. Typing 'help minFunc' gives a list of the parameters and their explanations.
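The replacement described above can be sketched as follows; myFunc and its arguments are the placeholders from the example call:

```matlab
% Before: call to the Optimization Toolbox
% x = fminunc(@myFunc, x0, options, myFuncArg1, myFuncArg2);

% After: same call with minFunc; by default myFunc must return [f,g]
x = minFunc(@myFunc, x0, options, myFuncArg1, myFuncArg2);

% If myFunc returns only a function value, enable numerical differencing:
options.numDiff = 1;   % 1 = forward differencing, 2 = central differencing
x = minFunc(@myFunc, x0, options, myFuncArg1, myFuncArg2);
```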
 Changes to previous version:
Initial Announcement on mloss.org.
 BibTeX Entry: Download
 URL: Project Homepage
 Supported Operating Systems: Platform Independent
 Data Formats: Matlab
 Tags: Optimization
 Archive: download here