NLopt Algorithms

NLopt includes implementations of a number of different optimization algorithms. These algorithms are listed below, including links to the original source code (if any) and citations to the relevant articles in the literature (see Citing NLopt).

Even where I found available free/open-source code for the various algorithms, I modified the code at least slightly (and in some cases noted below, substantially) for inclusion into NLopt. I apologize in advance to the authors for any new bugs I may have inadvertently introduced into their code.

Nomenclature

Each algorithm in NLopt is identified by a named constant, which is passed to the NLopt routines in the various languages in order to select a particular algorithm. These constants are of the form NLOPT_{G,L}{N,D}_xxxx, where G/L denotes global/local optimization and N/D denotes derivative-free/gradient-based algorithms, respectively.

For example, the NLOPT_LN_COBYLA constant refers to the COBYLA algorithm (described below), which is a local (L) derivative-free (N) optimization algorithm.
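
To make the naming concrete, here is a minimal sketch in C, assuming the object-style API (nlopt_create, nlopt_set_min_objective, nlopt_optimize); older releases expose the same constant through nlopt_minimize instead, but the constant plays the same role either way. The objective function and tolerance below are made up for illustration.

  #include <stdio.h>
  #include <nlopt.h>

  /* hypothetical objective: f(x) = (x0 - 1)^2 + (x1 - 2)^2; the gradient
     argument is ignored because COBYLA is derivative-free */
  static double myfunc(unsigned n, const double *x, double *grad, void *data)
  {
      (void) n; (void) grad; (void) data;
      return (x[0] - 1) * (x[0] - 1) + (x[1] - 2) * (x[1] - 2);
  }

  int main(void)
  {
      nlopt_opt opt = nlopt_create(NLOPT_LN_COBYLA, 2); /* L = local, N = derivative-free */
      nlopt_set_min_objective(opt, myfunc, NULL);
      nlopt_set_xtol_rel(opt, 1e-6);            /* illustrative stopping tolerance */
      double x[2] = {0.0, 0.0}, minf;
      if (nlopt_optimize(opt, x, &minf) > 0)    /* positive return codes indicate success */
          printf("minimum %g at (%g, %g)\n", minf, x[0], x[1]);
      nlopt_destroy(opt);
      return 0;
  }

Switching to, say, a global derivative-free algorithm is then just a matter of passing a different constant such as NLOPT_GN_DIRECT_L (and supplying the finite bound constraints that the global algorithms require).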

Many of the algorithms have several variants, which are grouped together below.

Global optimization

DIRECT and DIRECT-L

DIRECT is the DIviding RECTangles algorithm for global optimization, described in:

  • D. R. Jones, C. D. Perttunen, and B. E. Stuckman, "Lipschitzian optimization without the Lipschitz constant," J. Optimization Theory and Applications, vol. 79, p. 157 (1993).

and DIRECT-L is the "locally biased" variant proposed by:

  • J. M. Gablonsky and C. T. Kelley, "A locally-biased form of the DIRECT algorithm," J. Global Optimization, vol. 21 (1), p. 27-37 (2001).

These are deterministic-search algorithms based on systematic division of the search domain into smaller and smaller hyperrectangles. The Gablonsky version makes the algorithm "more biased towards local search" so that it is more efficient for functions without too many local minima. NLopt contains several implementations of both of these algorithms. I would tend to try NLOPT_GN_DIRECT_L first; YMMV.

First, it contains a from-scratch re-implementation of both algorithms, specified by the constants NLOPT_GN_DIRECT and NLOPT_GN_DIRECT_L, respectively.

Second, there is a slightly randomized variant of DIRECT-L, specified by NLOPT_GN_DIRECT_L_RAND, which uses some randomization to help decide which dimension to halve next in the case of near-ties.

The DIRECT and DIRECT-L algorithms start by rescaling the bound constraints to a hypercube, which gives all dimensions equal weight in the search procedure. If your dimensions do not have equal weight, e.g. if you have a "long and skinny" search space and your function varies at about the same speed in all directions, it may be better to use the unscaled variants of these algorithms, which are specified as NLOPT_GN_DIRECT_NOSCAL, NLOPT_GN_DIRECT_L_NOSCAL, and NLOPT_GN_DIRECT_L_RAND_NOSCAL, respectively. A sketch of this situation follows below.
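
As a sketch of when the unscaled variants might help (same C-API assumptions as the example above; the bounds and evaluation budget are made up), consider a problem where one coordinate spans [0, 1000] and another spans [0, 1] but the objective varies at a similar rate along both:

  /* hypothetical "long and skinny" bounds: x0 in [0, 1000], x1 in [0, 1] */
  double lb[2] = {0.0, 0.0}, ub[2] = {1000.0, 1.0};

  nlopt_opt opt = nlopt_create(NLOPT_GN_DIRECT_NOSCAL, 2); /* skip rescaling to the unit hypercube */
  nlopt_set_lower_bounds(opt, lb);
  nlopt_set_upper_bounds(opt, ub);
  nlopt_set_min_objective(opt, myfunc, NULL);
  nlopt_set_maxeval(opt, 10000); /* give the global search an explicit evaluation budget */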

Finally, NLopt also includes separate implementations based on the original Fortran code by Gablonsky et al., which are specified as NLOPT_GN_ORIG_DIRECT and NLOPT_GN_ORIG_DIRECT_L. These implementations have a number of hard-coded limitations on things like the number of function evaluations; I removed several of these limitations, but some remain.

Most of the above algorithms only handle bound constraints, and in fact require finite bound constraints (they are not applicable to unconstrained problems). They do not handle arbitrary nonlinear constraints. However, the ORIG versions by Gablonsky et al. include some support for arbitrary constraints, if you represent constraint violations by having your objective function return NaN or infinity (HUGE_VAL, in C).
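
A hedged sketch of that convention (the constraint below, x0 + x1 ≤ 1, is invented for illustration): the objective itself returns HUGE_VAL at any infeasible point, and the ORIG DIRECT implementations then steer away from that region.

  #include <math.h>   /* HUGE_VAL */
  #include <nlopt.h>

  /* illustrative objective for NLOPT_GN_ORIG_DIRECT: reject points that
     violate the made-up constraint x0 + x1 <= 1 by returning HUGE_VAL */
  static double constrained_objective(unsigned n, const double *x, double *grad, void *data)
  {
      (void) n; (void) grad; (void) data;
      if (x[0] + x[1] > 1.0)
          return HUGE_VAL;              /* signal a constraint violation */
      return x[0] * x[0] + x[1] * x[1]; /* ordinary objective value otherwise */
  }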

Controlled Random Search (CRS) with local mutation

Multi-level single-linkage (MLSL)

StoGO

Local derivative-free optimization

COBYLA (Constrained Optimization BY Linear Approximations)

NEWUOA + bound constraints

PRAXIS (PRincipal AXIS)

Nelder-Mead Simplex

Sbplx (based on Subplex)

Local gradient-based optimization

MMA (Method of Moving Asymptotes)

Low-storage BFGS

Preconditioned truncated Newton

Shifted limited-memory variable-metric
