NLopt Introduction

From AbInitio

Revision as of 02:30, 12 November 2008


In this chapter of the manual, we begin with a general overview of the optimization problems that NLopt solves and the key distinctions between different types of optimization algorithms, and we comment on ways to cast various problems in the form NLopt requires. We also describe the background and goals of NLopt.

Optimization problems

NLopt addresses general nonlinear optimization problems of the form:

\min_{\mathbf{x}\in\mathbb{R}^n} f(\mathbf{x}),

where f is the objective function and x represents the n optimization parameters. This problem may be subject to the bound constraints

lb_i \leq x_i \leq ub_i for i=1,\ldots,n

given lower bounds lb and upper bounds ub (which may be −∞ and/or +∞, respectively, for partially or totally unconstrained problems). One may also optionally have m nonlinear inequality constraints

fc_i(\mathbf{x}) \leq 0 for i=1,\ldots,m.

A point x that satisfies all of the bound and inequality constraints is called a feasible point, and the set of all feasible points is the feasible region.

Note: in this introduction, we follow the usual mathematical convention of letting our indices begin with one. In the C programming language, however, NLopt follows C's zero-based conventions (e.g. i goes from 0 to m−1 for the constraints).
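The definitions above can be made concrete with a small sketch. The helper below is not part of NLopt; it is a hypothetical illustration of what it means for a point to be feasible: every bound constraint lb[i] ≤ x[i] ≤ ub[i] must hold, and every inequality constraint fc_i(x) ≤ 0 must hold. Note the zero-based indexing, matching NLopt's C convention.

```python
import math

def is_feasible(x, lb, ub, constraints):
    """Return True if x lies in the feasible region."""
    # Bound constraints; lb[i] may be -inf and ub[i] may be +inf
    # for partially or totally unconstrained parameters.
    for i in range(len(x)):
        if not (lb[i] <= x[i] <= ub[i]):
            return False
    # Nonlinear inequality constraints fc_i(x) <= 0, i = 0, ..., m-1.
    return all(fc(x) <= 0 for fc in constraints)

# Example: x in [0, +inf)^2 subject to the single constraint x0 + x1 - 3 <= 0.
lb = [0.0, 0.0]
ub = [math.inf, math.inf]
constraints = [lambda x: x[0] + x[1] - 3.0]

print(is_feasible([1.0, 1.0], lb, ub, constraints))  # True: inside the region
print(is_feasible([2.0, 2.0], lb, ub, constraints))  # False: violates x0 + x1 <= 3
```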

Global versus local optimization

NLopt includes algorithms to attempt either global or local optimization of the objective.

Global optimization is the problem of finding the feasible point x that minimizes the objective f(x) over the entire feasible region. In general, this can be a very difficult problem, becoming exponentially harder as the number n of parameters increases. In fact, unless special information about f is known, it is not even possible to be certain whether one has found the true global optimum, because there might be a sudden dip of f hidden somewhere in the parameter space you haven't looked at yet. However, NLopt includes several global optimization algorithms that work well on reasonably well-behaved problems, if the dimension n is not too large.

Local optimization is the much easier problem of finding a feasible point x that is only a local minimum: f(x) is less than or equal to the value of f for all nearby feasible points (the intersection of the feasible region with at least some small neighborhood of x). In general, a nonlinear optimization problem may have many local minima, and which one is located by an algorithm typically depends upon the starting point that the user supplies to the algorithm. Local optimization algorithms, on the other hand, can often quickly locate a local minimum even in very high-dimensional problems (especially using gradient-based algorithms).
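The dependence on the starting point can be seen in a few lines. This is a toy sketch, not NLopt code: the function below has two local minima, and plain gradient descent started on either side of the hump converges to a different one.

```python
def grad_descent(x0, grad, lr=0.01, steps=2000):
    """Minimal gradient descent: repeatedly step downhill along -grad."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# f(x) = (x^2 - 1)^2 + 0.3 x has one local minimum near x = -1 and another
# near x = +1; its derivative is f'(x) = 4 x (x^2 - 1) + 0.3.
grad = lambda x: 4.0 * x * (x * x - 1.0) + 0.3

left = grad_descent(-2.0, grad)   # converges to the minimum near -1
right = grad_descent(+2.0, grad)  # converges to the minimum near +1

print(left, right)  # two different local minima of the same function
```

Both answers are legitimate local minima; which one you get is purely a consequence of where the search began.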

There is a special class of convex optimization problems, for which both the objective and constraint functions are convex, where there is only one local minimum, so that a local optimization method finds the global optimum. Typically, convex problems arise from functions of special analytical forms, such as linear programming, semidefinite programming, quadratic programming, and so on, and specialized techniques are available to solve these problems very efficiently. NLopt includes only general methods that do not assume convexity; if you have a convex problem you may be better off with a different software package, such as the CVX package from Stanford.
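By contrast with the multimodal case, a convex objective has no other minimum to fall into. In this assumed example (not from the NLopt manual), the same naive gradient descent reaches the unique minimum of the convex quadratic f(x, y) = (x − 1)² + 2(y + 2)² from two very different starting points.

```python
def grad_descent(p0, grad, lr=0.05, steps=500):
    """Minimal 2-D gradient descent."""
    x, y = p0
    for _ in range(steps):
        gx, gy = grad(x, y)
        x, y = x - lr * gx, y - lr * gy
    return x, y

# Gradient of the convex quadratic f(x, y) = (x - 1)^2 + 2 (y + 2)^2,
# whose unique minimum is at (1, -2).
grad = lambda x, y: (2.0 * (x - 1.0), 4.0 * (y + 2.0))

a = grad_descent((10.0, 10.0), grad)
b = grad_descent((-7.0, 3.0), grad)
print(a, b)  # both approach (1, -2): convexity leaves no other minimum
```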

Gradient-based versus derivative-free algorithms

Equivalent formulations of optimization problems

Background and goals of NLopt
