In mathematics, the term optimization refers to the study of problems that have the form

Given: a function f : A → R from some set A to the real numbers
Sought: an element x0 in A such that f(x0) ≤ f(x) for all x in A ("minimization") or such that f(x0) ≥ f(x) for all x in A ("maximization").

Such a formulation is sometimes called a mathematical program (a term not directly related to computer programming, but still in use, for example, in linear programming; see History below). Many real-world and theoretical problems can be modeled in this general framework.

Typically, A is some subset of the Euclidean space Rⁿ, often specified by a set of constraints: equalities or inequalities that the members of A have to satisfy. The function f is called an objective function or cost function. A feasible solution that minimizes (or maximizes, if that is the goal) the objective function is called an optimal solution.

The domain A of f is called the search space, while the elements of A are called candidate solutions or feasible solutions.

In general, there will be several local minima and maxima, where a local minimum x* is defined as a point such that for some δ > 0 and all x with

    ‖x − x*‖ ≤ δ,

the formula

    f(x*) ≤ f(x)

holds; that is to say, on some ball around x* all of the function values are greater than or equal to the value at that point. Local maxima are defined similarly. In general, it is relatively easy to find local minima; additional facts about the problem (e.g. the function being convex) are required to ensure that the solution found is a global minimum.
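
A quick numerical illustration (a minimal sketch: the quartic f and the starting guesses below are invented for this example, and SciPy's general-purpose local optimizer stands in for any local method):

    # f(x) = x**4 - 3*x**2 + x has two local minima; only the one near
    # x = -1.30 is the global minimum.
    from scipy.optimize import minimize

    f = lambda x: x[0]**4 - 3*x[0]**2 + x[0]

    for x0 in (-2.0, 2.0):            # two different starting guesses
        res = minimize(f, [x0])       # local method (BFGS by default)
        print(f"start {x0:+.1f} -> x* = {res.x[0]:+.4f}, f = {res.fun:+.4f}")

    # The runs converge to different stationary points; without extra
    # structure such as convexity, neither certifies a global minimum.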

Notation

Optimization problems are often expressed with special notation. Here are some examples:

    min_{x ∈ R} (x² + 1)

This asks for the minimum value of the expression x² + 1, where x ranges over the real numbers R. The minimum value in this case is 1, occurring at x = 0.

    max_{x ∈ R} 2x

This asks for the maximum value of the expression 2x, where x ranges over the reals. In this case there is no such maximum, as the expression is unbounded above, so the answer is "infinity" or "undefined".

    argmin_{x ∈ (−∞, −1]} (x² + 1)

This asks for the value(s) of x in the interval (−∞, −1] that minimize the expression x² + 1. (The actual minimum value of that expression does not matter.) In this case, the answer is x = −1.

    argmax_{x ∈ [−5, 5], y ∈ R} x·cos(y)

This asks for the (x, y) pair(s) that maximize the value of the expression x·cos(y), with the added constraint that x lies in the interval [−5, 5]. (Again, the actual maximum value of the expression does not matter.) In this case, the solutions are the pairs of the form (5, 2πk) and (−5, (2k + 1)π), where k ranges over all integers.
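
A numerical sanity check of these examples (a sketch only; SciPy's local solvers are used purely for illustration, and the unbounded max 2x example is omitted since no numerical solver can return "infinity"):

    import numpy as np
    from scipy.optimize import minimize

    # min over x in R of x**2 + 1  ->  value 1 at x = 0
    res = minimize(lambda x: x[0]**2 + 1, [3.0])
    print(res.x, res.fun)                      # ~[0.], ~1.0

    # argmin over x in (-inf, -1] of x**2 + 1  ->  x = -1
    res = minimize(lambda x: x[0]**2 + 1, [-3.0],
                   bounds=[(None, -1.0)], method="L-BFGS-B")
    print(res.x)                               # ~[-1.]

    # argmax of x*cos(y) over x in [-5, 5]: maximize by minimizing the
    # negation; this run finds the (5, 0) member of the solution family.
    res = minimize(lambda v: -v[0]*np.cos(v[1]), [4.0, 0.5],
                   bounds=[(-5, 5), (None, None)], method="L-BFGS-B")
    print(res.x, -res.fun)                     # ~[5., 0.], ~5.0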

Major subfields

  • Linear programming studies the case in which the objective function f is linear and the set A is specified using only linear equalities and inequalities (a small numerical sketch follows this list).
  • Integer programming studies linear programs in which some or all variables are constrained to take on integer values.
  • Quadratic programming allows the objective function to have quadratic terms, while the set A must be specified with linear equalities and inequalities.
  • Nonlinear programming studies the general case in which the objective function or the constraints or both contain nonlinear parts.
  • Stochastic programming studies the case in which some of the constraints depend on random variables.
  • Dynamic programming studies the case in which the optimization strategy is based on splitting the problem into smaller subproblems.
  • Combinatorial optimization is concerned with problems where the set of feasible solutions is discrete or can be reduced to a discrete one.
  • Infinite-dimensional optimization studies the case when the set of feasible solutions is a subset of an infinite-dimensional space, such as a space of functions.
  • Constraint satisfaction studies the case in which the objective function f is constant (an important case in artificial intelligence, particularly in automated reasoning).
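
As a concrete example of the first subfield (a minimal sketch; the particular LP is invented for illustration and solved with SciPy's linprog):

    from scipy.optimize import linprog

    # maximize x0 + 2*x1  subject to  x0 + x1 <= 4, x0 + 3*x1 <= 6, x >= 0;
    # linprog minimizes, so the objective is negated.
    c    = [-1.0, -2.0]
    A_ub = [[1.0, 1.0],
            [1.0, 3.0]]
    b_ub = [4.0, 6.0]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, -res.fun)        # optimal vertex [3., 1.], value 5.0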

Techniques

For twice-differentiable functions, unconstrained problems can be solved by finding the points where the gradient of the objective function is zero (that is, the stationary points) and using the Hessian matrix to classify each point. If the Hessian is positive definite, the point is a local minimum; if negative definite, a local maximum; and if indefinite, some kind of saddle point.
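
A small sketch of this second-derivative test (the function and its hand-computed Hessian are chosen purely for illustration):

    import numpy as np

    # Stationary point of f(x, y) = x**2 - y**2 at the origin; the
    # Hessian is constant and entered by hand here.
    H = np.array([[2.0,  0.0],     # [d2f/dx2,  d2f/dxdy]
                  [0.0, -2.0]])    # [d2f/dydx, d2f/dy2 ]

    eig = np.linalg.eigvalsh(H)
    if np.all(eig > 0):
        kind = "local minimum"
    elif np.all(eig < 0):
        kind = "local maximum"
    else:
        kind = "saddle point"
    print(eig, kind)               # [-2.  2.] saddle point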

One can find the stationary points by starting with a guess for a stationary point and then iterating towards it using methods such as the following (a minimal sketch of gradient descent appears after the list):

  • gradient descent
  • Newton's method
  • conjugate gradient
  • line search
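
A minimal gradient-descent sketch (the objective, step size, and stopping tolerance are all illustrative choices, not canonical ones):

    # Gradient descent on f(x) = (x - 3)**2, whose only stationary
    # point is the global minimum at x = 3.
    def gradient_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=10_000):
        x = x0
        for _ in range(max_iter):
            g = grad(x)
            if abs(g) < tol:     # gradient ~ 0: stationary point found
                break
            x -= lr * g          # step in the direction of steepest descent
        return x

    grad_f = lambda x: 2.0 * (x - 3.0)
    print(gradient_descent(grad_f, x0=0.0))   # ~3.0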

Should the objective function be convex over the region of interest, then any local minimum will also be a global minimum. There exist robust, fast numerical techniques for optimizing twice-differentiable convex functions.

Constrained problems can often be transformed into unconstrained problems with the help of Lagrange multipliers.
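
A minimal worked instance (the particular problem, maximizing x·y subject to x + y = 1, is invented for illustration; SymPy does the symbolic work):

    import sympy as sp

    x, y, lam = sp.symbols("x y lambda", real=True)
    L = x*y - lam*(x + y - 1)          # the Lagrangian

    # Stationarity of L in x, y, and lambda recovers the constraint and
    # the first-order conditions; here x = y = 1/2.
    sols = sp.solve([sp.diff(L, x), sp.diff(L, y), sp.diff(L, lam)],
                    [x, y, lam], dict=True)
    print(sols)                        # [{lambda: 1/2, x: 1/2, y: 1/2}]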


Uses

Problems in rigid body dynamics (in particular articulated rigid body dynamics) often require mathematical programming techniques, since rigid body dynamics can be viewed as attempting to solve an ordinary differential equation on a constraint manifold; the constraints are various nonlinear geometric constraints such as "these two points must always coincide", "this surface must not penetrate any other", or "this point must always lie somewhere on this curve". Also, the problem of computing contact forces can be addressed by solving a linear complementarity problem, which can also be viewed as a quadratic programming (QP) problem; a toy sketch follows.
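
A toy quadratic program in that spirit (a sketch only; the matrices are invented, and SciPy's general-purpose minimize stands in for the dedicated QP/LCP solvers a physics engine would actually use):

    import numpy as np
    from scipy.optimize import minimize

    # minimize 0.5*x^T Q x + c^T x subject to x >= 0 (nonnegative
    # "contact force" variables in a contact-problem reading).
    Q = np.array([[2.0, 0.5],
                  [0.5, 1.0]])         # symmetric positive definite
    c = np.array([-1.0, -1.0])

    obj = lambda x: 0.5 * x @ Q @ x + c @ x
    res = minimize(obj, x0=np.zeros(2), bounds=[(0, None), (0, None)])
    print(res.x, res.fun)              # interior optimum here: ~[0.286, 0.857]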

Many design problems can also be expressed as optimization programs. This application is called design optimization. One recent and growing subset of this field is multidisciplinary design optimization, which, while useful in many problems, has in particular been applied to aerospace engineering problems.

Another field that uses optimization techniques extensively is operations research.

History

Historically, the first term to be introduced was linear programming, which was invented by George Dantzig in the 1940s. The term programming in this context does not refer to computer programming (although computers are nowadays used extensively to solve mathematical programs). Instead, the term comes from the use of program by the United States military to refer to proposed training and logistics schedules, which were the problems that Dantzig was studying at the time. (Additionally, later on, the use of the term "programming" was apparently important for receiving government funding, as it was associated with high-technology research areas that were considered important.)


This page uses Creative Commons Licensed content from Wikipedia.