Glossary

Ant Colony Optimization (ACO)
Population-based stochastic global optimization algorithm based on the behavior of ant colonies, particularly their ability to collectively determine shortest paths through the cumulative effect of pheromones.
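As an illustrative sketch (details differ between ACO variants), the classic Ant System pheromone update on a path segment (i, j) can be written as

    \tau_{ij} \leftarrow (1 - \rho)\,\tau_{ij} + \sum_k \Delta\tau_{ij}^k

where \rho is the evaporation rate and \Delta\tau_{ij}^k is the pheromone deposited by ant k, typically larger for shorter paths, so that short routes are progressively reinforced.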
Automatic Differentiation
A process for evaluating derivatives of a function that depends only on an algorithmic specification of the function, such as a computer program.
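A minimal sketch of the idea using forward-mode dual numbers in plain Python (illustrative only, not tied to any particular AD tool):

    class Dual(object):
        """Pairs a value with its derivative so arithmetic propagates both."""
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.dot + other.dot)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # Product rule applied automatically at each elementary operation.
            return Dual(self.val * other.val,
                        self.val * other.dot + self.dot * other.val)

        __rmul__ = __mul__

    def f(x):
        return 3 * x * x + 2 * x + 1

    # Seed the derivative of x to 1 to obtain df/dx alongside f(x).
    x = Dual(2.0, 1.0)
    y = f(x)
    print(y.val, y.dot)   # 17.0 and 14.0, i.e. f(2) and f'(2)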
Constraint
Restriction that the design variables must satisfy, typically denoted in a mathematical program standard form as an inequality, g(x) <= 0, or an equality, h(x) = 0.
Genetic algorithm (GA)
Population-based stochastic global optimization algorithm inspired by the mechanisms of genetics, evolution, and survival of the fittest.
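A toy generational loop for a minimization problem, sketching selection, crossover, and mutation (illustrative only, not representative of any particular GA driver):

    import random

    def genetic_algorithm(fitness, n_vars, pop_size=20, generations=100,
                          mutation_rate=0.1, lower=-5.0, upper=5.0):
        # Random initial population of real-valued candidate designs.
        pop = [[random.uniform(lower, upper) for _ in range(n_vars)]
               for _ in range(pop_size)]
        for _ in range(generations):
            # Survival of the fittest: keep the better half as parents.
            pop.sort(key=fitness)
            parents = pop[:pop_size // 2]
            children = []
            while len(parents) + len(children) < pop_size:
                a, b = random.sample(parents, 2)
                # Crossover: blend the genes of two parents.
                child = [(x + y) / 2.0 for x, y in zip(a, b)]
                # Mutation: randomly perturb some genes.
                child = [g + random.gauss(0.0, 0.5)
                         if random.random() < mutation_rate else g
                         for g in child]
                children.append(child)
            pop = parents + children
        return min(pop, key=fitness)

    # Example: minimize the sum of squares of two design variables.
    best = genetic_algorithm(lambda x: sum(v * v for v in x), n_vars=2)
    print(best)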
Global Maximum (or Minimum)
A feasible solution that maximizes (or minimizes) the value of the objective function over the entire feasible region of the design space.
Global Optimizer
Optimization method implementing an algorithm designed to find a globally optimal solution for various kinds of nonconvex programming problems.
Local Maximum (or Minimum)
A feasible solution that maximizes (or minimizes) the value of the objective function within a local neighborhood of that solution.
Lower Bound
A constraint that specifies a minimum feasible value of an individual design variable.
Numerical Optimization
Mathematical techniques and procedures used to make a system or design as effective and functional as possible.
Objective Function
The (real-valued) function to be optimized, typically denoted in a mathematical program standard form as f.
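Combining the objective, constraints, and bounds defined in this glossary, the standard form of the nonlinear programming problem can be written as

    \min_x f(x) \quad \text{subject to} \quad g(x) \le 0, \quad h(x) = 0, \quad x_L \le x \le x_U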
Particle Swarm Optimization (PSO)
Population-based stochastic global optimization algorithm based on the swarm behavior of animals, such as flocking birds and swarming bees.
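As an illustrative sketch of the basic update rule (particular variants differ), each particle i adjusts its velocity and position at iteration k according to

    v_i^{k+1} = w\, v_i^k + c_1 r_1 (p_i - x_i^k) + c_2 r_2 (p_g - x_i^k)
    x_i^{k+1} = x_i^k + v_i^{k+1}

where w is an inertia weight, c_1 and c_2 weight the attraction toward the particle's own best position p_i and the swarm's best position p_g, and r_1, r_2 are random numbers drawn from [0, 1].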
Sequential Linear Programming (SLP)
Gradient-based local optimization algorithm based on solving successive first-order approximations of the objective of a nonlinear programming problem, subject to a linearization of the constraints. The linear approximations are usually obtained from first-order Taylor expansions.
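Schematically, at the current iterate x_k the linear subproblem solved for the step d is

    \min_d \nabla f(x_k)^T d \quad \text{subject to} \quad g(x_k) + \nabla g(x_k)^T d \le 0, \quad h(x_k) + \nabla h(x_k)^T d = 0

usually together with move limits on d to keep the linearization valid.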
Sequential Quadratic Programming (SQP)
Gradient-based local optimization algorithm based on solving successive second-order approximations of the objective of a nonlinear programming problem, subject to a linearization of the constraints. The approximations are usually obtained from second-order Taylor expansions.
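Schematically, at the current iterate x_k the quadratic subproblem solved for the step d is

    \min_d \nabla f(x_k)^T d + \tfrac{1}{2} d^T B_k d \quad \text{subject to} \quad g(x_k) + \nabla g(x_k)^T d \le 0, \quad h(x_k) + \nabla h(x_k)^T d = 0

where B_k is typically a quasi-Newton approximation to the Hessian of the Lagrangian.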
Sequential Unconstrained Minimization Technique (SUMT)
Convex programming algorithm that converts the original constrained optimization problem into a sequence of unconstrained optimization problems whose optimal solutions converge to an optimal solution of the original problem.
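One common exterior penalty form of the unconstrained subproblems (an illustrative choice among several) is

    \min_x \; \phi(x, r_p) = f(x) + r_p \left[ \sum_j \max\{0, g_j(x)\}^2 + \sum_k h_k(x)^2 \right]

where the penalty parameter r_p is increased from one subproblem to the next so that the unconstrained minimizers approach the constrained optimum.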
Upper Bound
A constraint that specifies a maximum feasible value of an individual design variable.