Constrained Optimization with Lagrange Multipliers
Extreme points and saddle points are determined for functions of one, two, or more variables.
For functions of two or more variables, up to two constraints can be specified.
The method of Lagrange multipliers is used to find extreme points subject to constraints.
For this purpose, all first- and second-order partial derivatives of the objective function (or of the Lagrange function) as well as the bordered Hessian matrix are computed and displayed.
Lagrange Multipliers
If an optimization problem with constraints is to be solved,
an additional parameter, the Lagrange multiplier λ_i, is introduced for each constraint.
Each constraint is then rearranged so that one side of the equation equals 0.
The other side of the equation is multiplied by the associated multiplier λ_i and added to the objective function under study.
This yields the Lagrange function, which, in addition to the original variables, also depends on the introduced multipliers.
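As a minimal sketch (the example problem and all names are assumptions chosen for illustration, not the tool's implementation): for the objective f(x, y) = x·y with the single constraint x + y = 10, rearranged to 10 − x − y = 0, the Lagrange function is L(x, y, λ) = x·y + λ·(10 − x − y). In Python with sympy this construction could look as follows:

    import sympy as sp

    x, y, lam = sp.symbols('x y lambda', real=True)

    f = x * y                # objective function (assumed example)
    g = 10 - x - y           # constraint, rearranged so that one side equals 0

    # Lagrange function: objective plus multiplier times the zero-side expression
    L = f + lam * g
    print(L)                 # e.g. lambda*(-x - y + 10) + x*y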
Necessary condition
A necessary condition for a local extremum of an at least once partially differentiable function is
that all first-order partial derivatives with respect to its n variables equal 0.
This requirement thus yields a system of n equations.
This applies directly to optimization tasks without constraints.
But it also applies to the determination of extrema with constraints, now for the Lagrange function; note that its partial derivatives with respect to the multipliers simply reproduce the constraints, so the system has n + k equations in the n variables and k multipliers.
Solving this system of equations yields all of the so-called stationary points.
However, not every stationary point is an extremum.
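Continuing the assumed example from above, the stationary points can be found with sympy by setting all first-order partial derivatives of the Lagrange function to zero and solving the resulting system of three equations:

    import sympy as sp

    x, y, lam = sp.symbols('x y lambda', real=True)
    L = x * y + lam * (10 - x - y)    # Lagrange function from above

    # Necessary condition: all first-order partial derivatives equal 0.
    # dL/dlambda = 0 reproduces the constraint 10 - x - y = 0.
    eqs = [sp.diff(L, v) for v in (x, y, lam)]
    print(sp.solve(eqs, [x, y, lam], dict=True))   # [{x: 5, y: 5, lambda: 5}]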
Classification of stationary points
Stationary points (critical points) can be classified into minima (min), maxima (max), and saddle points (s.p.).
For a function f(x) of a single variable, one examines the sequence of n-th derivatives (n = 2, 3, 4, ...) at the stationary point x_s:
f''(x_s), f'''(x_s), f''''(x_s), ... .
Let n be the first order in this sequence for which f^(n)(x_s) ≠ 0 holds. Then:
If this order n is an odd number, there is a saddle point; if n is even, there is an extremum, for which the following applies:
If f^(n)(x_s) > 0, then there is a minimum at x_s.
If f^(n)(x_s) < 0, then there is a maximum at x_s.
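The following sketch (the function name and the examples are assumptions) implements this higher-order derivative test with sympy:

    import sympy as sp

    x = sp.symbols('x', real=True)

    def classify_1d(f, xs, max_order=8):
        # Find the first n >= 2 with f^(n)(xs) != 0 and apply the test above.
        for n in range(2, max_order + 1):
            val = sp.diff(f, x, n).subs(x, xs)
            if val != 0:
                if n % 2 == 1:
                    return 'saddle point'
                return 'minimum' if val > 0 else 'maximum'
        return 'undecided up to max_order'

    print(classify_1d(x**4, 0))   # minimum (first nonzero derivative at n = 4, value 24 > 0)
    print(classify_1d(x**3, 0))   # saddle point (first nonzero derivative at n = 3, odd)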
In the case of a function of several variables without constraints, one examines the n×n Hessian matrix H.
It consists of all second-order partial derivatives of the objective function with respect to the n variables: h_{i,k} = ∂²f / (∂x_i ∂x_k).
If H(x_s) has only positive eigenvalues, x_s is a minimum.
If H(x_s) has only negative eigenvalues, x_s is a maximum.
If H(x_s) has both negative and positive eigenvalues and no eigenvalue 0, x_s is a saddle point.
If H(x_s) has 0 as an eigenvalue and otherwise only positive eigenvalues, x_s is a saddle point or a minimum.
If H(x_s) has 0 as an eigenvalue and otherwise only negative eigenvalues, x_s is a saddle point or a maximum.
If H(x_s) has 0 as an n-fold eigenvalue, no classification can be made for x_s.
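As a sketch (the example function is an assumption), this eigenvalue test can be carried out with sympy:

    import sympy as sp

    x, y = sp.symbols('x y', real=True)
    f = x**2 - y**2                          # assumed example with a saddle point at (0, 0)

    H = sp.hessian(f, (x, y))                # Hessian matrix of second-order partials
    eig = H.subs({x: 0, y: 0}).eigenvals()   # {2: 1, -2: 1}

    if all(ev > 0 for ev in eig):
        print('minimum')
    elif all(ev < 0 for ev in eig):
        print('maximum')
    elif 0 not in eig:
        print('saddle point')                # mixed signs, no zero eigenvalue
    else:
        print('zero eigenvalue: no decision by this test alone')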
In the case of a function of several variables with constraints, one uses the bordered Hessian matrix.
The bordered Hessian matrix consists of all second-order partial derivatives of the Lagrange function with respect to the n variables and the k introduced Lagrange multipliers.
With k constraints and n unknowns, it therefore has order m×m, with m = n + k.
The k additional rows and columns, compared to an ordinary Hessian matrix, form a border around the underlying Hessian matrix.
For the classification, one examines the signs of the last n − k leading principal minors of the bordered Hessian matrix (those of order 2k + 1 up to n + k): if they all carry the sign of (−1)^k, the point is a minimum; if they alternate in sign, with the full determinant carrying the sign of (−1)^n, the point is a maximum.
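Continuing the assumed example (f(x, y) = x·y, constraint 10 − x − y = 0, stationary point x = y = λ = 5, so n = 2 and k = 1), a sketch of this test with sympy:

    import sympy as sp

    x, y, lam = sp.symbols('x y lambda', real=True)
    L = x * y + lam * (10 - x - y)            # Lagrange function from above

    # Bordered Hessian: multiplier first, so the border sits in the top-left corner
    Hb = sp.hessian(L, (lam, x, y)).subs({x: 5, y: 5, lam: 5})

    n, k = 2, 1
    # Leading principal minors of orders 2k+1, ..., n+k (here only order 3)
    for order in range(2 * k + 1, n + k + 1):
        print(order, Hb[:order, :order].det())   # 3 2

    # The single minor is +2, the sign of (-1)**n, so (5, 5) is a constrained maximum.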