In the case of 2 or more variables, you can specify up to 2 constraints.

The method of Lagrange multipliers is used to determine the extrema of a function subject to constraints.

For this purpose, all first- and second-order partial derivatives of the objective function (or of the Lagrange function) as well as the bordered Hessian matrix are computed and displayed.

The constraints are first rearranged so that one side of each equation equals 0.

The other side of each equation is multiplied by the associated multiplier λ and added to the objective function.

This gives the Lagrange function, which, in addition to the original variables, also depends on the introduced multipliers.
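Written out, this construction gives (notation mine; g_i denotes the side of the i-th rearranged constraint that is not 0):

```latex
L(x_1,\dots,x_n,\lambda_1,\dots,\lambda_k)
  \;=\; f(x_1,\dots,x_n) \;+\; \sum_{i=1}^{k} \lambda_i \, g_i(x_1,\dots,x_n)
```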

A necessary condition for an extremum is that all first-order partial derivatives of the function with respect to all its n variables equal 0.

This requirement thus provides a system of n equations.

This applies to optimization tasks with constraints, where the Lagrange function is differentiated, but it also applies to the determination of extrema of a function without constraints.

If you solve this system of equations, you get all the so-called stationary points.
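As a minimal sketch of this step (using SymPy as the computer algebra system; the objective function f and constraint g below are illustrative examples, not taken from the text), the system of first-order conditions can be set up and solved like this:

```python
# Sketch: find the stationary points of a Lagrange function with SymPy.
# f and g are illustrative examples (assumptions, not from the original text).
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)

f = x**2 + y**2          # example objective function
g = x + y - 1            # example constraint, rearranged so one side equals 0

# Lagrange function: objective plus multiplier times constraint
L = f + lam * g

# Necessary condition: all first-order partial derivatives equal 0
eqs = [sp.diff(L, v) for v in (x, y, lam)]
stationary = sp.solve(eqs, (x, y, lam), dict=True)
print(stationary)   # [{x: 1/2, y: 1/2, lam: -1}]
```

Note that differentiating with respect to the multiplier simply reproduces the constraint, so the constraint is automatically part of the system.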

But not every stationary point is an extremum.

In the case of a function f(x) of only one variable, the higher derivatives at the stationary point x₀ are examined:

f''(x₀), f'''(x₀), f''''(x₀), …

Let n be the order of the first derivative in this sequence for which f^(n)(x₀) ≠ 0.

If this order n is an odd number, then there is a saddle point; otherwise there is an extremum, for which the following applies:

If f^(n)(x₀) < 0, it is a maximum.

If f^(n)(x₀) > 0, it is a minimum.
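A minimal sketch of this higher-derivative test in one variable, using SymPy (the function x⁴ and the stationary point 0 are illustrative examples, not from the text):

```python
# Sketch: higher-derivative test at a stationary point of a one-variable
# function. f = x**4 is an illustrative example (an assumption).
import sympy as sp

x = sp.symbols('x', real=True)
f = x**4
x0 = 0   # stationary point: f'(0) = 0

# Find the smallest order n >= 2 with f^(n)(x0) != 0
n = 2
d = sp.diff(f, x, n)
while d.subs(x, x0) == 0:
    n += 1
    d = sp.diff(f, x, n)

value = d.subs(x, x0)
if n % 2 == 1:
    kind = 'saddle point'   # odd order: no extremum
elif value > 0:
    kind = 'minimum'        # even order, positive derivative
else:
    kind = 'maximum'        # even order, negative derivative
print(n, kind)   # 4 minimum
```

For x⁴ the second and third derivatives vanish at 0, so the first nonzero derivative has order n = 4; since 4 is even and f''''(0) = 24 > 0, the point is a minimum.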

In the case of a function of several variables without constraints, the Hessian matrix at the stationary point is examined.

This consists of all possible second-order partial derivatives of the objective function with respect to the n variables: h_ij = ∂²f/(∂x_i ∂x_j).

If the Hessian matrix at the stationary point is positive definite (all eigenvalues positive), there is a minimum.

If it is negative definite (all eigenvalues negative), there is a maximum.

If it is indefinite (eigenvalues of both signs), there is a saddle point.

If it is positive semidefinite but not positive definite, no decision is possible; there may be a minimum or a saddle point.

If it is negative semidefinite but not negative definite, no decision is possible; there may be a maximum or a saddle point.
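A sketch of this eigenvalue-based classification with SymPy (the function x² − y² and its stationary point (0, 0) are illustrative examples, not from the text):

```python
# Sketch: classify a stationary point via the eigenvalues of the Hessian.
# f is an illustrative example (an assumption, not from the original text).
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = x**2 - y**2          # stationary point at (0, 0)

H = sp.hessian(f, (x, y))
eigs = [ev.subs({x: 0, y: 0}) for ev in H.eigenvals()]

if all(ev > 0 for ev in eigs):
    kind = 'minimum'          # Hessian positive definite
elif all(ev < 0 for ev in eigs):
    kind = 'maximum'          # Hessian negative definite
elif any(ev > 0 for ev in eigs) and any(ev < 0 for ev in eigs):
    kind = 'saddle point'     # Hessian indefinite
else:
    kind = 'no decision'      # Hessian semidefinite
print(kind)   # saddle point
```

Here the Hessian is diag(2, −2), so it has eigenvalues of both signs and (0, 0) is a saddle point.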

In the case of a function of several variables with constraints, the bordered Hessian matrix is examined.

The bordered Hessian matrix consists of all possible second-order partial derivatives of the Lagrange function with respect to the n variables and the k introduced Lagrange multipliers.

With k constraints and n unknowns, it therefore has the order m×m, with m = k+n.

The k additional rows and columns, compared with an ordinary Hessian matrix, form a border around the underlying actual Hessian matrix.
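In block form (notation mine), with the multiplier rows and columns placed first, the bordered Hessian reads:

```latex
\bar H \;=\;
\begin{pmatrix}
0_{k\times k} & J_g \\
J_g^{\top}    & H_L
\end{pmatrix},
\qquad
J_g = \left(\frac{\partial g_i}{\partial x_j}\right)_{k\times n},\quad
H_L = \left(\frac{\partial^2 L}{\partial x_i\,\partial x_j}\right)_{n\times n}
```

The zero block arises because the Lagrange function is linear in the multipliers, and the border J_g is the Jacobian of the constraints.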

For the classification, the signs of the leading principal minors of the bordered Hessian matrix are examined: if the last n − k of them all have the sign of (−1)^k, the point is a constrained minimum; if they alternate in sign, with the determinant of the full matrix having the sign of (−1)^n, it is a constrained maximum.
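A sketch of this check for the simplest case k = 1, n = 2, built with SymPy (f, g, and the stationary point are the same illustrative examples as above, i.e. assumptions, not from the text); here only the last n − k = 1 leading principal minor, the full determinant, has to be inspected:

```python
# Sketch: bordered Hessian for one constraint (k = 1) in two variables (n = 2).
# f and g are illustrative examples (assumptions, not from the original text).
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)
f = x**2 + y**2
g = x + y - 1                      # constraint, rearranged so one side is 0
L = f + lam * g

vars_ = (lam, x, y)                # multiplier first: border on top and left
H = sp.Matrix(3, 3, lambda i, j: sp.diff(L, vars_[i], vars_[j]))

# Evaluate at the stationary point x = y = 1/2, lam = -1
H0 = H.subs({x: sp.Rational(1, 2), y: sp.Rational(1, 2), lam: -1})

# With k = 1, n = 2 there is one minor to check: det(H0). A sign equal to
# (-1)^k = -1 indicates a constrained minimum.
print(H0.det())   # -4
```

The determinant is negative, matching (−1)^1, so the stationary point (1/2, 1/2) is indeed a minimum of x² + y² on the line x + y = 1.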
