The Second Derivative Test and Eigenvalues of the Hessian, Real Analysis II
In this video, we begin our study of how to optimize scalar-valued functions of several variables of class C^2. We assume the function is twice continuously differentiable on its domain, which need not be open, since optimization often involves considering behavior on the boundary.
The first question is: where do local maxima and minima occur? We find critical points in the interior of the domain using the gradient, extending the first derivative test from single-variable calculus. This gives the points where the function may attain local extreme values.
Then using the second-order Taylor polynomial, we see how the Hessian matrix captures the local curvature of the function near a critical point. The second derivative test is based on the idea that close to a critical point, the behavior of the function is dominated by the quadratic part of the Taylor expansion.
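As a quick numerical illustration (not part of the lecture itself), here is a NumPy sketch of this dominance: for a hypothetical example function f(x, y) = e^(x^2 - y^2), which has a critical point at the origin with Hessian diag(2, -2), the quadratic Taylor term already matches f very closely for small displacements.

```python
import numpy as np

# Near a critical point a, the gradient term vanishes, so
#   f(a + h) ≈ f(a) + (1/2) h^T H(a) h.
# Example (chosen for illustration): f(x, y) = exp(x^2 - y^2),
# critical point at the origin, where f(0,0) = 1 and H = [[2, 0], [0, -2]].
f = lambda x, y: np.exp(x**2 - y**2)
H = np.array([[2.0, 0.0],
              [0.0, -2.0]])

h = np.array([0.01, 0.02])                 # a small displacement from the origin
quad = f(0, 0) + 0.5 * h @ H @ h           # second-order Taylor value
print(abs(f(*h) - quad))                   # error is tiny compared with |h|^2
```

The approximation error shrinks faster than the quadratic term itself as h → 0, which is exactly why the quadratic part determines the local behavior at a nondegenerate critical point.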
By analyzing the associated quadratic form and applying results from linear algebra (notably the spectral theorem), we learn how to classify critical points as local maxima, local minima, or saddle points based on the signs of the eigenvalues of the Hessian. (This part of the lecture reviews how symmetric matrices can be diagonalized using an orthonormal basis of eigenvectors, something you might've seen in linear algebra. If you haven't... hopefully you will soon!)
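The spectral theorem is easy to check numerically. The following sketch (an illustration, not lecture code) uses a made-up symmetric matrix A and verifies that its eigenvectors form an orthogonal matrix Q with A = Q diag(λ) Qᵀ.

```python
import numpy as np

# Spectral theorem for symmetric matrices: A = Q diag(lambda) Q^T,
# where the columns of Q are orthonormal eigenvectors of A.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # an arbitrary symmetric example

lam, Q = np.linalg.eigh(A)          # eigh is for symmetric/Hermitian matrices

print(np.allclose(Q.T @ Q, np.eye(2)))         # Q is orthogonal
print(np.allclose(Q @ np.diag(lam) @ Q.T, A))  # spectral decomposition holds
print(lam)                                     # real eigenvalues, ascending
```

In the eigenvector coordinates, the quadratic form hᵀAh becomes a weighted sum of squares λ₁u₁² + λ₂u₂², which is why the signs of the eigenvalues decide the shape of the graph near a critical point.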
Along the way, we work through several examples and use MATLAB visualizations to compare the function and its second-order approximation. These help us understand why the Hessian gives conclusive results in many cases—and why semidefinite cases are inconclusive.
Summary of the Second Derivative Test (Multivariable):
1. Find Critical Points: Locate points where the gradient of the function is zero.
2. Compute the Hessian: Calculate the Hessian matrix (the matrix of second partial derivatives) at each critical point.
3. Analyze the Hessian:
* Positive Definite Hessian: If all eigenvalues of the Hessian are positive, the critical point is a local minimum.
* Negative Definite Hessian: If all eigenvalues of the Hessian are negative, the critical point is a local maximum.
* Indefinite Hessian: If the Hessian has both positive and negative eigenvalues, the critical point is a saddle point.
Semidefinite Cases:
If the Hessian has a zero eigenvalue, the test is inconclusive: the second derivative test fails to provide a definitive answer, and further analysis is required to determine the nature of the critical point. With a singular Hessian it is still possible to have a local min (f(x,y) = x^4 + y^4), a local max (f(x,y) = -x^4 - y^4), a saddle (f(x,y) = x^4 - y^4), a non-strict max or min (f(x,y) = x^4 or f(x,y) = -y^4), etc.
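The steps above can be sketched in a few lines of NumPy. This is an illustrative classifier, not code from the lecture; the example function f(x, y) = x^3 - 3x + y^2 and the helper names are my own choices. Its gradient (3x^2 - 3, 2y) vanishes at (1, 0) and (-1, 0).

```python
import numpy as np

def classify_critical_point(H, tol=1e-9):
    """Classify a critical point from its (symmetric) Hessian H via eigenvalue signs."""
    eigs = np.linalg.eigvalsh(H)            # real eigenvalues of a symmetric matrix
    if np.all(eigs > tol):
        return "local minimum"              # positive definite
    if np.all(eigs < -tol):
        return "local maximum"              # negative definite
    if np.any(eigs > tol) and np.any(eigs < -tol):
        return "saddle point"               # indefinite
    return "inconclusive (semidefinite)"    # some eigenvalue is (numerically) zero

# Example: f(x, y) = x^3 - 3x + y^2, critical points (1, 0) and (-1, 0).
def hessian(x, y):
    return np.array([[6.0 * x, 0.0],
                     [0.0,     2.0]])

print(classify_critical_point(hessian(1, 0)))    # local minimum
print(classify_critical_point(hessian(-1, 0)))   # saddle point
```

Note the tolerance: in floating point, an eigenvalue that is exactly zero in exact arithmetic may come out as something like 1e-16, so a strict sign check without `tol` would silently misclassify semidefinite cases.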
#mathematics #math #hessian #hessianmatrix #optimization #multivariablecalculus #realanalysis #appliedmathematics #CriticalPoints #GradientVector #HessianMatrix #TaylorApproximation #SpectralTheorem #Calculus3