Variational analysis


In mathematics, the term variational analysis usually denotes the combination and extension of methods from convex optimization and the classical calculus of variations to a more general theory. This includes the more general problems of optimization theory, as well as topics in set-valued analysis such as generalized derivatives.
In the Mathematics Subject Classification scheme, the field of "Set-valued and variational analysis" is coded by "49J53".

History

While this area of mathematics has a long history, the first use of the term "Variational analysis" in this sense was in an eponymous book by R. Tyrrell Rockafellar and Roger J-B Wets.

Existence of minima

A classical result is that a lower semicontinuous function on a compact set attains its minimum. Results from variational analysis such as Ekeland's variational principle allow us to extend this result to lower semicontinuous functions on non-compact but complete sets, provided that the function is bounded below, at the cost of adding a small perturbation to the function.
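A common statement of Ekeland's principle, sketched here with the usual textbook hypotheses and notation (not taken from a particular source), is the following: if $(X,d)$ is a complete metric space and $f : X \to \mathbb{R} \cup \{+\infty\}$ is lower semicontinuous, bounded below and not identically $+\infty$, then for every $\varepsilon > 0$, every $\lambda > 0$ and every point $x_0$ with $f(x_0) \le \inf_X f + \varepsilon$ there exists a point $\bar{x}$ satisfying

\[
f(\bar{x}) \le f(x_0), \qquad d(\bar{x}, x_0) \le \lambda, \qquad f(x) > f(\bar{x}) - \tfrac{\varepsilon}{\lambda}\, d(x, \bar{x}) \quad \text{for all } x \ne \bar{x}.
\]

In other words, the perturbed function $x \mapsto f(x) + \tfrac{\varepsilon}{\lambda}\, d(x, \bar{x})$ attains a strict minimum at $\bar{x}$, even though $f$ itself need not attain its infimum.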

Generalized derivatives

The classical theorem of Fermat says that if a differentiable function attains its minimum at an interior point of its domain, then its derivative must be zero at that point. For problems where a smooth function must be minimized subject to constraints that can be expressed as other smooth functions being equal to zero, the method of Lagrange multipliers, another classical result, gives necessary conditions in terms of the derivatives of the objective and constraint functions.
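As an illustration (the notation below is a sketch, not drawn from a particular source), consider minimizing a smooth $f : \mathbb{R}^n \to \mathbb{R}$ subject to constraints $g_1(x) = 0, \ldots, g_m(x) = 0$ with each $g_i$ smooth. In the unconstrained case Fermat's theorem asserts $\nabla f(x^*) = 0$ at an interior minimizer $x^*$; in the constrained case, under a suitable constraint qualification (for instance, linear independence of the gradients $\nabla g_i(x^*)$), the method of Lagrange multipliers provides numbers $\lambda_1, \ldots, \lambda_m$ with

\[
\nabla f(x^*) + \sum_{i=1}^{m} \lambda_i \, \nabla g_i(x^*) = 0 .
\]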
The ideas of these classical results can be extended to nondifferentiable convex functions by generalizing the notion of derivative to that of the subderivative. Further generalizations of the notion of derivative, such as the Clarke generalized gradient, allow these results to be extended to nonsmooth locally Lipschitz functions.
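For a convex function the appropriate generalized derivative is the subdifferential; a standard formulation (sketched here for a convex $f : \mathbb{R}^n \to \mathbb{R}$) defines

\[
\partial f(x) = \{\, v \in \mathbb{R}^n : f(y) \ge f(x) + \langle v, \, y - x \rangle \ \text{for all } y \in \mathbb{R}^n \,\},
\]

and the generalized Fermat rule then states that $x^*$ minimizes $f$ if and only if $0 \in \partial f(x^*)$. An analogous condition can be written with the Clarke generalized gradient for locally Lipschitz functions, although in that setting the condition is only necessary, not sufficient.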