Least-squares recovery methods provide a simple and practical way to approximate equations in nondivergence form, where the standard variational approach either fails or requires technically complex modifications. In this talk I will explain a new method based on the recovery of the Hessian and the gradient for equations for the unknown u : Ω → ℝ of the form A(x):D²u(x) + b(x)·∇u(x) − c(x)u(x) = f(x) in Ω and u|_{∂Ω} = r, (1) where the coefficients A, b and c satisfy the Cordes condition. The technique consists in introducing, both at the continuum and discrete stages, two extra unknowns for the gradient and the Hessian of u. This leads to a unified approach covering a Hessian variant and a Hessianless (also known as gradient-only) variant, extending previous work. Suitable function spaces and penalties in the cost functional must be crafted to ensure stability and convergence of the scheme, with a good approximation of the gradient and Hessian; this is useful, for example, in Newton–Raphson or Howard-type iterations for Hamilton–Jacobi–Bellman formulations of fully nonlinear elliptic equations such as the Monge–Ampère equation. I will also cover variations, including the Hessianless version, in which no Hessian recovery is required, which slims down the computational load. Our approach is developed independently of the Maximum Principle and allows the use of higher-degree elements.
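To fix ideas, here is a schematic form of a least-squares functional for such a recovery method; this is my own sketch, with illustrative penalty weights α, β and L²-type norms that are assumptions rather than the talk's exact choices. One seeks a triple (u, g, H) approximating (u, ∇u, D²u) by minimizing

```latex
% Schematic least-squares cost functional for the recovery approach to (1).
% The weights \alpha, \beta > 0 and the choice of L^2 norms are illustrative
% assumptions, not necessarily the spaces and penalties of the talk.
J(u, g, H)
  = \lVert A : H + b \cdot g - c\,u - f \rVert_{L^2(\Omega)}^2
  + \alpha \,\lVert g - \nabla u \rVert_{L^2(\Omega)}^2
  + \beta  \,\lVert H - \nabla g \rVert_{L^2(\Omega)}^2,
```

minimized over suitable spaces subject to u = r on ∂Ω. In the Hessianless (gradient-only) variant, the unknown H and its penalty term are dropped, reducing the computational load.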

Research Group:

Omar Lakkis

Schedule:

Monday, June 28, 2021 - 12:00

Location:

A-133 (hybrid: in presence and online)
