tl;dr - By performing gradient descent on the Moreau envelope with adaptive time steps, we can globally minimize nonconvex functions.


Many computing tasks can be posed as optimization problems, and the objective functions arising in real-world scenarios are often nonconvex and/or nondifferentiable. State-of-the-art methods for solving such problems typically guarantee convergence only to local minima. This work presents Hamilton-Jacobi-based Moreau Adaptive Descent (HJ-MAD), a zero-order algorithm with guaranteed convergence to global minima, assuming continuity of the objective function. The core idea is to compute gradients of the Moreau envelope of the objective (which is "piece-wise convex") with adaptive smoothing parameters. Gradients of the Moreau envelope (i.e., proximal operators) are approximated via the Hopf-Lax formula for the viscous Hamilton-Jacobi equation. Our numerical examples illustrate global convergence.
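The idea above can be sketched in a few lines of NumPy: sampling around the current iterate and softmin-weighting by the objective gives a Monte Carlo (Laplace-type) approximation of the Hopf-Lax formula, which yields an estimate of the proximal point and hence the Moreau envelope gradient. This is only an illustrative sketch, not the paper's implementation: the function names, the fixed smoothing parameters `t` and `delta`, the sample count, and the toy objective are all assumptions (HJ-MAD adapts `t` during the run, which is omitted here).

```python
import numpy as np

def hj_mad_step(f, x, t, delta=0.1, n_samples=2000, rng=None):
    """One HJ-MAD-style gradient step on the Moreau envelope of f.

    Sketch only: the proximal point is estimated via a Monte Carlo
    approximation of the Hopf-Lax formula for the viscous HJ equation.
    Parameter names and defaults are illustrative, not from the paper.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    # Sample y ~ N(x, delta*t*I); the Gaussian density supplies the
    # quadratic |y - x|^2 / (2t) term of the Moreau envelope integrand,
    # so the importance weight reduces to exp(-f(y)/delta).
    y = x + np.sqrt(delta * t) * rng.standard_normal((n_samples, x.size))
    vals = np.array([f(yi) for yi in y])
    w = np.exp(-(vals - vals.min()) / delta)       # stabilized softmin weights
    prox = (w[:, None] * y).sum(axis=0) / w.sum()  # ~ prox_{t f}(x)
    grad_envelope = (x - prox) / t                 # gradient of Moreau envelope
    return x - t * grad_envelope                   # step size t: move to prox

# Toy nonconvex 1-D objective: global minimum 0 at x = 0,
# with spurious local minima created by the sin^2 term.
f = lambda z: float((z**2 + 10.0 * np.sin(z)**2).sum())

rng = np.random.default_rng(0)
x = np.array([2.8])  # start inside a local basin away from the global minimum
for _ in range(20):
    # a large t smooths the envelope enough to step over local minima
    x = hj_mad_step(f, x, t=10.0, rng=rng)
print(x, f(x))
```

Note the role of the smoothing parameter: with `t` large, the Moreau envelope flattens spurious basins and the weighted average is dominated by samples near the global minimizer, which is why the iterate escapes the local basin at `x = 2.8`.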
Toy example illustration of HJ-MAD in action.


@article{heaton2022global,
    title={{Global Solutions to Nonconvex Problems by Evolution of Hamilton-Jacobi PDEs}},
    author={Heaton, Howard and Wu Fung, Samy and Osher, Stanley},
    journal={arXiv preprint arXiv:2202.11014},
    year={2022},
}