HJ-Prox
tl;dr - We give a formula for estimating proximal operators from (possibly noisy) observations of objective function values.
First-order optimization algorithms are widely used today. Two standard building blocks in these algorithms are proximal operators (proximals) and gradients. Although gradients can be computed for a wide array of functions, explicit proximal formulas are known only for limited classes of functions. We provide an algorithm, HJ-Prox, that accurately approximates such proximals. It is derived from a collection of relations between proximals, Moreau envelopes, Hamilton-Jacobi (HJ) equations, heat equations, and importance sampling. In particular, HJ-Prox smoothly approximates the Moreau envelope and its gradient; the smoothness can be adjusted to act as a denoiser. Our approach applies even when functions are accessible only through (possibly noisy) blackbox samples. We show numerically that HJ-Prox is effective on several examples.
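The sampling scheme behind HJ-Prox can be sketched in a few lines. The following is a hypothetical illustration based on the description above, not the authors' reference implementation: the function name, the sqrt(t * delta) sampling scale, and the parameter defaults are assumptions.

```python
import numpy as np

def hj_prox(f, x, t=1.0, delta=0.1, n_samples=1000, rng=None):
    """Monte Carlo estimate of prox_{tf}(x) using only objective samples.

    Draws Gaussian samples centered at x and returns a convex combination
    of them, weighting each sample by a softmax of its objective value
    (stabilized by subtracting the smallest value before exponentiating).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.atleast_1d(np.asarray(x, dtype=float))
    # Gaussian samples centered at x; the sqrt(t * delta) scale is an assumption
    y = x + np.sqrt(t * delta) * rng.standard_normal((n_samples, x.size))
    fy = np.array([f(yi) for yi in y])     # (possibly noisy) objective values
    w = np.exp(-(fy - fy.min()) / delta)   # softmax weights from f values
    w /= w.sum()
    return w @ y                           # convex combination of the samples
```

A quick sanity check: for f(y) = ||y||^2/2, the exact proximal is prox_{tf}(x) = x/(1+t), so the estimate should approach x/(1+t) as the number of samples grows and delta shrinks.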
A Hamilton-Jacobi-based Proximal Operator
arXiv Preprint
Video Overview of HJ-Prox

Example of the HJ-Prox algorithm using an increasing number of samples to form an estimate (green dot) of the proximal (red dot). Samples are drawn from a Gaussian distribution centered at the point where the proximal is evaluated, and a convex combination of these samples is used to form the estimate, with each sample's weight determined by its objective value.
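The behavior pictured above can be reproduced for a function with a known proximal. This is a minimal one-dimensional sketch under assumptions (the helper name, the sqrt(t * delta) sampling scale, and the parameter values are illustrative): for f(y) = |y| the exact proximal is soft-thresholding, sign(x) * max(|x| - t, 0), so the weighted-sample estimate should settle near that value as the sample count grows.

```python
import numpy as np

def estimate_prox(f, x, t, delta, n, rng):
    """One weighted-sample estimate of prox_{tf}(x) in one dimension."""
    y = x + np.sqrt(t * delta) * rng.standard_normal(n)  # Gaussian samples at x
    fy = np.array([f(yi) for yi in y])
    w = np.exp(-(fy - fy.min()) / delta)                 # weights from f values
    return float((w / w.sum()) @ y)                      # convex combination

# f(y) = |y|: the exact proximal is soft-thresholding, sign(x) * max(|x| - t, 0)
x, t, delta = 0.8, 0.5, 0.2
exact = np.sign(x) * max(abs(x) - t, 0.0)                # = 0.3 here
rng = np.random.default_rng(0)
for n in (10, 100, 1000, 10000):                         # more samples, better estimate
    print(n, estimate_prox(abs, x, t, delta, n, rng))
```

Note that delta controls the smoothing: a smaller delta concentrates the weights near the minimizer (reducing bias toward the exact proximal), at the cost of needing more samples for a stable estimate.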
HJ-Prox Slides
Download: HJ_Prox_Slides.pdf (PDF, 1 MB)
@article{osher2022hamilton,
  title={{A Hamilton-Jacobi-based Proximal Operator}},
  author={Osher, Stanley and Heaton, Howard and Wu Fung, Samy},
  journal={arXiv preprint arXiv:2211.12997},
  year={2022}
}