Mathematics Colloquia and Seminars

Restarts subject to approximate sharpness: A parameter-free and optimal scheme for first-order methods

Mathematics of Data & Decisions

Speaker: Ben Adcock, Simon Fraser University
Location: Zoom
Start time: Tue, May 2 2023, 12:10PM

Sharpness is a generic assumption in continuous optimization that bounds the distance to the set of minimizers in terms of the suboptimality in the objective function. It leads to the acceleration of first-order optimization methods via so-called restarts. However, sharpness involves problem-specific constants that are typically unknown, and previous restart schemes often result in reduced convergence rates. Such schemes are also challenging to apply in the presence of noise or approximate model classes (e.g., in compressed sensing or machine learning problems). In this talk, we introduce the notion of approximate sharpness, a generalization of sharpness that incorporates an unknown constant perturbation to the objective function error. By employing a new type of search over the unknown constants, we then describe a restart scheme that applies to general first-order methods. Our scheme achieves the same convergence rate as if the constants were known. Moreover, for broad classes of problems, it gives rates of convergence that either match known optimal rates or improve on previously established rates. Finally, we demonstrate the practical efficacy of this scheme on applications including sparse recovery, compressive imaging, and feature selection in machine learning.
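
For readers less familiar with the terminology, here is a minimal sketch of the two assumptions, in notation assumed for this page rather than taken verbatim from the paper: d(x, X̂) denotes the distance from x to the set of minimizers X̂, f̂ the optimal objective value, and α > 0, β ≥ 1, η ≥ 0 the (typically unknown) constants.

\[
  \alpha \, d(x, \hat{X})^{\beta} \le f(x) - \hat{f}
  \qquad \text{(sharpness)}
\]
\[
  \alpha \, d(x, \hat{X})^{\beta} \le f(x) - \hat{f} + \eta
  \qquad \text{(approximate sharpness)}
\]

Setting η = 0 recovers classical sharpness; a positive η lets the bound degrade gracefully under noise or model mismatch, which is what makes the assumption usable in compressed sensing and machine learning settings.
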
This is joint work with Matthew J. Colbrook (Cambridge) and Maksym Neyra-Nesterenko (SFU). The corresponding paper is available at https://arxiv.org/abs/2301.02268
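
As a rough illustration of how sharpness drives restarts, the following is a minimal Python sketch, not the scheme from the talk: it assumes the constants alpha (α) and beta (β) are known, whereas the talk's scheme removes that requirement via a search over the constants. The helper names (base_method, restarted) and all parameter choices are hypothetical.

import numpy as np

def base_method(f, subgrad, x, radius, n_iters):
    """Fixed-step subgradient method, standing in for a generic
    first-order method whose complexity depends on the distance
    to a minimizer (bounded here by `radius`)."""
    step = radius / np.sqrt(n_iters)  # classical subgradient step size
    best, best_val = x, f(x)
    for _ in range(n_iters):
        g = subgrad(x)
        x = x - step * g / max(np.linalg.norm(g), 1e-12)
        if f(x) < best_val:
            best, best_val = x, f(x)
    return best

def restarted(f, subgrad, x0, alpha, beta, eps0, n_restarts, n_iters=200):
    """Schematic restart loop under sharpness with KNOWN constants.
    Sharpness converts the current objective gap into a distance
    bound, which re-tunes the base method after each restart."""
    x, eps = np.asarray(x0, dtype=float), eps0
    for _ in range(n_restarts):
        # alpha * d(x, X)^beta <= f(x) - f_min <= eps, so a minimizer
        # lies within this radius of the current iterate.
        radius = (eps / alpha) ** (1.0 / beta)
        x = base_method(f, subgrad, x, radius, n_iters)
        eps /= 2.0  # aim to halve the objective gap at each restart
    return x

if __name__ == "__main__":
    f = lambda x: np.linalg.norm(x, 1)  # sharp with alpha = 1, beta = 1
    subgrad = lambda x: np.sign(x)
    x = restarted(f, subgrad, x0=np.ones(5), alpha=1.0, beta=1.0,
                  eps0=5.0, n_restarts=10)
    print(f(x))  # should be near 0

The point of the sketch is the interplay in restarted: each restart halves the target gap, and sharpness turns that gap into a fresh distance estimate for the next run of the base method. The talk's contribution is achieving this without knowing alpha, beta, or the perturbation eta.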