UC Davis Mathematics

Mathematics Colloquia and Seminars

Stochastic methods for nonsmooth nonconvex optimization

Mathematics of Data and Decisions

Speaker: Dmitriy Drusvyatskiy, University of Washington
Related Webpage: https://sites.math.washington.edu/~ddrusv/
Location: 1147 MSB
Start time: Tue, Feb 19 2019, 4:10PM

Stochastic iterative methods lie at the core of large-scale optimization and its modern applications to data science. Though such algorithms are routinely and successfully used in practice on highly irregular problems, few performance guarantees are available outside of smooth or convex settings. In this talk, I will describe a framework for designing and analyzing stochastic methods on a large class of nonsmooth and nonconvex problems, with provable efficiency guarantees. The problem class subsumes such important tasks as phase retrieval, robust PCA, and minimization of risk measures, while the methods include stochastic subgradient, Gauss-Newton, and proximal point iterations. The main idea behind the proposed framework is appealingly intuitive. I will show that a wide variety of stochastic methods can be interpreted as inexact gradient descent on an implicit smoothing of the problem. Optimal learning rates and novel sample-complexity bounds follow quickly from this viewpoint.
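To make the setting concrete, here is a small illustrative sketch (not taken from the talk) of a stochastic subgradient method on one of the nonsmooth, nonconvex problems the abstract mentions: real phase retrieval, minimizing f(x) = (1/m) Σᵢ |⟨aᵢ, x⟩² − bᵢ|. The Polyak-type step size used below is one standard choice for such sharp problems; the data dimensions, initialization scale, and iteration count are all assumptions made for the demo.

```python
import numpy as np

# Hypothetical demo: stochastic subgradient method with Polyak-type steps
# for real phase retrieval,
#     f(x) = (1/m) * sum_i | <a_i, x>^2 - b_i |,
# a nonsmooth, nonconvex problem from the class described in the abstract.

rng = np.random.default_rng(0)
d, m = 10, 50                            # signal dimension, number of measurements
A = rng.standard_normal((m, d))          # measurement vectors a_i (rows of A)
x_star = rng.standard_normal(d)          # ground-truth signal
b = (A @ x_star) ** 2                    # noiseless measurements b_i = <a_i, x*>^2

def loss(x):
    """Full-batch objective f(x)."""
    return np.mean(np.abs((A @ x) ** 2 - b))

x = x_star + 0.1 * rng.standard_normal(d)  # initialize near the true signal
loss0 = loss(x)

for _ in range(1000):
    i = rng.integers(m)                   # sample one measurement
    r = A[i] @ x
    res = r ** 2 - b[i]
    if res == 0:
        continue                          # subgradient may be zero; skip
    g = 2.0 * np.sign(res) * r * A[i]     # subgradient of x -> |<a_i,x>^2 - b_i|
    # Polyak-type step: each per-sample term has minimum value zero
    x = x - (abs(res) / (g @ g)) * g

print(f"initial loss {loss0:.4f} -> final loss {loss(x):.2e}")
```

With a good initialization, this kind of subgradient scheme typically drives the objective rapidly toward zero on noiseless instances; the talk's framework is what supplies rigorous step-size choices and sample-complexity guarantees for such methods.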