Mathematics Colloquia and Seminars
Invariance in First-Order Convex Optimization
Mathematics of Data and Decisions

Speaker: Jelena Diakonikolas, UC Berkeley
Start time: Tue, Feb 5 2019, 4:10PM
First-order methods with optimal worst-case iteration complexities for various classes of convex optimization problems have been known for decades. However, their standard analysis offers little intuition, which makes generalizing them to new problems arising in modern data analysis applications challenging.

In this talk, I will present a novel, general, and unifying framework for the analysis of first-order methods. The framework relies on the construction of a primal-dual gap and can be likened to Lagrangian mechanics: continuous-time methods and their convergence analysis follow directly from enforcing a certain invariance, and the corresponding discrete-time methods are then obtained by applying different discretization schemes. This stands in contrast with standard approaches, which can be thought of as analogous to Newtonian mechanics: an action (i.e., a predetermined optimization method) is applied to a system, and its behavior (convergence) is then analyzed. To illustrate the power and generality of our framework, I will show that it recovers a wide range of known first-order methods in a unifying manner, and that it also yields novel methods that outperform existing ones on problems arising in machine learning applications. Based on joint work with Lorenzo Orecchia.
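As a minimal sketch of the continuous-to-discrete idea the abstract alludes to (this is a standard textbook illustration, not the speaker's framework): the gradient-flow ODE dx/dt = -∇f(x) decreases a convex objective f along its trajectory, and a forward-Euler discretization with step size eta recovers plain gradient descent. The objective f(x) = x²/2 below is a hypothetical toy example.

```python
def grad_f(x):
    # Gradient of the toy objective f(x) = 0.5 * x**2 (hypothetical example).
    return x

def gradient_descent(x0, eta=0.1, steps=100):
    # Forward-Euler discretization of the gradient flow dx/dt = -grad_f(x):
    #   x_{k+1} = x_k - eta * grad_f(x_k)
    x = x0
    for _ in range(steps):
        x = x - eta * grad_f(x)
    return x

# Starting from x0 = 5.0, the iterates contract toward the minimizer x* = 0.
x_final = gradient_descent(5.0)
```

Other discretizations of the same (or an accelerated) continuous-time dynamic yield different discrete methods, which is the sense in which one continuous-time analysis can generate a family of algorithms.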