Mathematics Colloquia and Seminars

Applied Differential Geometry and Harmonic Analysis in Deep Learning Regularization

Mathematics of Data & Decisions

Speaker: Wei Zhu, UMass Amherst
Related Webpage: https://ucdavis.zoom.us/j/99584782532
Location: Zoom
Start time: Tue, May 4 2021, 12:31PM

With the explosive production of digital data and information, data-driven methods, deep neural networks (DNNs) in particular, have revolutionized machine learning and scientific computing by gradually outperforming traditional hand-crafted, model-based algorithms. While DNNs have proved very successful when large training sets are available, they typically have two shortcomings: first, when training data are scarce, DNNs tend to overfit; second, the generalization ability of overparameterized DNNs remains a mystery despite many recent efforts. In this talk, I will discuss two works that “inject” a “modeling” flavor back into deep learning to improve the generalization performance and interpretability of DNNs. This is accomplished by regularizing deep learning through applied differential geometry and harmonic analysis.

In the first part of the talk, I will explain how to improve the regularity of the DNN representation by imposing a “smoothness” inductive bias on the DNN model. This is achieved by solving a variational problem with a low-dimensionality constraint on the data-feature concatenation manifold.

In the second part, I will discuss how to impose scale-equivariance on the network representation by conducting joint convolutions across space and the scaling group. The stability of the equivariant representation to nuisance input deformations is also proved under mild assumptions on the Fourier-Bessel norm of the filter expansion coefficients.
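To make the first part a bit more concrete, the following is a hedged sketch (in LaTeX) of the kind of variational problem being described; the notation is illustrative and not necessarily the exact objective used in the talk. Here ℓ(θ) is the empirical training loss of the network f_θ, the points (x_i, f_θ(x_i)) are the data-feature concatenations, M is the manifold they are constrained to lie on, and λ weights the low-dimensionality penalty:

    \min_{\theta,\,\mathcal{M}} \;
        \ell(\theta) \;+\; \frac{\lambda}{|\mathcal{M}|}
        \int_{\mathcal{M}} \dim\!\bigl(\mathcal{M}(p)\bigr)\,\mathrm{d}p
    \qquad \text{subject to} \qquad
        \bigl\{\bigl(x_i, f_\theta(x_i)\bigr)\bigr\}_{i=1}^{N} \subset \mathcal{M},

    % the pointwise dimension can be written through the coordinate
    % functions \alpha_j of the embedding:
    \dim\bigl(\mathcal{M}(p)\bigr) \;=\; \sum_{j} \bigl\lVert \nabla_{\mathcal{M}}\,\alpha_j(p) \bigr\rVert^2 .

Written this way, the dimension penalty becomes a Dirichlet-type energy of the coordinate functions, which is one way a low-dimensionality constraint turns into a “smoothness” inductive bias.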
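For the second part, the sketch below illustrates, under simplifying assumptions, what a joint space/scale convolution can look like: one base filter is resampled to a few scales (a hypothetical discretization of the scaling group) and convolved with the input, producing a feature map with an extra scale axis. The function name scale_lifting_conv and the particular scales are made up for illustration; the talk's construction expands filters in a Fourier-Bessel basis and proves deformation stability in terms of the expansion coefficients, none of which is reproduced here.

    import torch
    import torch.nn.functional as F

    def scale_lifting_conv(x, weight, scales=(1.0, 2 ** 0.5, 2.0)):
        """Minimal sketch of a lifting convolution onto the scaling group:
        convolve the input with rescaled copies of a base filter and stack
        the responses along a new scale axis.
        x: (batch, in_ch, H, W); weight: (out_ch, in_ch, k, k)."""
        responses = []
        for s in scales:
            # Resample the base filter to scale s. Bilinear interpolation is a
            # simple stand-in for the Fourier-Bessel filter expansion in the talk.
            k = max(3, int(round(weight.shape[-1] * s)))
            if k % 2 == 0:
                k += 1  # keep the kernel odd so padding=k // 2 preserves H x W
            w_s = F.interpolate(weight, size=(k, k), mode="bilinear", align_corners=False)
            w_s = w_s / s ** 2  # renormalize so responses are comparable across scales
            responses.append(F.conv2d(x, w_s, padding=k // 2))
        # Output shape: (batch, n_scales, out_ch, H, W)
        return torch.stack(responses, dim=1)

    # Usage: lift a batch of grayscale images with four 5x5 filters over three scales.
    x = torch.randn(2, 1, 64, 64)
    w = torch.randn(4, 1, 5, 5)
    print(scale_lifting_conv(x, w).shape)  # torch.Size([2, 3, 4, 64, 64])

Stacking responses over scales is what makes the representation (approximately) scale-equivariant: rescaling the input shifts features along the scale axis rather than destroying them.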