Completed papers:
Towards a Complete Analysis of Langevin Monte Carlo: Beyond Poincaré Inequality (with Alireza Mousavi-Hosseini, Tyler Farghly, Krishna Balasubramanian and Murat A. Erdogdu), to appear in COLT, 2023.
High-dimensional Scaling Limits and Fluctuations of Online Least-squares SGD with Smooth Covariance (with Krishna Balasubramanian and Promit Ghosal), 2023.
Regularized Stein Variational Gradient Flow (with Krishna Balasubramanian, Bharath Sriperumbudur and Jianfeng Lu), 2022.
Mean-square Analysis of Discretized Itô Diffusions for Heavy-tailed Sampling (with Tyler Farghly, Krishna Balasubramanian and Murat A. Erdogdu), submitted to the Journal of Machine Learning, 2022.
An Analysis of Transformed Unadjusted Langevin Algorithm for Heavy-tailed Sampling (with Krishna Balasubramanian and Murat A. Erdogdu), submitted to IEEE Transactions on Information Theory, 2022.
On the Ergodicity, Bias and Asymptotic Normality of Randomized Midpoint Sampling Method (with Krishna Balasubramanian and Murat A. Erdogdu), NeurIPS, 2020.
Work in progress:
Adaptive Langevin Monte Carlo Methods for Heavy-tailed Sampling via Weighted Functional Inequalities (with Tyler Farghly, Jun Yang and Patrick Rebeschini), 2023.
Complexity Analysis of Proximal Samplers for Heavy-tailed Sampling (with Krishna Balasubramanian and Sinho Chewi), 2023.
High-dimensional Scaling Limits for Kernel Ridge Regression (with Krishna Balasubramanian and Xiucai Ding), 2023.
Exponential Ergodicity and Concentration of Stable-driven Diffusions under Fractional Poincaré Inequality (with Krishna Balasubramanian), 2023.