# Mathematics Colloquia and Seminars

### Stochastic Algorithms for Large-Scale Machine Learning Problems

Special Events

Speaker: Shiqian Ma, The Chinese Univ. of Hong Kong

Location: 2112 MSB

Start time: Mon, Jan 30 2017, 5:10PM

The stochastic gradient descent (SGD) method and its variants are the
main approaches for solving machine learning problems that involve
large-scale training datasets. This talk addresses two issues in SGD. (i)
One of the major issues in SGD is how to choose the step size while running
the algorithm. Since the traditional line search technique does not apply
to stochastic optimization algorithms, the common practice in SGD is
either to use a diminishing step size or to tune a fixed step size by
hand, which can be time-consuming in practice. We propose to use the
Barzilai-Borwein method to automatically compute step sizes for SGD, and we
report numerical results on classification problems using SVMs and neural networks.