CSE 788.01 - Topics in Randomized Algorithms
In this course we will gain insight into one of the greatest mysteries of
computer science: Does randomization help? Randomization frequently
makes algorithms faster or simpler. We will study the basic
ingredients in the analysis of randomized algorithms, see some of the
important examples and discuss select recent developments in
randomized algorithms, with an aim towards understanding and working on
some research questions. Topics will include the complexity of
randomized algorithms, probabilistic inequalities, Markov chain Monte
Carlo and lower bound techniques. Examples will include data
analysis (the matrix approximation problem), sampling and counting of
combinatorial and geometric structures, integration and convex
optimization.
Grading:
Depending on enrollment and interest, grades will be based on a
combination of some (but not necessarily all) of the following:
homework, paper presentations and a research project (definitely no exams).
Recommended readings:
Helpful courses (NOT required):
A course about algorithms (such as 780) and a course about the theory
of computation (such as 725).
Topics (tentative, depending on interest and background):
Randomized algorithms, basic paradigms, basic examples: primality
testing, MIN-CUT. Analysis.
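For a taste of the MIN-CUT example, here is a minimal Python sketch of Karger's randomized contraction algorithm. The function name and the default number of trials are illustrative, not canonical; roughly n^2 log n independent trials are needed for the textbook high-probability guarantee.

```python
import random

def karger_min_cut(edges, n, trials=200):
    """Estimate the minimum cut of a connected multigraph with vertices
    0..n-1 by repeated random edge contraction (Karger's algorithm)."""
    best = len(edges)  # trivial upper bound on any cut
    for _ in range(trials):
        parent = list(range(n))  # union-find over contracted vertices

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x

        # Contracting edges in a uniformly random order is equivalent
        # to picking a uniformly random surviving edge at each step.
        order = edges[:]
        random.shuffle(order)
        components = n
        for u, v in order:
            if components == 2:
                break
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv  # contract the edge
                components -= 1
        # Edges still crossing the two super-vertices form a cut.
        cut = sum(1 for u, v in edges if find(u) != find(v))
        best = min(best, cut)
    return best
```

Each trial succeeds with probability at least 2/(n(n-1)), which is why independent repetition is essential to the analysis.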
Complexity of randomized algorithms: RP, coRP, BPP, ZPP. BPP in P/poly
(non-uniform derandomization).
Lower bounds. Yao's minimax principle.
Probabilistic tools. Concentration inequalities: Markov, Chebyshev,
Chernoff. Examples. Azuma? Talagrand? McDiarmid?
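To see why Chernoff-type bounds matter, the following sketch (a hypothetical helper, not course code) compares the Chebyshev and Chernoff-Hoeffding tail bounds for a sum of n fair coin flips:

```python
import math

def tail_bounds(n, t):
    """Upper bounds on P(S_n >= n/2 + t) for S_n a sum of n fair
    coin flips. Var(S_n) = n/4, so Chebyshev gives n / (4 t^2);
    the one-sided Chernoff-Hoeffding bound gives exp(-2 t^2 / n)."""
    chebyshev = n / (4.0 * t * t)
    chernoff = math.exp(-2.0 * t * t / n)
    return chebyshev, chernoff
```

For n = 1000 and deviation t = 100, Chebyshev gives 0.025 while the Chernoff-Hoeffding bound gives about 2e-9: polynomial versus exponential decay in the deviation.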
Counting and Sampling:
#P, Markov chains, MCMC, approximation of the permanent.
Geometric random walks. Randomized algorithms for
sampling, integration, optimization, centroid, covariance.
Isoperimetric inequalities.
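As a toy illustration of MCMC on a finite state space (much simpler than the geometric walks above), here is a minimal Metropolis sampler; the function name and the uniform proposal are illustrative choices:

```python
import random

def metropolis_sample(weights, steps, seed=0):
    """Metropolis chain on {0..k-1} with stationary distribution
    pi(i) proportional to weights[i]. Proposal: a uniformly random
    state; accept with probability min(1, pi(j)/pi(i)). Returns the
    visit counts after `steps` moves."""
    rng = random.Random(seed)
    k = len(weights)
    counts = [0] * k
    state = 0
    for _ in range(steps):
        proposal = rng.randrange(k)
        # Symmetric proposal, so the Metropolis ratio is just
        # pi(proposal) / pi(state); detailed balance follows.
        if rng.random() < min(1.0, weights[proposal] / weights[state]):
            state = proposal
        counts[state] += 1
    return counts
```

Running the chain long enough, the empirical frequencies approach the target distribution; quantifying "long enough" (mixing time) is exactly what the MCMC analysis in this course addresses.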
Introduction to the probabilistic method.
Shannon's theorem. Error-correcting codes.
Derandomization. Example: Kashin decomposition
(non-uniformity revisited). Connection with compressed sensing?
Expander graphs. Randomized and explicit constructions. The zig-zag
product.
Undirected connectivity in L.
Random projection?
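A minimal sketch of random projection in the Johnson-Lindenstrauss spirit (names are illustrative): multiply by a random Gaussian matrix scaled by 1/sqrt(k), so that squared lengths, and hence pairwise distances, are preserved in expectation, and up to a factor (1 +/- eps) with high probability once k = O(log(n) / eps^2).

```python
import math
import random

def random_projection(points, k, seed=0):
    """Map d-dimensional points to k dimensions via a k x d matrix
    with independent N(0, 1/k) entries."""
    rng = random.Random(seed)
    d = len(points[0])
    R = [[rng.gauss(0, 1) / math.sqrt(k) for _ in range(d)]
         for _ in range(k)]
    # Each output coordinate is one row of R dotted with the point.
    return [[sum(row[j] * p[j] for j in range(d)) for row in R]
            for p in points]
```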
Numerical linear algebra. SVD. Matrix approximation. Column subset
selection.
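The connection between the SVD and matrix approximation can be previewed with a sketch of power iteration (names and the iteration count are illustrative): the top right singular vector yields the best rank-1 approximation in Frobenius norm by the Eckart-Young theorem, and deflating and repeating extends this to rank k.

```python
import math
import random

def top_right_singular_vector(A, iters=100, seed=0):
    """Approximate the top right singular vector of the m x n matrix A
    (a list of row lists) by power iteration on A^T A."""
    rng = random.Random(seed)
    m, n = len(A), len(A[0])
    v = [rng.gauss(0, 1) for _ in range(n)]  # random start vector
    for _ in range(iters):
        # w = A v, then v = A^T w, then normalize: one step of
        # power iteration on A^T A.
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(m)]
        v = [sum(A[i][j] * w[i] for i in range(m)) for j in range(n)]
        norm = math.sqrt(sum(x * x for x in v))
        v = [x / norm for x in v]
    return v
```

Convergence is geometric in the ratio of the top two squared singular values, which is why randomized methods (as in several of the papers below) aim to accelerate or sidestep this iteration.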
Property testing?
Papers/topics for presentation (tentative):
- A Constructive Proof of the General Lovász Local Lemma. Robin A. Moser, Gábor Tardos (2009). arXiv:0903.0544
http://blog.computationalcomplexity.org/2009/06/kolmogorov-complexity-proof-of-lov.html
http://en.wikipedia.org/wiki/Algorithmic_Lovász_local_lemma
Also, Navin's paper.
- A Simple Construction of Almost-Euclidean Subspaces of ℓ_1^N via Tensor Products. Piotr Indyk, Stanislaw Szarek.
http://people.csail.mit.edu/indyk/neumann.pdf
- Fast Computation of Low Rank Matrix Approximations. Dimitris Achlioptas, Frank McSherry.
- Graph Sparsification by Effective Resistances. Daniel A. Spielman, Nikhil Srivastava.
- Twice-Ramanujan Sparsifiers. Joshua Batson, Daniel A. Spielman, Nikhil Srivastava.
- Improved Approximation Algorithms for Large Matrices via Random Projections. Tamás Sarlós.
- Finding Structure with Randomness: Stochastic Algorithms for Constructing Approximate Matrix Decompositions. Nathan Halko, Per-Gunnar Martinsson, Joel A. Tropp.
- An Improved Approximation Algorithm for the Column Subset Selection Problem. Christos Boutsidis, Michael W. Mahoney, Petros Drineas.
- Matrix Approximation and Projective Clustering via Volume Sampling. Amit Deshpande, Luis Rademacher, Santosh Vempala, Grant Wang. (adaptive sampling)
- Adaptive Sampling and Fast Low-Rank Approximation. Amit Deshpande, Santosh Vempala.
- A New Approach to Strongly Polynomial Linear Programming. Mihaly Barasz, Santosh Vempala.
- Random Walks on Polytopes and an Affine Interior Point Method for Linear Programming. Ravi Kannan, Hariharan Narayanan.
- A Polynomial Number of Random Points Does Not Determine the Volume of a Convex Body. Ronen Eldan.
- Solving Convex Programs by Random Walks. D. Bertsimas, S. Vempala.
- Learning Geometric Concepts via Gaussian Surface Area. A. Klivans, R. O'Donnell, R. Servedio.
- A paper on computational complexity.
- Some of the topics in the list of class topics.
(Kelner-Madry, Farbod's paper on random spanning trees, volume-preserving random projection, Gittens-Tropp, Deshpande-Vempala, Kashin's paper on compressive sensing)
Topics by lecture (tentative):