CSE 788.01 - Topics in Randomized Algorithms
In this course we will gain insight into one of the greatest mysteries of
computer science: does randomization help? Randomization frequently
makes algorithms faster or simpler. We will study the basic
ingredients in the analysis of randomized algorithms, see some of the
important examples and discuss select recent developments in
randomized algorithms, with an aim towards understanding and working on
some research questions. Topics will include the complexity of
randomized algorithms, probabilistic inequalities, Markov chain Monte
Carlo and lower bound techniques. Among the examples: numerical
analysis (the matrix approximation problem), sampling and counting of
combinatorial and geometric structures, and integration and convex optimization.
Depending on the number of students and interest, the coursework will
be a combination of (some, but not necessarily all, of) homework,
paper presentations, and a research project (definitely no exams).
Helpful courses (NOT required):
A course about algorithms (such as 780) and a course about the theory
of computation (such as 725).
Topics (tentative, depending on interest and background):
Randomized algorithms, basic paradigms, basic examples: primality
testing, MIN-CUT. Analysis.
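As a taste of the MIN-CUT example, here is a minimal sketch of Karger's random-contraction algorithm (contracting edges in a uniformly random order is equivalent to Karger's edge-by-edge contraction); the graph encoding and trial count are illustrative assumptions, not part of the course materials:

```python
import random

def karger_min_cut(edges, n, trials=200):
    """Estimate the min cut of a graph by repeated random contraction.

    edges: list of (u, v) pairs over vertices 0..n-1 (assumed connected).
    Each trial finds the minimum cut with probability >= 2/(n*(n-1)),
    so O(n^2 log n) independent trials succeed with high probability;
    'trials' here is just an illustrative default.
    """
    best = len(edges) + 1
    for _ in range(trials):
        parent = list(range(n))

        def find(x):  # union-find with path compression
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        components = n
        pool = edges[:]
        random.shuffle(pool)  # random order = random edge contractions
        for u, v in pool:
            if components == 2:
                break
            ru, rv = find(u), find(v)
            if ru != rv:  # contract edge (u, v); self-loops are skipped
                parent[ru] = rv
                components -= 1
        # Edges crossing the two remaining supernodes form a cut.
        cut = sum(1 for u, v in edges if find(u) != find(v))
        best = min(best, cut)
    return best
```

On two triangles joined by a single bridge edge, repeated trials recover the cut of size 1 with overwhelming probability.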
Complexity of randomized algorithms. RP, coRP, BPP, ZPP. BPP in P/poly.
Lower bounds. Yao's min-max principle.
Probabilistic tools. Concentration inequalities: Markov, Chebyshev,
Chernoff. Examples. Azuma? Talagrand? McDiarmid?
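For a feel of the concentration inequalities above, here is a small numerical check of the Chernoff-Hoeffding bound P(X >= (p+t)n) <= exp(-2 n t^2) for a sum X of n i.i.d. Bernoulli(p) trials; the parameters are illustrative assumptions:

```python
import math
import random

def chernoff_upper(n, p, t):
    """Chernoff-Hoeffding tail bound for a Binomial(n, p) variable X:
    P(X >= (p + t) * n) <= exp(-2 * n * t^2)."""
    return math.exp(-2 * n * t * t)

def empirical_tail(n, p, t, trials=1000):
    """Monte Carlo estimate of P(X >= (p + t) * n) for comparison."""
    thresh = (p + t) * n
    hits = sum(
        sum(random.random() < p for _ in range(n)) >= thresh
        for _ in range(trials)
    )
    return hits / trials
```

The empirical tail should sit below the bound; the bound is not tight, but it decays exponentially in n, which is the point of the lecture topic.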
Counting and Sampling:
#P, Markov chains, MCMC, approximation of permanent.
Geometric random walks. Randomized algorithms for
sampling, integration, optimization, centroid, covariance.
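A minimal illustration in the spirit of the MCMC topic: a Metropolis chain on a path graph whose stationary distribution is proportional to given weights. The weights, step count, and seeding are illustrative assumptions, not course material:

```python
import random

def metropolis_sample(weights, steps, seed=0):
    """Metropolis chain on states {0, ..., n-1} targeting
    pi(i) proportional to weights[i].

    Proposal: move to a uniform random neighbor on the path; out-of-range
    proposals are rejected (the chain stays put), which keeps the proposal
    symmetric. Acceptance probability min(1, pi(y)/pi(x)) makes the chain
    reversible with stationary distribution pi. Returns empirical visit
    frequencies, which converge to pi as steps grows.
    """
    rng = random.Random(seed)
    n = len(weights)
    x = 0
    counts = [0] * n
    for _ in range(steps):
        y = x + rng.choice([-1, 1])
        if 0 <= y < n and rng.random() < min(1.0, weights[y] / weights[x]):
            x = y
        counts[x] += 1
    return [c / steps for c in counts]
```

With weights (1, 2, 3, 4) the visit frequencies approach (0.1, 0.2, 0.3, 0.4); the same mechanism, on continuous state spaces, underlies the geometric random walks used for sampling and integration.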
Introduction to the probabilistic method
Shannon's theorem. Error correcting codes.
Derandomization. Example: Kashin decomposition
(non-uniformity revisited). Connection with compressed sensing?
Randomized and explicit constructions. The zig-zag product.
Undirected connectivity in L.
Numerical linear algebra. SVD. Matrix approximation. Column subset selection.
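The matrix approximation topic can be previewed with a sketch of the randomized range-finder idea (in the style of the Halko-Martinsson-Tropp survey listed below); NumPy and the oversampling parameter are my assumptions for the illustration:

```python
import numpy as np

def randomized_low_rank(A, k, oversample=5, seed=0):
    """Rank-k approximation of A via a Gaussian sketch.

    Multiplying A by a random Gaussian test matrix captures its dominant
    column space with high probability; orthonormalizing and projecting
    reduces the problem to an SVD of a small (k + oversample) x n matrix.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    G = rng.standard_normal((n, k + oversample))  # random test matrix
    Q, _ = np.linalg.qr(A @ G)          # orthonormal basis for range(A @ G)
    B = Q.T @ A                          # small projected matrix
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    # Truncate to rank k and map back to the original coordinates.
    return (Q @ U[:, :k]) * s[:k] @ Vt[:k, :]
```

When A has exact rank k, the sketch recovers A up to numerical error; for general A, the survey gives expected-error bounds relative to the best rank-k approximation.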
Papers/topics for presentation (tentative):
- A paper on random spanning trees, volume-preserving random
projection, Gittens-Tropp, Deshpande-Vempala, or Kashin's paper on
compressive sensing.
- A Constructive Proof of the General Lovász Local Lemma. Robin
A. Moser, Gábor Tardos. arXiv:0903.0544 (2009).
Also, Navin's paper.
- A Simple Construction of Almost-Euclidean Subspaces of ℓ_1^N via
Tensor Products. Piotr Indyk, Stanislaw Szarek.
- Fast Computation of Low Rank Matrix Approximations. Dimitris
Achlioptas, Frank McSherry.
- Graph Sparsification by Effective Resistances. Daniel A. Spielman,
Nikhil Srivastava.
- Twice-Ramanujan Sparsifiers. Joshua Batson, Daniel A. Spielman,
Nikhil Srivastava.
- Improved Approximation Algorithms for Large Matrices via Random
Projections. Tamás Sarlós.
- Finding Structure with Randomness: Stochastic Algorithms for
Constructing Approximate Matrix Decompositions. Nathan Halko,
Per-Gunnar Martinsson, Joel A. Tropp.
- An Improved Approximation Algorithm for the Column Subset Selection
Problem. Christos Boutsidis, Michael W. Mahoney, Petros Drineas.
- Matrix Approximation and Projective Clustering via Volume
Sampling. Amit Deshpande, Luis Rademacher, Santosh Vempala, Grant Wang.
- Adaptive Sampling and Fast Low-Rank Approximation. Amit Deshpande, Santosh Vempala.
- A New Approach to Strongly Polynomial Linear Programming. Mihaly
Barasz and Santosh Vempala.
- Random walks on polytopes and an affine interior point method for
linear programming. Ravi Kannan, Hariharan Narayanan.
- A Polynomial Number of
Random Points does not Determine the Volume of a Convex Body. Ronen Eldan.
- Solving Convex Programs by Random Walks. D. Bertsimas, S. Vempala.
- Learning Geometric Concepts via Gaussian Surface Area. A. Klivans, R.
O'Donnell, R. Servedio.
- A paper on computational complexity.
- Some of the topics in the list of class topics.
Topics by lecture (tentative):