Mathematical Foundations for Data Science (Spring 2019)
Lecture Notes and Slides:
- Lecture 0 (Introduction).
- Lectures 1 and 2 (Curses, Blessings, Surprises, basic probability): here and here.
- Lectures 3 and 4 (PCA and SVD): here.
- Lectures 5, 6, and 7 (Deep Learning): a nice introduction to Deep Learning, well suited for applied mathematicians.
- Lectures 8, 9, and 10 (Clustering, Diffusion maps, PageRank): here and here.
- Lectures 11 and 12 (Johnson-Lindenstrauss and variations): here.
- Lecture 13 (Randomized algorithms): here.
- Lectures 14, 15, and 16 (Compressive sensing and sparsity): here.
Homework:
Relevant papers, software, and interesting links:
Data Sets:
Artificial Intelligence: Issues and Challenges:
General Data Science:
Probability, concentration of measure, random matrices:
Deep Learning/Classification:
Clustering:
Linear dimension reduction:
Randomized Algorithms:
Compressive sensing:
Low-rank matrix recovery and variations: