MARKOV CHAINS AND MIXING TIMES COURSE
The main component in the running time of an MCMC algorithm is the “mixing time” of the underlying Markov chain, i.e., the number of steps one needs to run the chain to approximate its stationary distribution well.
Welcome to the webpage of this course on Markov chains and mixing times. The course starts with material from the book https://www.yuval-peres-books.com/markov-chains-and-mixing-times/ and will continue with more advanced material based on recent research. As the course progresses, we will include here lecture notes and links to lecture videos.
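As a quick illustration of the definition above, the short Python sketch below computes, for the lazy simple random walk on an n-cycle, the first time t at which the worst-case total variation distance to the uniform stationary distribution drops below 1/4 (the usual convention for the mixing time). The choice of example chain and the function names are illustrative assumptions made for this sketch; it is not part of the course materials.

```python
import numpy as np

def lazy_cycle_walk(n):
    """Transition matrix of the lazy simple random walk on the n-cycle:
    stay put with probability 1/2, otherwise move to a uniform neighbor."""
    P = np.zeros((n, n))
    for i in range(n):
        P[i, i] = 0.5
        P[i, (i + 1) % n] += 0.25
        P[i, (i - 1) % n] += 0.25
    return P

def mixing_time(P, eps=0.25, max_steps=100_000):
    """Smallest t with max_x ||P^t(x, .) - pi||_TV <= eps.
    Here pi is uniform, which is stationary for the lazy cycle walk."""
    n = P.shape[0]
    pi = np.full(n, 1.0 / n)
    Pt = np.eye(n)
    for t in range(1, max_steps + 1):
        Pt = Pt @ P
        # total variation distance from the worst starting state
        d = 0.5 * np.abs(Pt - pi).sum(axis=1).max()
        if d <= eps:
            return t
    return None

if __name__ == "__main__":
    # the mixing time of the lazy cycle walk grows on the order of n^2
    for n in (8, 16, 32):
        print(n, mixing_time(lazy_cycle_walk(n)))
```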
Lectures
2.1. Time Reversals
2.2. Coupon Collecting
2.3. Total Variation Distance and coupling
2.4. The Convergence Theorem
2.5. Mixing in total variation
2.6. Mixing via coupling
2.7. The cycle and the hypercube
2.8. Stationary Times
2.9. Strong Stationary Times and separation distance
3.1. Top-to-Random Shuffle
3.2. Optimal strong stationary times
3.3. Comparing separation distance and total variation
3.4. Birth and death chains
3.5. A Markov chain from linear algebra
3.6. Counting and diameter lower bounds
3.7. Bottleneck Ratio
6.1. The Spectral Representation of a Reversible Transition Matrix
6.2. The Relaxation Time
6.3. The Dirichlet Form and the Bottleneck Ratio
6.4. Comparison of Markov Chains
6.5. Expander Graphs
6.6. The Transportation metric
6.7. Path Coupling
6.8. The Ising model
11.1. The hypercube versus the cycle
11.2. The product condition and the Aldous and Pak examples
11.3. Equivalence of continuous-time chains and lazy chains
11.4. Birth and death chains and cutoff on trees
11.5. Ramanujan graphs
12.1. Universal lower bound for Ising, open for Potts
12.2. Swendsen-Wang Dynamics: the exponent quarter conjecture
12.3. Monotone subsets of the cube: The Ding-Mossel Theorem
12.4. Random walks on invertible matrices modulo 2
12.5. Random walk on percolation clusters and dynamical percolation

About Yuval Peres
Yuval Peres is a mathematician working in probability theory and its interface with computing, PDE, game theory, and machine learning.
He has coauthored six books, available at https://www.yuval-peres-books.com/, and mentored more than twenty PhD students (see https://www.genealogy.math.ndsu.nodak.edu/id.php?id=22523). Many of these students are now leading researchers at universities such as MIT, Toronto, Tel Aviv, Peking University, the University of Oregon, and TU Munich.
Yuval's best-known results concern fine properties of random walks and Brownian motion, especially cover times and thick points, as well as fundamental results on random walks on Galton-Watson trees and random graphs. He received the Rollo Davidson Prize and the Loève Prize, and is an international member of the US National Academy of Sciences.