MARKOV CHAINS AND MIXING TIMES COURSE

The main component in the running time of the MCMC algorithm is the “mixing time” of the underlying Markov chain, i.e., the number of steps one needs to run the chain to approximate the stationary distribution well.
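Concretely, writing $P$ for the transition matrix of the chain and $\pi$ for its stationary distribution, the standard definition (the one used in the book linked below) measures closeness to stationarity in total variation distance:

\[
d(t) \;=\; \max_{x}\,\bigl\lVert P^{t}(x,\cdot) - \pi \bigr\rVert_{\mathrm{TV}},
\qquad
t_{\mathrm{mix}}(\varepsilon) \;=\; \min\{\, t : d(t) \le \varepsilon \,\},
\qquad
t_{\mathrm{mix}} \;:=\; t_{\mathrm{mix}}(1/4).
\]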

Welcome to the webpage of this course on Markov chains and mixing times. The course starts with material from the book https://www.yuval-peres-books.com/markov-chains-and-mixing-times/ and continues with more advanced material based on recent research. As the course progresses, lecture notes and links to lecture videos will be posted here.

Lectures

Last uploaded: Lecture 2 (Week 10)
Week 1: Basic definitions and examples

1.1.  Why Markov Chains?

1.2. Random Mapping Representation 

1.3. Irreducibility and Aperiodicity 

1.4. Random Walks on Graphs 

1.5. Stationary Distributions 

1.6. Reversibility  

1.7. Eulerian graphs
1.8. Harmonic functions

1.9. Gambler's Ruin

Week 2: Mixing, coupling and stationary times

2.1. Time Reversals

2.2. Coupon Collecting

2.3. Total Variation Distance and coupling

2.4. The Convergence Theorem

2.5. Mixing in total variation
2.6. Mixing via coupling

2.7. The Cycle and the Hypercube
2.8. Stationary Times
2.9. Strong Stationary Times and separation distance 

Week 3: Separation distance and lower bounds

3.1. Top-to-Random Shuffle

3.2. Optimal strong stationary times

3.3. Comparing separation distance and total variation

3.4. Birth and death chains

3.5. A Markov chain from linear algebra
3.6. Counting and diameter lower bounds 
3.7. Bottleneck Ratio 

Week 4: Riffle shuffles, Random Walks and electrical networks

4.1. Riffle shuffles
4.2. Networks and Reversible Markov Chains

4.3. Voltages and Current Flows 

4.4. Computing effective resistance
4.5. Thomson's principle
4.6. Nash-Williams inequality
4.7. Commute Time

Week 5: From hitting times to Cover Times

5.1. Bounding Mixing Times via Hitting Times
5.2. Eigenvalues

5.3. Matthews' bounds for cover time
5.4. Varopoulos-Carne long range inequality

5.5. A stronger diameter lower bound for mixing time

Week 6: Eigenvalues and path coupling

6.1. The Spectral Representation of a Reversible Transition Matrix 

6.2. The Relaxation Time

6.3. The Dirichlet Form and the Bottleneck Ratio 

6.4. Comparison of Markov Chains 

6.5. Expander Graphs 
6.6. The Transportation metric
6.7. Path Coupling  
6.8. The Ising model 

Week 7: The Ising Model

7.1. A topological obstruction for the Ising model
7.2. Open problem: Monotonicity of relaxation time
7.3. Monotonicity of top eigenfunction
7.4. Harris' inequality
7.5. Positive correlations for the Ising model

Week 8: The Transportation Metric and Path Coupling

8.2. Rapid Mixing for Colorings 

8.3. Approximate Counting 

8.4. Curvature of Markov chains

Week 9: Simulation and perfect sampling

9.1. Monte Carlo sampling

9.2. Unbiasing random bits

9.3. Glauber and Metropolis dynamics

9.4. Coupling from the past

9.5. Sampling uniform spanning trees

Week 10: The Ising Model

10.1. Fast Mixing at High Temperature 

10.2. The Complete Graph 

10.3. The Cycle 

10.4. The Tree 

10.5. Block Dynamics 

10.6. Lower Bound for Ising on Square 

Week 11: The Cutoff Phenomenon and continuous time chains

11.1. The hypercube versus the cycle

11.2. The product condition and the Aldous-Pak examples

11.3. Equivalence of continuous time chains and lazy chains.

11.4. Birth-death chains and cutoff on trees

11.5. Ramanujan graphs

Week 12: Research directions

12.1. Universal lower bound for Ising, open for Potts

12.2. Swendsen-Wang Dynamics: the exponent quarter conjecture

12.3. Monotone subsets of the cube: The Ding-Mossel Theorem

12.4. Random walks on invertible matrices modulo 2

12.5. Random walk on percolation clusters and dynamical percolation

Yuval Peres

About Yuval Peres

Yuval Peres is a mathematician working in probability theory and its interface with computing, PDE, game theory, and machine learning.
He has coauthored six books, available at https://www.yuval-peres-books.com/, and mentored more than twenty PhD students (see https://www.genealogy.math.ndsu.nodak.edu/id.php?id=22523). Many of these students are leading researchers at universities such as MIT, Toronto, Tel Aviv, Peking University, the University of Oregon, and T.U. Munich.
Yuval's best-known results concern fine properties of random walks and Brownian motion, especially cover times and thick points, as well as fundamental results for random walks on Galton-Watson trees and random graphs. He received the Rollo Davidson Prize and the Loève Prize, and is an international member of the US National Academy of Sciences.