
Markov Chain Mixing

Source: 09-22

Time: 15:20 - 16:55, Wed/Fri, 10/12/2022 - 1/6/2023

Venue: 1129B; Online: Zoom 482 240 1589, PW: BIMSA

Lecturer: Yuval Peres (Research Fellow)

Record: Yes

Level: Graduate

Language: English


Prerequisite

Undergraduate probability and linear algebra, plus some measure-theoretic probability: conditional expectation and laws of large numbers. The first chapter of Durrett’s graduate text, or the course “Probability 1” given by Prof. Hao Wu at Tsinghua, will provide ample background. That course can be taken in parallel with this one.

The textbook for the course is “Markov Chains and Mixing Times”, second edition; a PDF is available from https://www.yuval-peres-books.com/markov-chains-and-mixing-times/

In the last part of the course, we will see new material that is not in that book, and explore open problems and directions of current research.


Abstract

Background: The modern theory of Markov chain mixing is the result of the convergence, in the 1980s and 1990s, of several threads:

For statistical physicists, Markov chains became useful in Monte Carlo simulation; the mixing time determines the running time of a simulation. Deep connections were found between rapid mixing and spatial properties of spin systems. In theoretical computer science, Markov chains play a key role in sampling and approximate counting algorithms.
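The role of the mixing time can already be seen in a toy example. The sketch below (an illustration only, not taken from the course materials) computes the total variation distance between the t-step distribution of a lazy random walk on a cycle and its uniform stationary distribution:

```python
import numpy as np

def lazy_cycle_walk(n):
    """Transition matrix of the lazy random walk on the n-cycle:
    hold with probability 1/2, else step to a uniform neighbor."""
    P = np.zeros((n, n))
    for i in range(n):
        P[i, i] = 0.5
        P[i, (i + 1) % n] = 0.25
        P[i, (i - 1) % n] = 0.25
    return P

def tv_to_uniform(P, t, start=0):
    """Total variation distance between the t-step distribution
    (started from `start`) and the uniform stationary distribution."""
    n = P.shape[0]
    dist = np.linalg.matrix_power(P, t)[start]
    return 0.5 * np.abs(dist - 1.0 / n).sum()

P = lazy_cycle_walk(12)
for t in (1, 50, 500):
    # The distance shrinks as t grows; the mixing time is the first t
    # at which it drops below 1/4.
    print(t, tv_to_uniform(P, t))
```

For the n-cycle this distance falls below 1/4 after order n^2 steps, which is exactly the kind of quantitative question the course studies.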

At the same time, mathematicians were intensively studying random walks on groups. Both spectral methods and probabilistic techniques, such as coupling, played important roles. Ingenious constructions of expander graphs (on which random walks mix especially fast) were found. The connection between eigenvalues and expansion properties was first discovered in differential geometry, but then became central to the study of Markov chain mixing.
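That eigenvalue-expansion connection is easy to observe numerically. The following sketch (illustrative only, not course code) compares the spectral gap of the lazy walk on a cycle, a poor expander, with that on a complete graph:

```python
import numpy as np

def cycle_adjacency(n):
    """Adjacency matrix of the n-cycle."""
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i + 1) % n] = A[i, (i - 1) % n] = 1.0
    return A

def complete_adjacency(n):
    """Adjacency matrix of the complete graph K_n."""
    return np.ones((n, n)) - np.eye(n)

def lazy_walk(A):
    """Lazy simple random walk on a d-regular graph: P = (I + A/d) / 2."""
    d = A.sum(axis=1)[0]
    return 0.5 * (np.eye(len(A)) + A / d)

def spectral_gap(P):
    """1 minus the second-largest eigenvalue; for a regular graph P is
    symmetric, so eigvalsh returns its (real) spectrum."""
    eigs = np.sort(np.linalg.eigvalsh(P))
    return 1.0 - eigs[-2]

n = 20
print("cycle:   ", spectral_gap(lazy_walk(cycle_adjacency(n))))
print("complete:", spectral_gap(lazy_walk(complete_adjacency(n))))
```

The gap of the cycle decays like 1/n^2, while the complete graph's stays bounded away from zero, reflecting how much faster the walk mixes on a good expander.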


Reference

https://www.yuval-peres-books.com/markov-chains-and-mixing-times/



Syllabus

Week 1: Basic Definitions and Examples

1.1.  Why Markov Chains?

1.2. Random Mapping Representation  

1.3. Irreducibility and Aperiodicity  

1.4. Random Walks on Graphs  

1.5. Stationary Distributions  

1.6. Reversibility and Time Reversals  

1.7. Gambler's Ruin  

1.8. Coupon Collecting  

1.9. The Cycle and the Hypercube

1.10. The Pólya Urn Model


Week 2: Mixing, Coupling and Stationary Times

2.1. Total Variation Distance and Coupling

2.2. The Convergence Theorem

2.3. Mixing in Total Variation

2.4. Uniform Mixing and Mixing in Other Norms

2.5. Grand Couplings

2.6. Top-to-Random Shuffle

2.7. Stationary Times

2.8. Strong Stationary Times and Separation Distance

2.9. Stationary Times and Cesàro Mixing Time

2.10. Optimal Strong Stationary Times


Week 3: Lower Bounds and the Symmetric Group

3.1. Counting and Diameter Bounds  

3.2. Bottleneck Ratio  

3.3. Distinguishing Statistics  

3.4. The Symmetric Group  

3.5. Random Transpositions and Adjacent Transpositions

3.6. Riffle Shuffles


Week 4: Random Walks, Electrical Networks and Hitting Times

4.1. Networks and Reversible Markov Chains  

4.2. Harmonic Functions  

4.3. Voltages and Current Flows  

4.4. Computing Effective Resistance

4.5. Random Target Times  

4.6. Commute Time  

4.7. Hitting Times on Trees  

4.8. Hitting Times for Eulerian Graphs  

4.9. Hitting Times for the Torus  


Week 5: From Hitting Times to Cover Times

5.1. Bounding Mixing Times via Hitting Times  

5.2. The Matthews Method  

5.3. Spanning Tree Bound for Cover Time  

5.4. The Gaussian Free Field

Week 6: Eigenvalues  

6.1. The Spectral Representation of a Reversible Transition Matrix  

6.2. The Relaxation Time  

6.3. Eigenvalues and Eigenfunctions of Some Simple Random Walks  

6.4. Spectral Formula for the Target Time  

6.5. Time Averages Approximate Space Averages: MCMC

6.6. Bounds on Spectral Gap via Contractions  


Week 7: Expansion of Graphs and Networks

7.1. The Dirichlet Form and the Bottleneck Ratio  

7.2. Comparison of Markov Chains  

7.3. Expander Graphs  

Week 8: The Transportation Metric and Path Coupling  

8.1. Path Coupling  

8.2. Rapid Mixing for Colorings  

8.3. Approximate Counting  

8.4. Curvature of Markov Chains


Week 9: Simulation and Perfect Sampling

9.1. Monte Carlo Sampling

9.2. Unbiasing Random Bits

9.3. Glauber and Metropolis Dynamics

9.4. Coupling from the Past

9.5. Sampling Uniform Spanning Trees


Week 10: The Ising Model  

10.1. Fast Mixing at High Temperature  

10.2. The Complete Graph  

10.3. The Cycle  

10.4. The Tree  

10.5. Block Dynamics  

10.6. Lower Bound for Ising on Square  


Week 11: The Cutoff Phenomenon and Continuous-Time Chains

11.1. The Hypercube versus the Cycle

11.2. The Product Condition and the Aldous-Pak Examples

11.3. Equivalence of Continuous-Time Chains and Lazy Chains

11.4. Birth-and-Death Chains and Cutoff on Trees

11.5. Ramanujan Graphs


Week 12: Research Directions

12.1. Universal lower bound for Ising, open for Potts

12.2. Swendsen-Wang Dynamics: the exponent quarter conjecture

12.3. Monotone subsets of the cube: The Ding-Mossel Theorem

12.4. Random walks on invertible matrices modulo 2

12.5. Random walk on percolation clusters and dynamical percolation


Lecturer Intro

Yuval Peres obtained his PhD in 1990 from the Hebrew University, Jerusalem. He was a postdoctoral fellow at Stanford and Yale, and was then a Professor of Mathematics and Statistics in Jerusalem and in Berkeley. Later, he was a Principal Researcher at Microsoft. Yuval has published more than 350 papers in most areas of probability theory, including random walks, Brownian motion, percolation, and random graphs. He has co-authored books on Markov chains, probability on graphs, game theory, and Brownian motion, which can be found at https://www.yuval-peres-books.com/. His presentations are available at https://yuval-peres-presentations.com/. Dr. Peres is a recipient of the Rollo Davidson Prize and the Loève Prize. He has mentored 21 PhD students, including Elchanan Mossel (MIT, AMS fellow), Jian Ding (PKU, ICCM gold medal and Rollo Davidson Prize), Bálint Virág, and Gábor Pete (Rollo Davidson Prize).

Dr. Peres was an invited speaker at the 2002 International Congress of Mathematicians in Beijing, at the 2008 European Congress of Mathematics, and at the 2017 Mathematical Congress of the Americas. In 2016, he was elected to the US National Academy of Sciences.


Lecturer Email: yperes@gmail.com

TA: Dr. Andrew Best, best@bimsa.cn

