Abstract
Sampling a target distribution with an unknown normalization constant is a fundamental problem in computational science and engineering. A common strategy is to construct a dynamical system of probability densities that evolves toward the target; MCMC and sequential Monte Carlo are classical examples. Recently, gradient flows in the space of probability measures have become an increasingly popular way to generate such dynamical systems. In this talk, we will discuss several basic questions about this general methodology for sampling probability distributions. Any instantiation of a gradient flow for sampling requires an energy functional and a metric to determine the flow, as well as numerical approximations of the flow to derive practical algorithms. We first show that the KL divergence is a special, essentially unique energy functional whose flow can be computed without knowing the normalization constant of the target, which eases numerical implementation. We then explain how the Fisher-Rao metric is an exceptional choice that yields fast, exponential convergence of the flow. Finally, we discuss numerical approximations based on interacting particles, Gaussians, and mixtures to derive implementable algorithms for sampling.
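For concreteness, the following is a standard sketch of the objects the abstract refers to; the notation is illustrative and not taken verbatim from the talk. With target \pi = e^{-V}/Z and unknown normalization Z, the energy functional is

\mathcal{E}(\rho) = \mathrm{KL}(\rho \,\|\, \pi) = \int \rho \log\frac{\rho}{\pi}\,dx, \qquad \frac{\delta\mathcal{E}}{\delta\rho} = \log\rho + V + 1 + \log Z,

so Z enters the first variation only as an additive constant and drops out of any gradient flow; this is the sense in which the KL energy is easy to implement. The Wasserstein gradient flow of this energy is the Fokker-Planck equation

\partial_t \rho = \nabla\cdot\left(\rho\,\nabla\frac{\delta\mathcal{E}}{\delta\rho}\right) = \nabla\cdot(\rho\nabla V) + \Delta\rho,

whose particle realization is Langevin dynamics dX_t = -\nabla V(X_t)\,dt + \sqrt{2}\,dW_t. The Fisher-Rao gradient flow of the same energy takes the form

\partial_t \rho = -\rho\left(\log\frac{\rho}{\pi} - \mathbb{E}_{\rho}\left[\log\frac{\rho}{\pi}\right]\right),

which is known to converge exponentially fast.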
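As a toy illustration of how such a flow becomes a sampling algorithm, below is a minimal particle discretization of the Wasserstein flow above (unadjusted Langevin dynamics). All names, the Gaussian target, and the step-size/particle-count choices are assumptions for illustration, not the specific algorithms of the talk; the particles here are in fact independent, whereas the methods discussed in the talk involve genuinely interacting particles, Gaussians, and mixtures.

    import numpy as np

    def grad_V(x):
        """Gradient of the potential V(x) = |x|^2 / 2, i.e. a standard
        Gaussian target pi ∝ exp(-V). Swap in any smooth potential."""
        return x

    def langevin_particles(n=1000, dim=2, dt=1e-2, steps=2000, seed=0):
        """Euler-Maruyama discretization of dX = -grad V(X) dt + sqrt(2) dW,
        the particle realization of the Wasserstein gradient flow of KL."""
        rng = np.random.default_rng(seed)
        X = rng.standard_normal((n, dim))  # initial particle ensemble
        for _ in range(steps):
            noise = rng.standard_normal((n, dim))
            X = X - dt * grad_V(X) + np.sqrt(2 * dt) * noise
        return X  # approximate samples from pi

    X = langevin_particles()
    print(X.mean(axis=0), X.var(axis=0))  # should be near 0 and 1

Note that the scheme needs only grad_V, never the normalization constant Z, matching the point about the KL energy above.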
Yifan Chen 陈钇帆
Courant Institute, New York University
I am a Courant Instructor at the Courant Institute, New York University, starting in September 2023. I received my Ph.D. from Caltech, advised by Profs. Thomas Y. Hou, Houman Owhadi, and Andrew M. Stuart. My thesis focuses on multiscale and statistical numerical methods for computation and inference in PDEs and inverse problems. I obtained my B.S. in Pure and Applied Mathematics from Tsinghua University.
My research interests lie at the interface of numerical analysis, randomized algorithms, and sampling/optimization, with applications to heterogeneous or high-dimensional computation and prediction problems in PDEs, imaging, and data science.
Personal Homepage:
https://yifanc96.github.io/