Academics

Design of gradient flows for sampling probability distributions

Time: Mon., 9:00–11:00 am, Nov. 20, 2023

Venue: Tencent Meeting: 191-817-237

Speaker: Yifan Chen 陈钇帆, Courant Institute, New York University

Abstract

Sampling a target distribution with an unknown normalization constant is a fundamental problem in computational science and engineering. Dynamical systems of probability densities are often constructed to address this task, as in MCMC and sequential Monte Carlo. Recently, gradient flows in the space of probability measures have become an increasingly popular way to generate such dynamical systems. In this talk, we will discuss several fundamental questions about this general methodology for sampling probability distributions. Any instantiation of a gradient flow for sampling requires an energy functional and a metric to determine the flow, as well as numerical approximations of the flow to derive algorithms. We first show that the KL divergence is a special and unique energy functional that facilitates numerical implementation of the flow. We then explain how the Fisher-Rao metric is an exceptional choice that leads to superior, fast convergence of the flow. Finally, we discuss numerical approximations based on interacting particles, Gaussians, and mixtures that yield implementable sampling algorithms.
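To give a concrete feel for the setting (this is a standard textbook illustration, not the speaker's own algorithms): the gradient flow of the KL divergence under the Wasserstein metric corresponds to Langevin dynamics, dX = ∇log π(X) dt + √2 dW, which can be simulated with an interacting ensemble of particles via the unadjusted Langevin algorithm. The target here is assumed to be a standard normal, so its score is known in closed form and no normalization constant is ever needed.

```python
import numpy as np

def grad_log_pi(x):
    # Score of the (assumed) standard normal target: d/dx log pi(x) = -x.
    # Only the score is needed, so the normalization constant never appears.
    return -x

rng = np.random.default_rng(0)
n_particles, n_steps, dt = 2000, 500, 0.05

# Start from a spread-out ensemble, far from the target.
x = rng.uniform(-5.0, 5.0, size=n_particles)

# Unadjusted Langevin algorithm: Euler-Maruyama discretization of
# dX = grad log pi(X) dt + sqrt(2) dW, the Wasserstein gradient flow of KL.
for _ in range(n_steps):
    x = x + dt * grad_log_pi(x) + np.sqrt(2.0 * dt) * rng.standard_normal(n_particles)

# The empirical mean and std of the ensemble should be close to 0 and 1.
print(float(x.mean()), float(x.std()))
```

The small bias away from unit variance visible at finite step size is exactly the kind of discretization issue that motivates the talk's distinction between the continuous-time flow and the numerical approximations used to derive practical algorithms.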


Yifan Chen 陈钇帆

Courant Institute, New York University

I am a Courant Instructor at the Courant Institute, New York University, starting from September 2023. I received my Ph.D. at Caltech, advised by Profs. Thomas Y. Hou, Houman Owhadi, and Andrew M. Stuart. My thesis focuses on multiscale and statistical numerical methods for computation and inference in PDEs and inverse problems. I obtained my B.S. in Pure and Applied Mathematics at Tsinghua University.

My research interests lie at the interface of numerical analysis, randomized algorithms, and sampling/optimization, applied to heterogeneous or high-dimensional computation and prediction problems in PDEs, imaging, and data science.

Personal Homepage:

https://yifanc96.github.io/


Date: November 20, 2023