
Optimization, Sampling, and Generative Modeling in Non-Euclidean Spaces

Source: 06-12

Time: Thursday, June 13, 2024, 10:00-12:00

Venue: Tencent Meeting: 242-361-320

Speaker: Molei Tao 陶默雷, Georgia Institute of Technology


Molei Tao received his B.S. in Math & Physics from Tsinghua University in 2006 and his Ph.D. in Control & Dynamical Systems, with a minor in Physics, from Caltech in 2011. He then worked as a postdoc in Computing & Mathematical Sciences at Caltech from 2011 to 2012, and as a Courant Instructor at NYU from 2012 to 2014. Since 2014 he has been an assistant, and then associate, professor in the School of Mathematics at Georgia Tech. His honors include the W.P. Carey Ph.D. Prize in Applied Mathematics (2011), Best Student Paper Finalist at the American Control Conference (2013), an NSF CAREER Award (2019), an AISTATS Best Paper Award (2020), Best Student Paper Finalist at IEEE EFTF-IFCS (2021), the Cullen-Peck Scholar Award (2022), the GT-Emory AI.Humanity Award (2023), and the SONY Faculty Innovation Award (2024); he was also a Plenary Speaker at the Georgia Scientific Computing Symposium (2024) and a Keynote Speaker at the 2024 International Conference on Scientific Computing and Machine Learning.


Abstract

Machine learning in non-Euclidean spaces has been rapidly attracting attention in recent years, and this talk will give some examples of progress on its mathematical and algorithmic foundations. A sequence of developments that eventually leads to non-Euclidean generative modeling will be reported.

More precisely, I will begin with variational optimization, which, together with delicate interplays between continuous- and discrete-time dynamics, enables the construction of momentum-accelerated algorithms that optimize functions defined on manifolds. Selected applications will be described, namely a generic improvement of the Transformer and a low-dimensional approximation of a high-dimensional optimal transport distance. Then I will turn the optimization dynamics into an algorithm that samples from probability distributions on Lie groups. If time permits, the performance of this sampler will also be quantified, without a log-concavity condition or its common relaxations. Finally, I will describe how this sampler leads to a structurally pleasant diffusion generative model that allows users, given training data following any latent statistical distribution on a Lie group, to generate more data that lie exactly on the same manifold and follow the same distribution.
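To make the setting concrete, below is a minimal numerical sketch, not the speaker's actual algorithms: momentum gradient descent followed by a Langevin-type sampler on the Lie group SO(3). Gradients are pulled back ("trivialized") to the Lie algebra so(3) of skew-symmetric matrices, and the matrix exponential maps updates back to the group, so every iterate stays exactly on the manifold. The objective f(R) = ||R - target||_F^2, the step sizes, and all variable names are illustrative choices.

import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Illustrative target rotation; f(R) = ||R - target||_F^2 is minimized at R = target.
target = expm(np.array([[0.0, -1.0,  0.5],
                        [1.0,  0.0, -0.3],
                        [-0.5, 0.3,  0.0]]))

def trivialized_grad(R):
    """Left-trivialized gradient of f at R: a skew-symmetric matrix in so(3)."""
    euclid_grad = 2.0 * (R - target)        # Euclidean gradient of f
    M = R.T @ euclid_grad
    return 0.5 * (M - M.T)                  # projection onto skew-symmetric matrices

# --- Momentum ("heavy-ball") optimization on SO(3) ---
R = np.eye(3)                               # start at the identity
momentum = np.zeros((3, 3))                 # momentum lives in the Lie algebra
lr, beta = 0.02, 0.9
for _ in range(300):
    momentum = beta * momentum + trivialized_grad(R)
    R = R @ expm(-lr * momentum)            # exponential-map update: stays on SO(3)
print("objective after optimization:", np.linalg.norm(R - target) ** 2)

# --- Langevin-type sampler on SO(3), targeting density proportional to exp(-f) ---
h = 0.01
for _ in range(1000):
    xi = rng.normal(size=3)                 # Gaussian noise in coordinates of so(3)
    noise = np.array([[0.0,   -xi[2],  xi[1]],
                      [xi[2],  0.0,   -xi[0]],
                      [-xi[1], xi[0],  0.0]])
    R = R @ expm(-h * trivialized_grad(R) + np.sqrt(2.0 * h) * noise)
print("orthogonality defect:", np.linalg.norm(R.T @ R - np.eye(3)))

The design choice worth noting is structural: unlike a projection step, the exponential-map update preserves the group structure by construction, which is the kind of exactness the abstract refers to when it says generated data lie exactly on the same manifold.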
