
[BIMSA-Tsinghua Seminar on Machine Learning and Differential Equations] From Neural PDEs to Neural Operators: Blending data and physics for fast predictions

Time: 2022-09-08, 8:50 am - 12:15 pm

Venue: 1129B; Zoom: 537 192 5549, Password: BIMSA

Speaker: George Em Karniadakis

Abstract:

We will review physics-informed neural networks (PINNs) and summarize available extensions for applications in computational mechanics and beyond. We will also introduce new NNs that learn functionals and nonlinear operators from functions and corresponding responses for system identification. The universal approximation theorem for operators is suggestive of the potential of NNs in learning, from scattered data, any continuous operator or complex system. We first generalize the theorem to deep neural networks, and subsequently we apply it to design a new composite NN with small generalization error, the deep operator network (DeepONet), consisting of one NN that encodes the discrete input function space (branch net) and another NN that encodes the domain of the output functions (trunk net). We demonstrate that DeepONet can learn various explicit operators, e.g., integrals, Laplace transforms and fractional Laplacians, as well as implicit operators that represent deterministic and stochastic differential equations. More generally, DeepONet can learn multiscale operators spanning many scales and can be trained with diverse sources of data simultaneously.
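As a concrete illustration of the branch/trunk construction mentioned in the abstract, the sketch below shows a minimal DeepONet in PyTorch trained on a toy antiderivative operator. Only the branch/trunk structure and the inner-product output follow the abstract; the layer widths, the choice of PyTorch, the cosine input functions, and the training loop are illustrative assumptions, not the setup used in the talk.

    import torch
    import torch.nn as nn

    class DeepONet(nn.Module):
        # Branch net encodes the input function u, sampled at m fixed sensor
        # locations; trunk net encodes a query point y in the output domain.
        # The prediction G(u)(y) is the inner product of the two embeddings.
        def __init__(self, num_sensors, y_dim=1, width=64, p=32):
            super().__init__()
            self.branch = nn.Sequential(
                nn.Linear(num_sensors, width), nn.Tanh(),
                nn.Linear(width, p),
            )
            self.trunk = nn.Sequential(
                nn.Linear(y_dim, width), nn.Tanh(),
                nn.Linear(width, p), nn.Tanh(),
            )
            self.bias = nn.Parameter(torch.zeros(1))

        def forward(self, u_sensors, y):
            b = self.branch(u_sensors)   # (batch, p)
            t = self.trunk(y)            # (batch, p)
            return (b * t).sum(dim=-1, keepdim=True) + self.bias

    # Toy training loop (assumed setup, for illustration only): learn the
    # antiderivative operator G(u)(y) = integral of u from 0 to y, for
    # input functions u(x) = cos(w x) with random frequency w.
    m = 50
    xs = torch.linspace(0.0, 1.0, m)          # fixed sensor locations
    model = DeepONet(num_sensors=m)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(2000):
        w = 1.0 + 5.0 * torch.rand(256, 1)    # random frequencies
        u = torch.cos(w * xs)                 # sensor values, shape (256, m)
        y = torch.rand(256, 1)                # query locations in [0, 1]
        target = torch.sin(w * y) / w         # exact antiderivative at y
        loss = ((model(u, y) - target) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()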


Speaker Intro:

George Karniadakis is from Crete. He is a member of the National Academy of Engineering and a Vannevar Bush Faculty Fellow. He received his S.M. and Ph.D. from the Massachusetts Institute of Technology (1984/87). He was appointed Lecturer in the Department of Mechanical Engineering at MIT and subsequently joined the Center for Turbulence Research at Stanford/NASA Ames. He then joined Princeton University as Assistant Professor in the Department of Mechanical and Aerospace Engineering and as Associate Faculty in the Program in Applied and Computational Mathematics. He was a Visiting Professor in the Aeronautics Department at Caltech in 1993 and joined Brown University as Associate Professor of Applied Mathematics in the Center for Fluid Mechanics in 1994. After becoming a full professor in 1996, he continued to be a Visiting Professor and Senior Lecturer of Ocean/Mechanical Engineering at MIT. He is an AAAS Fellow (2018-), a Fellow of the Society for Industrial and Applied Mathematics (SIAM, 2010-), a Fellow of the American Physical Society (APS, 2004-), a Fellow of the American Society of Mechanical Engineers (ASME, 2003-), and an Associate Fellow of the American Institute of Aeronautics and Astronautics (AIAA, 2006-). He received the SIAM/ACM Prize in Computational Science and Engineering (2021), the Alexander von Humboldt Award (2017), the SIAM Ralph E. Kleinman Award (2015), the J. Tinsley Oden Medal (2013), and the CFD Award (2007) from the US Association for Computational Mechanics. His h-index is 123, and he has been cited about 70,000 times.

Date: September 5, 2022
Related News
    • Learning Nonlocal Constitutive Models with Neural Operators

      Abstract: Constitutive models are widely used for modeling complex systems in science and engineering, when first-principle-based, well-resolved simulations are prohibitively expensive. For example, in fluid dynamics, constitutive models are required to describe nonlocal, unresolved physics such as turbulence and laminar-turbulent transition. However, traditional constitutive models based on PDEs...

    • Learning constitutive models with neural networks

      Abstract: In this talk, I will introduce some work on learning constitutive equations in fluid mechanics and geophysics based on machine learning.

      Speaker Intro: Fansheng Xiong (熊繁升) is an Assistant Research Fellow at the Beijing Institute of Mathematical Sciences and Applications (BIMSA) and was previously a postdoctoral fellow at the Institute of Applied Physics and Computational Mathematics in Beijing. He graduated from the China University of Geosciences (Beijing) and Tsinghua University, with joint doctoral training at Yale University. His research interests focus on solving forward and inverse problems of differential-equation models using machine-learning algorithms (DNN, PINN, DeepONet, etc.)...