Academics

Neural Networks: A Perspective from Numerical Analysis

Time: Thur., 15:00-16:00, April 11, 2024

Venue: Tencent Meeting: 815-642-712

Speaker: Juncai He (King Abdullah University of Science and Technology)

Speaker


My research focuses on mathematical analysis, algorithm development, and their applications in machine learning and scientific computing, spanning both the data and physical sciences. My Ph.D. training was grounded in classical numerical methods for partial differential equations (PDEs), with a particular emphasis on finite element methods (FEM) and multigrid methods. Building on this foundation in numerical PDEs and scientific computing, my primary research objective is to explore deep learning models and algorithms through the lens of numerical PDEs and geometry. This approach aims to foster a comprehensive understanding and innovative advancement of these models, covering theoretical foundations, algorithmic strategies, and practical applications. From my Ph.D. program to the present, and continuing into the foreseeable future, my research efforts are principally centered on three interrelated themes:


· Mathematical analysis of deep neural networks (DNNs) from a finite element perspective;


· Development of theories, algorithms, and applications for convolutional neural networks (CNNs) and Transformers, drawing inspiration from multigrid structures;


· Investigation into the learning of data with low-dimensional structures.



Abstract:

In this talk, we will present recent results on the theories, algorithms, and applications of deep neural networks (DNNs) from a numerical analysis perspective. First, we will illustrate the connections between linear finite elements and ReLU DNNs, as well as spectral methods and ReLU^k DNNs. Second, we will show our latest findings regarding the open question of whether DNNs can precisely recover piecewise polynomials of arbitrary order on any simplicial mesh in any dimension. Then, inspired by the multigrid structure in numerical PDEs, we will discuss a unified framework for convolutional neural networks (CNNs) and multigrid methods, known as MgNet. Additionally, we will showcase recent advancements in the theories and applications of MgNet, particularly the first approximation result for CNNs with 2D inputs and an efficient operator learning framework, MgNO.
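The connection between linear finite elements and ReLU DNNs mentioned in the abstract can be illustrated in one dimension: a piecewise-linear "hat" basis function is exactly a linear combination of three ReLUs. The sketch below is an illustrative example, not code from the talk; the specific hat function and mesh points are chosen for simplicity.

```python
import numpy as np

def relu(x):
    """ReLU activation: max(x, 0), applied elementwise."""
    return np.maximum(x, 0.0)

def hat(x):
    """1D linear finite element hat basis function on [0, 1],
    peaking at x = 0.5, written exactly as a sum of three ReLUs:
    hat(x) = 2*(ReLU(x) - 2*ReLU(x - 1/2) + ReLU(x - 1))."""
    return 2.0 * (relu(x) - 2.0 * relu(x - 0.5) + relu(x - 1.0))

# hat is 0 outside [0, 1] and rises linearly to 1 at x = 0.5
xs = np.array([-0.5, 0.0, 0.25, 0.5, 0.75, 1.0, 1.5])
print(hat(xs))  # → [0.  0.  0.5 1.  0.5 0.  0. ]
```

Since every continuous piecewise-linear function on a mesh is a linear combination of such hat functions, any linear finite element function can be represented exactly by a shallow ReLU network, which is the starting point of the analysis discussed in the talk.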

Date: April 10, 2024