
Adaptive Gradient Methods with Energy for Optimization Problems | Applied and Computational Math Colloquium

Time: Tuesday, July 4, 2023, 15:20-16:20

Venue: Lecture Hall, Floor 3, Jin Chun Yuan West Building

Organizer: Applied and Computational Mathematics Team

Speaker: Hailiang Liu (Iowa State University)

Abstract

We propose AEGD, a new algorithm for gradient-based optimization of stochastic objective functions, based on adaptive updates of a quadratic energy. The method is shown to be unconditionally energy stable, irrespective of the step size. In addition, AEGD enjoys tight convergence rates while still allowing a large step size. The method is straightforward to implement and requires little tuning of hyper-parameters. Experimental results demonstrate that AEGD works well for a variety of optimization problems: it is robust with respect to initial data, capable of making rapid initial progress, and shows generalization performance comparable to, and often better than, SGD with momentum for deep neural networks.


Speaker

Hailiang Liu is a Professor of Mathematics and Computer Science at Iowa State University (ISU). He earned his bachelor's degree from Henan Normal University, his master's degree from Tsinghua University, and his Ph.D. from the Chinese Academy of Sciences, all in mathematics. His research interests include the analysis of partial differential equations and the development of high-order numerical algorithms for solving such PDE problems, with diverse applications. He is the author of over 160 peer-reviewed papers and the recipient of many awards and honors, including an Alexander von Humboldt Research Fellowship and the inaugural Holl Chair in Applied Mathematics at Iowa State University.

Related News
    • Efficient natural gradient method for large-scale optimization problems

      Abstract: First-order methods are workhorses for large-scale optimization problems, but they are often agnostic to the structural properties of the problem under consideration and suffer from slow convergence, being trapped in bad local minima, etc. Natural gradient descent is an acceleration technique in optimization that takes advantage of the problem's geometric structure and preconditions the...

    • Applied and Computational Math Colloquium | Multiscale Modeling of Arctic Sea Ice Floes

      Abstract: In this talk, I will start by presenting some quick facts about Arctic and Antarctic sea ice floes, followed by a quick overview of the major sea ice continuum and particle models. I will then present our main contribution to its multiscale modelling. The recent Lagrangian particle model based on the discrete element method (DEM) has shown improved model performance and started to gain m...