
Recent advances on Nesterov acceleration

Source: 11-24

Time: Thu., 16:00-17:30, Nov. 24th, 2022

Venue: Online. Tencent Meeting ID: 410-207-317. Join the meeting: https://meeting.tencent.com/dm/RpLiN266oVS7

Organizer: Applied and Computational Mathematics Team

Speaker: Bin Shi (Academy of Mathematics and Systems Science, Chinese Academy of Sciences)

Abstract

Nesterov's accelerated gradient descent (NAG) is one of the milestones in the history of first-order algorithms. Only recently did the high-resolution differential equation framework of [Shi et al., 2021] uncover the mechanism behind the acceleration phenomenon: the gradient correction term. Along this line, I present some recent advances in the high-resolution differential equation framework, focusing on the implicit-velocity scheme and the proximal scheme.
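For orientation, a brief sketch of where the gradient correction appears (following the conventions of [Shi et al., 2021]; the coefficients below are quoted from memory and should be checked against the paper): eliminating the auxiliary sequence from NAG with step size s yields a single-sequence recursion whose last term is the gradient correction, and the associated high-resolution ODE retains a Hessian-driven analogue of it,

\[
x_{k+1} = x_k + \frac{k-1}{k+2}\,(x_k - x_{k-1}) - s\,\nabla f(x_k) - \frac{k-1}{k+2}\,s\,\bigl(\nabla f(x_k) - \nabla f(x_{k-1})\bigr),
\]
\[
\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \sqrt{s}\,\nabla^{2} f(X(t))\,\dot{X}(t) + \Bigl(1 + \frac{3\sqrt{s}}{2t}\Bigr)\nabla f(X(t)) = 0,
\]

where the term \(\sqrt{s}\,\nabla^{2} f(X)\,\dot{X}\), the continuous counterpart of the discrete gradient difference, is the gradient correction absent from low-resolution models.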


Speaker

Bin Shi received his bachelor's degree from the Department of Mathematics at Ocean University of China, and subsequently obtained master's degrees in pure mathematics from Fudan University and in theoretical physics from the University of Massachusetts Dartmouth. He received his Ph.D. in computer science from Florida International University in 2018. From 2019 to 2021 he was a postdoctoral researcher at the University of California, Berkeley, working with Professor Michael I. Jordan, a pioneer of machine learning. In June 2021 he joined the Academy of Mathematics and Systems Science, Chinese Academy of Sciences, as an associate research fellow.
