
Recent advances on Nesterov acceleration

Time: Thursday, 16:00-17:30, Nov. 24th, 2022

Venue: Online (Tencent Meeting ID: 410-207-317; join the meeting: https://meeting.tencent.com/dm/RpLiN266oVS7)

Organizer: Applied and Computational Mathematics Team

Speaker:Bin Shi (Academy of Mathematics and Systems Science, Chinese Academy of Sciences)

Abstract

Nesterov's accelerated gradient descent (NAG) is one of the milestones in the history of first-order algorithms. It was not until recently that the high-resolution differential equation framework of [Shi et al., 2021] uncovered that the mechanism behind the acceleration phenomenon is the gradient correction term. Along this line, I present some recent advances in the high-resolution differential equation framework, focusing on the implicit-velocity scheme and the proximal scheme.
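For context, the classical NAG iteration mentioned above can be sketched as follows. This is a minimal, generic implementation of Nesterov's scheme for smooth convex objectives (the standard k/(k+3) momentum variant), not the speaker's high-resolution or proximal schemes; the function names and the quadratic test problem are illustrative assumptions.

```python
import numpy as np

def nag(grad, x0, step_size, num_iters):
    """Nesterov's accelerated gradient descent (convex variant).

    Iteration:
        x_{k+1} = y_k - s * grad(y_k)
        y_{k+1} = x_{k+1} + k/(k+3) * (x_{k+1} - x_k)

    The extrapolation step on y is what distinguishes NAG from
    plain gradient descent and produces the O(1/k^2) rate.
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    for k in range(num_iters):
        x_next = y - step_size * grad(y)          # gradient step at the lookahead point
        y = x_next + (k / (k + 3.0)) * (x_next - x)  # momentum/extrapolation step
        x = x_next
    return x

# Illustrative test problem: f(x) = 0.5 * ||x||^2, so grad f(x) = x.
x_min = nag(lambda x: x, x0=np.ones(5), step_size=0.1, num_iters=200)
```

With a step size below 1/L (here L = 1 for the quadratic), the iterates converge to the minimizer at the origin.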


Speaker

Bin Shi received his bachelor's degree from the Department of Mathematics at Ocean University of China, and subsequently earned master's degrees in pure mathematics from Fudan University and in theoretical physics from the University of Massachusetts Dartmouth. He received his Ph.D. in computer science from Florida International University in 2018. From 2019 to 2021 he was a postdoctoral researcher at the University of California, Berkeley, working with Professor Michael I. Jordan, a pioneer of machine learning. In June 2021 he joined the Academy of Mathematics and Systems Science, Chinese Academy of Sciences, as an associate researcher.

DATE: November 24, 2022