Academics

Trace optimization and eigenvector-dependent nonlinear eigenvalue problems in data science

Time: Thu., 14:00-15:00, Nov. 24, 2022

Venue: Tencent Meeting (ID: 431642438)

Speaker: Leihong Zhang (张雷洪), Soochow University

Abstract

Some recent applications of multivariate statistical analysis in data science require optimizing certain trace-related objective functions subject to orthogonality constraints. In this talk, we shall first present some recent applications in data science and show that solving these optimization problems can be converted into eigenvector-dependent nonlinear eigenvalue problems (NEPv), to which the self-consistent field (SCF) iteration can be effectively applied. We then discuss recent developments of the general SCF iteration concerning its local convergence rate and the level-shifting technique.
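As a point of orientation, below is a minimal sketch of a plain SCF iteration for a toy NEPv of the form H(V)V = VΛ with V orthonormal, where the current iterate spans the invariant subspace of the k smallest eigenvalues of H(V). The particular nonlinearity H(V), the function name scf, and the optional shift parameter are illustrative assumptions for this sketch, not the formulation or the level-shifting variant presented in the talk.

import numpy as np

def scf(H, n, k, tol=1e-10, max_iter=200, shift=0.0, seed=None):
    """Plain SCF iteration for H(V) V = V * Lambda with V^T V = I.
    `shift` >= 0 applies one common form of level shifting; the exact
    variant analyzed in the talk may differ (this is an assumption)."""
    rng = np.random.default_rng(seed)
    V, _ = np.linalg.qr(rng.standard_normal((n, k)))   # random orthonormal start
    for it in range(max_iter):
        # Level-shifted operator: subtracting shift * V V^T lowers the
        # eigenvalues associated with span(V), which can stabilize the iteration.
        A = H(V) - shift * (V @ V.T)
        w, U = np.linalg.eigh(A)                        # symmetric eigensolve, ascending
        V_new = U[:, :k]                                # eigenvectors of the k smallest eigenvalues
        # Subspace change, measured via orthogonal projectors (invariant to
        # the rotation ambiguity in the columns of V).
        err = np.linalg.norm(V_new @ V_new.T - V @ V.T)
        V = V_new
        if err < tol:
            break
    return V, w[:k], it

if __name__ == "__main__":
    n, k, alpha = 100, 5, 0.5
    A0 = np.diag(np.arange(1.0, n + 1))                 # fixed linear part (toy data)
    # Toy nonlinearity: depends on V only through diag(V V^T), i.e. on the subspace.
    H = lambda V: A0 + alpha * np.diag(np.sum(V * V, axis=1))
    V, lam, iters = scf(H, n, k)
    print("stopped after", iters + 1, "SCF steps; Ritz values:", lam)

Convergence is monitored on the projector V V^T rather than on V itself, since the NEPv determines V only up to an orthogonal transformation of its columns.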


Speaker

His research covers optimization theory and computation, numerical linear algebra, pattern recognition, and data mining. He has led several National Natural Science Foundation of China (NSFC) projects and participated in an NSFC Major Research Plan. He has published more than sixty papers in journals including Math. Program., Math. Comput., Numer. Math., IEEE TPAMI, and the SIAM journal series.

Date: November 24, 2022