
Two generalizations of the Sarkisov Program

Source: 09-26


Algebraic Geometry Seminar

Organizers:

Caucher Birkar, Jia Jia

Speaker:

Yang He (BIMSA)

Time:

Fri., 15:30-16:30, Sept. 27, 2024

Online:

Zoom Meeting ID: 262 865 5007

Passcode: YMSC

Venue:

ShuangQing B7 (in person)

Title:

Two generalizations of the Sarkisov Program

Abstract:

I will explain how to generalize the Sarkisov Program following an idea of Shokurov, and discuss some applications and computations in the surface case. Finally, I will present some ongoing work, joint with Caucher Birkar and Artan Sheshmani, on another generalization of the Sarkisov Program using ideas from mirror symmetry and moduli of Landau-Ginzburg models.
