
Complex methods in General Relativity

Time: 2023-02-28 ~ 2023-06-20, Tue 15:20 - 17:50

Venue: Lecture Hall, Jinchunyuan West Building, Tsinghua University; ZOOM: 559 700 6085, PW: BIMSA

Speaker: Lars Andersson

Abstract

Following a brief introduction to complex manifolds, the course will explore connections between general relativity, the Einstein equation, and complex geometry. Among the topics I would like to cover are manifestations of Kähler and Hermitian geometry in general relativity, as well as twistor theory and its applications in general relativity and field theory. The course will be accessible to advanced undergraduates with a good background in differential geometry. The format of the course will be a seminar, with coordinated lectures by participants. Examination will take the form of presentations given in the context of the course.
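
For orientation, the Einstein equation referred to above can be written in its standard form, with metric g_{\mu\nu}, Ricci tensor R_{\mu\nu}, scalar curvature R, and stress-energy tensor T_{\mu\nu} (geometrized units G = c = 1 assumed here):

    R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = 8\pi\, T_{\mu\nu}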


Lecturer Intro.

Lars Andersson is a BIMSA research fellow. Before joining BIMSA he held professorships at the Royal Institute of Technology, Stockholm, and the University of Miami, and led a research group at the Albert Einstein Institute, Potsdam. He works on problems in general relativity, mathematical physics, and differential geometry, and has contributed to the mathematical analysis of cosmological models, apparent horizons, and self-gravitating elastic bodies. His recent research interests include the black hole stability problem, gravitational instantons, and the gravitational spin Hall effect.

Date: February 28, 2023