### Countdown: 3 Days | 5 Top Scientists Headline the Basic Science Lectures (Qiuzhen College)

Basic Science Lecture

Venue: LH A (A2), BIMSA

July 17 (Monday), 9:00-10:00 AM

1974 Fields Medalist, 2008 Wolf Prize laureate, 2006 Shaw Prize laureate, member of the US National Academy of Sciences, foreign member of the Royal Society, professor at Harvard University and Brown University

Title

Consciousness, robots and DNA

Abstract

Consciousness was not even considered a scientific term until recently. But with the prospect of fully intelligent humanoid robots, one must confront the question of whether they are conscious. This question has many sides, but here I want to focus on what physics has to say and, in particular, on whether DNA creates cat states.

July 17 (Monday), 10:00-11:00 AM

2002 Turing Award laureate, member of the Israel Academy of Sciences and Humanities, the US National Academy of Sciences, the American Academy of Arts and Sciences, and the French Academy of Sciences, foreign member of the Royal Society, professor at the Weizmann Institute of Science, Israel

Title

Manifolds in Machine Learning

Abstract

The talk should be accessible and of interest to both computer scientists and mathematicians, and in fact, this research was inspired by a talk by Prof. Shing-Tung Yau about the geometry of machine learning that I attended a few years ago.

July 17 (Monday), 11:20 AM-12:20 PM

2018 Fields Medalist, member of the Academia Europaea, professor at ETH Zurich

Title

Generic regularity of free boundaries for the obstacle problem

Abstract

The classical obstacle problem consists of finding the equilibrium position of an elastic membrane whose boundary is held fixed and which is constrained to lie above a given obstacle. By classical results of Caffarelli, the free boundary is $C^\infty$ outside a set of singular points. Explicit examples show that the singular set could in general be $(n-1)$-dimensional, that is, as large as the regular set. In a recent paper with Ros-Oton and Serra we show that, generically, the singular set has zero $\mathcal{H}^{n-4}$ measure (in particular, it has codimension three inside the free boundary), solving a conjecture of Schaeffer in dimension $n \leq 4$. The aim of this talk is to give an overview of these results.
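For readers unfamiliar with the setup, the obstacle problem described in the abstract admits a standard variational formulation (stated here as background; the notation is not taken from the abstract itself):

```latex
\min_{u \in K} \; \int_\Omega \frac{1}{2} |\nabla u|^2 \, dx,
\qquad
K = \bigl\{ u \in H^1(\Omega) : u = g \ \text{on } \partial\Omega,\; u \ge \varphi \ \text{in } \Omega \bigr\},
```

where $\varphi$ is the obstacle and $g \ge \varphi$ on $\partial\Omega$ is the fixed boundary data. The free boundary is $\partial\{u > \varphi\} \cap \Omega$, the edge of the contact set where the membrane touches the obstacle.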

July 18 (Tuesday), 17:00-18:00

2004 Nobel Laureate in Physics, member of the US National Academy of Sciences and the Academia Europaea, foreign member of the Chinese Academy of Sciences, professor at the University of California, Santa Barbara

Venue: Chaoyang Kexie Blue Hall

July 23 (Sunday), 16:00-16:45

Greg Yang 杨格

Title

The unreasonable effectiveness of mathematics in large scale deep learning

Abstract

Recently, the theory of infinite-width neural networks led to the first technology, muTransfer, for tuning enormous neural networks that are too expensive to train more than once. For example, this allowed us to tune the 6.7 billion parameter version of GPT-3 using only 7% of its pretraining compute budget, and with some asterisks, we get a performance comparable to the original GPT-3 model with twice the parameter count. In this talk, I will explain the core insight behind this theory. In fact, this is an instance of what I call the *Optimal Scaling Thesis*, which connects infinite-size limits for general notions of “size” to the optimal design of large models in practice. I'll end with several concrete key mathematical research questions whose resolutions will have incredible impact on the future of AI.
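The width-scaling idea behind muTransfer can be sketched in a few lines. This is a simplified illustration of the muP rules for Adam, not Yang's actual API: the helper name and the three scalings shown are assumptions for illustration, and the real `mup` library works per-layer on tensor shapes rather than on a single width number.

```python
def mup_scaled_hparams(base_lr, base_width, width):
    """Transfer hyperparameters tuned on a narrow proxy model to a wide model.

    Simplified muP-style rules for Adam (illustrative, not the real library):
    hidden-layer learning rate and the output multiplier shrink like 1/width,
    so an optimum found at small width remains near-optimal at large width.
    """
    ratio = base_width / width
    return {
        "hidden_lr": base_lr * ratio,   # hidden-weight lr scales like 1/width
        "output_multiplier": ratio,     # output logits scaled down like 1/width
        "input_lr": base_lr,            # input-layer lr stays constant in width
    }

# Example: tune on a width-256 proxy, then apply the result at width 8192.
small = mup_scaled_hparams(3e-4, base_width=256, width=256)
large = mup_scaled_hparams(3e-4, base_width=256, width=8192)
```

The point of the talk's "7% of pretraining compute" figure is exactly this: the expensive sweep happens only on the small proxy, and the rule above (rather than a fresh sweep) produces the large model's hyperparameters.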

• ### Countdown: 2 Days | Academicians Headline the Plenary Lectures

ICBS Lecture. The first International Congress of Basic Science (ICBS) will be held in Beijing on July 16-28, 2023, under the theme "Focusing on Basic Science, Leading the Future of Humanity". During the congress, about 350 Frontiers of Science Award Lectures, Plenary Lectures, and Invited Lectures will be delivered at the Beijing Institute of Mathematical Sciences and Applications at Yanqi Lake. Several academicians will headline the plenary lectures, including members of the US National Academy of Sciences and the US National Academy of Engineering...

• ### Countdown: 1 Day | Lectures by International Academic Award Laureates

ICBS Lecture. The first International Congress of Basic Science (ICBS) will be held in Beijing on July 16-28, 2023, under the theme "Focusing on Basic Science, Leading the Future of Humanity". During the congress, about 350 Frontiers of Science Award Lectures, Plenary Lectures, and Invited Lectures will be delivered at the Beijing Institute of Mathematical Sciences and Applications at Yanqi Lake. Several international academic award laureates will headline the plenary lectures, including the 2018 Fields Medalist...