
High-Dimensional Statistical Learning Theory


Time: Mon. & Tues., 9:50-11:25 am, Sept. 18-Dec. 20, 2023 (excluding the week of Nov. 27)

Venue: Lecture Hall B725, Tsinghua University Shuangqing Complex Building A (清华大学双清综合楼A座B725报告厅)

Instructor: Yuhong Yang 杨宇红

Description:

High-dimensional statistical learning has become an increasingly important research area. In this course, we will present the theoretical foundations of high-dimensional learning for several widely studied problems with broad applications. More specifically, we will review concentration inequalities, VC dimension, metric entropy and their statistical implications; consider high-dimensional linear modeling and its extensions (e.g., additive modeling and interaction learning); derive minimax theories for parametric and nonparametric function estimation; and study statistical properties of model selection methods.
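
As a small illustration of the kind of concentration inequality reviewed in the course (an illustrative example, not drawn from the course materials), Hoeffding's inequality bounds the deviation of the sample mean of bounded independent random variables from its expectation. If X_1, ..., X_n are independent with X_i ∈ [a_i, b_i] almost surely, then for every t > 0,

\[
\mathbb{P}\!\left(\left|\frac{1}{n}\sum_{i=1}^{n}\bigl(X_i - \mathbb{E}X_i\bigr)\right| \ge t\right)
\;\le\; 2\exp\!\left(-\frac{2n^2 t^2}{\sum_{i=1}^{n}(b_i - a_i)^2}\right).
\]

Finite-sample tail bounds of this type are basic tools behind the risk bounds and minimax arguments studied in the course.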


Biography:

Dr. Yuhong Yang is a Professor at the Yau Mathematical Sciences Center. He received his Ph.D. in statistics from Yale University in 1996. His research interests include model selection, model averaging, multi-armed bandit problems, causal inference, high-dimensional data analysis, and machine learning. He has published in journals across several fields, including the Annals of Statistics, JASA, IEEE Transactions on Information Theory, IEEE Signal Processing Magazine, Journal of Econometrics, Journal of Machine Learning Research, and International Journal of Forecasting. He is a recipient of the US NSF CAREER Award and a fellow of the Institute of Mathematical Statistics, and is included in Stanford University's list of the top 2% most-cited scientists worldwide.
