
Topics in Statistical Theory

Posted: 02-21

Time: Tues. & Wed., 13:30-15:05, Feb. 21 - May 10, 2023

Venue: Lecture Hall, Floor 3, Jin Chun Yuan West Bldg.

Speaker: Prof. Yannis Yatracos

Description

This course will cover research results of the instructor over the years:

1) Elegant nonparametric, minimum distance estimation of a density, of a regression-type function, and of their derivatives, with upper and lower rates of convergence; the parameter space is assumed to be either totally bounded or regular. Plug-in upper convergence rates for estimates of a mixing density in R^d and for its derivatives.

2) Rates of convergence of estimates, Kolmogorov’s entropy and the dimensionality reduction principle in regression.

3) Pathologies of the Wasserstein distance in Statistical Inference.

4) Shrinkage of U-statistics obtained with artificial augmentation of the sample size. Pitman's closeness criterion and shrinkage estimates of the variance and the SD.

5) On Tukey’s poly-efficiency.

6) Pathologies of the Bootstrap.

7) Pathologies of the MLE, with correction using the Model Updated MLE (MUMLE) with the DECK principle: D=Data, E=Evolves, C=Creates, K=Knowledge. Relation of MUMLE to Wallace's Minimum Message Length method.

8) Additional topics if time permits.
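To give a flavor of topic 1), minimum distance density estimation can be illustrated in its simplest form: selecting between two candidate densities by comparing the empirical measure with each candidate's probability on the set where one density exceeds the other (a Yatracos set). The densities, parameter values, and data below are a hypothetical example, not material from the course:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical candidate densities: N(0,1) and N(2,1).
def f1(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def f2(x):
    return np.exp(-(x - 2)**2 / 2) / np.sqrt(2 * np.pi)

def minimum_distance_select(sample, f1, f2):
    """Pick the candidate whose probability of the Yatracos set
    A = {x : f1(x) > f2(x)} is closer to the empirical measure of A.
    The candidates' probabilities of A are computed by numeric
    integration on a grid."""
    grid = np.linspace(-10.0, 10.0, 20001)
    dx = grid[1] - grid[0]
    A = f1(grid) > f2(grid)                 # indicator of A on the grid
    p1 = np.sum(f1(grid)[A]) * dx           # P_{f1}(A)
    p2 = np.sum(f2(grid)[A]) * dx           # P_{f2}(A)
    mu = np.mean(f1(sample) > f2(sample))   # empirical measure of A
    return 1 if abs(mu - p1) <= abs(mu - p2) else 2

sample = rng.normal(0.0, 1.0, size=500)     # data actually drawn from f1
print(minimum_distance_select(sample, f1, f2))  # → 1
```

Here A = {x < 1}, so P_{f1}(A) ≈ 0.841 while P_{f2}(A) ≈ 0.159; with 500 observations from f1 the empirical measure concentrates near 0.841, and the selector returns candidate 1.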


Prerequisite:

A course in Mathematical Statistics and Probability, including modes of convergence of random variables/vectors.
