
Bayesian machine learning

Source: 09-22

Time: 17:05 - 18:40, Thu, 10/27/2022 - 1/12/2023

Venue: 1110; Online: 928 682 9093, PW: BIMSA

Lecturer: Alexey Zaytsev (Visiting Assistant Research Fellow)

Record: Yes

Level: Graduate

Language: English


Prerequisite

Probability theory, Mathematical statistics, Machine learning


Abstract

The probabilistic approach to machine and deep learning leads to principled solutions: it provides explainable decisions and new ways of improving existing methods. Bayesian machine learning comprises the probabilistic approaches that rely on the Bayes formula; it helps in numerous applications and rests on beautiful mathematical concepts. In this course, I will describe the foundations of Bayesian machine learning and how it works as part of the deep learning framework.
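
As a one-line reminder of the formula the abstract refers to, written for a parameter vector \theta and observed data D (the notation is illustrative, not taken from the course materials):

p(\theta \mid D) = \frac{p(D \mid \theta)\, p(\theta)}{p(D)}, \qquad p(D) = \int p(D \mid \theta)\, p(\theta)\, d\theta,

i.e. the posterior is proportional to the likelihood times the prior, normalized by the evidence.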


Reference

The course covers topics at the edge of current advances in the field. We will provide a list of articles and books for each lecture. Good starting points are:
1. C. Bishop, "Pattern Recognition and Machine Learning", 2006. Blocks 1 and 2.
2. C. Rasmussen and C. Williams, "Gaussian Processes for Machine Learning", 2006. Block 3.

Tools: Notebook with access to colab.google.com


Syllabus

Block 1: Basics of the Bayesian approach
Lecture 1. Reminder of mathematical statistics. Maximum likelihood approach. Bayesian linear regression.
Lecture 2. Exponential family of distributions. Conjugate priors.
Lecture 3. Bayesian logistic regression. Laplace approximation.
Lecture 4. Bernstein–von Mises theorem. Non-informative and reference priors.
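
To make Block 1 concrete, here is a minimal sketch of the Bayesian linear regression of Lecture 1, using the closed-form Gaussian posterior over the weights (cf. Bishop, Ch. 3). The data, hyperparameter values, and variable names are assumptions made for the example, not part of the course materials.

# Illustrative only: Bayesian linear regression with a Gaussian prior on the
# weights and Gaussian observation noise (cf. Bishop, Ch. 3).
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 2.0, 25.0                      # assumed prior precision and noise precision
x = rng.uniform(-1.0, 1.0, size=30)
t = -0.3 + 0.5 * x + rng.normal(scale=beta ** -0.5, size=x.size)

Phi = np.column_stack([np.ones_like(x), x])  # design matrix: bias and slope features
S_N = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)  # posterior covariance
m_N = beta * S_N @ Phi.T @ t                 # posterior mean of the weights
print("posterior mean of (bias, slope):", m_N)  # close to the true (-0.3, 0.5)
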
Block 2: Approximate inference
Lecture 5. Variational inference. The evidence lower bound (ELBO).
Lecture 6. Normalizing flows. Expectation propagation.
Lecture 7. Sampling problem statement. Importance sampling.
Lecture 8. Monte Carlo sampling. MCMC. Metropolis–Hastings algorithm. Hamiltonian Monte Carlo.
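
A minimal sketch of the Metropolis–Hastings algorithm from Lecture 8, with a symmetric random-walk proposal on a toy one-dimensional target; the target density and tuning constants are assumptions chosen purely for illustration.

# Illustrative only: random-walk Metropolis-Hastings on a toy 1D target.
import numpy as np

def log_target(x):
    # Unnormalized log-density of a two-component Gaussian mixture.
    return np.logaddexp(-0.5 * (x + 2.0) ** 2, -0.5 * (x - 2.0) ** 2)

rng = np.random.default_rng(0)
x, samples = 0.0, []
for _ in range(10000):
    proposal = x + rng.normal(scale=1.0)     # symmetric proposal, so no correction term
    if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
        x = proposal                         # accept; otherwise keep the current state
    samples.append(x)
print("estimated mean:", np.mean(samples[2000:]))  # burn-in discarded; close to 0
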
Block 3: Gaussian process models
Lecture 9. Gaussian process regression. Exact inference scheme. Connection to RKHS.
Lecture 10. Approximate generalized Gaussian process models. Heteroscedasticity modeling. Efficient Gaussian process regression. Fourier features. Nyström approximation.
Lecture 11. Risk estimation for Gaussian process regression. Parametric and non-parametric approaches.
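
A minimal sketch of the exact Gaussian process regression posterior from Lecture 9, in the spirit of Rasmussen and Williams, Algorithm 2.1; the RBF kernel, its hyperparameters, the noise level, and the toy data are assumptions for illustration.

# Illustrative only: exact GP regression posterior with an RBF kernel.
import numpy as np

def rbf(a, b, length=0.3, var=1.0):
    return var * np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=20)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=x.size)
x_star = np.linspace(0.0, 1.0, 100)

noise = 0.1 ** 2
L = np.linalg.cholesky(rbf(x, x) + noise * np.eye(x.size))   # Cholesky of (K + noise I)
K_s = rbf(x, x_star)
w = np.linalg.solve(L.T, np.linalg.solve(L, y))              # (K + noise I)^{-1} y
mean = K_s.T @ w                                             # posterior mean at test points
v = np.linalg.solve(L, K_s)
var = rbf(x_star, x_star).diagonal() - (v * v).sum(axis=0)   # posterior variance
print("first five posterior means:", mean[:5])
print("first five posterior variances:", var[:5])
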
Block 4: Bayesian neural networks
Lecture 12. Neural networks basics. Bayesian dropout.
Lecture 13. Uncertainty estimation in machine learning: Bayesian and non-Bayesian methods
Lecture 14. Loss surfaces for deep neural networks.
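
A minimal sketch of the test-time ("Bayesian") dropout idea from Lecture 12: keep dropout active at prediction time and average several stochastic forward passes, using their spread as an uncertainty proxy. The tiny untrained network and dropout rate below are assumptions made purely to show the mechanism.

# Illustrative only: Monte-Carlo dropout with a tiny untrained network.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = 0.5 * rng.normal(size=(1, 64)), np.zeros(64)
W2, b2 = 0.5 * rng.normal(size=(64, 1)), np.zeros(1)
p_drop = 0.1

def stochastic_forward(x):
    h = np.tanh(x @ W1 + b1)
    mask = rng.random(h.shape) > p_drop      # dropout mask kept at prediction time
    return ((h * mask / (1.0 - p_drop)) @ W2 + b2).ravel()

x = np.linspace(-2.0, 2.0, 5)[:, None]
passes = np.stack([stochastic_forward(x) for _ in range(200)])  # 200 stochastic passes
print("predictive mean:", passes.mean(axis=0))
print("predictive std: ", passes.std(axis=0))
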
Block 5: Point processes
Lecture 15. Basics: Poisson processes, the non-homogeneous Poisson process. Maximum likelihood estimation.
Lecture 16. Hawkes process. Deep Hawkes processes based on RNNs and Transformers.
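
A minimal sketch for Block 5: simulating a one-dimensional Hawkes process with an exponential kernel via Ogata's thinning. The intensity form is standard, but the parameter values and function names are assumptions for illustration, not course code.

# Illustrative only: Ogata's thinning for a Hawkes process with intensity
# lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)).
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, rng):
    t, events = 0.0, []
    while True:
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)  # upper bound
        t += rng.exponential(1.0 / lam_bar)
        if t > T:
            return np.array(events)
        lam_t = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
        if rng.uniform() <= lam_t / lam_bar:  # accept candidate with prob lambda(t)/lam_bar
            events.append(t)

rng = np.random.default_rng(0)
ts = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, T=50.0, rng=rng)
print(f"{ts.size} events on [0, 50]; baseline rate mu = 0.5")
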


Lecturer Intro

Alexey has deep expertise in machine learning and the processing of sequential data. He publishes at top venues, including KDD, ACM Multimedia, and AISTATS. Industrial applications of his results are in service at companies including Airbus, Porsche, and Saudi Aramco.


Lecturer Email: A.Zaytsev@skoltech.ru

TA: Dr. Miao He, paris.muse7@gmail.com

