
Bayesian machine learning

Time: 17:05 - 18:40, Thu, 10/27/2022 - 1/12/2023

Venue: 1110; Online: 928 682 9093 (PW: BIMSA)

Speaker: Alexey Zaytsev (Visiting Assistant Research Fellow)

Record: Yes

Level: Graduate

Language: English


Prerequisite

Probability theory, Mathematical statistics, Machine learning


Abstract

The probabilistic approach to machine and deep learning leads to principled solutions: it provides explainable decisions and new ways of improving existing approaches. Bayesian machine learning comprises the probabilistic approaches that rely on Bayes' formula. It helps in numerous applications and rests on beautiful mathematical concepts. In this course, I will describe the foundations of Bayesian machine learning and how it works as part of a deep learning framework.
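To illustrate the role of Bayes' formula, here is a minimal sketch (not part of the course materials) of the classic conjugate Beta-binomial update: a Beta prior over a coin's bias combined with binomial observations yields a Beta posterior in closed form.

```python
def beta_binomial_posterior(alpha, beta, heads, tails):
    """Conjugate update: a Beta(alpha, beta) prior on the success
    probability plus a binomial likelihood gives the posterior
    Beta(alpha + heads, beta + tails)."""
    return alpha + heads, beta + tails

# Start from a uniform Beta(1, 1) prior and observe 7 heads, 3 tails.
a_post, b_post = beta_binomial_posterior(1.0, 1.0, 7, 3)

# Posterior mean of the bias: a / (a + b) = 8 / 12.
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, posterior_mean)  # 8.0 4.0 0.666...
```

Conjugate priors of this kind, and the exponential family that produces them, are covered in Lecture 2.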


Reference

The topic of the course is on the edge of current advances in the field. We will provide a list of articles and books for each particular lecture. Good starting points are:
1.    C. Bishop, "Pattern Recognition and Machine Learning", 2006 (Blocks 1 and 2).
2.    C. Rasmussen, "Gaussian Processes for Machine Learning", 2005 (Block 3).

Tools: Notebook with access to colab.google.com


Syllabus

Block 1: Basics of the Bayesian approach
Lecture 1. Reminder of mathematical statistics. Maximum likelihood approach. Bayesian linear regression.
Lecture 2. Exponential family of distributions. Conjugate priors.
Lecture 3. Bayesian logistic regression. Laplace approximation.
Lecture 4. Bernstein–von Mises theorem. Non-informative and reference priors.
Block 2: Approximate inference
Lecture 5. Variational inference. The ELBO (evidence lower bound).
Lecture 6. Normalizing flows. Expectation propagation.
Lecture 7. Sampling problem statement. Importance sampling.
Lecture 8. Monte Carlo sampling. MCMC. Metropolis–Hastings algorithm. Hamiltonian Monte Carlo.
Block 3: Gaussian process models
Lecture 9. Gaussian process regression. Exact inference scheme. Connection to RKHS.
Lecture 10. Approximate generalized Gaussian process models. Heteroscedasticity modeling. Efficient Gaussian process regression. Fourier features. Nyström approximation.
Lecture 11. Risk estimation for Gaussian process regression. Parametric and non-parametric approaches.
Block 4: Bayesian neural networks
Lecture 12. Neural network basics. Bayesian dropout.
Lecture 13. Uncertainty estimation in machine learning: Bayesian and non-Bayesian methods.
Lecture 14. Loss surfaces of deep neural networks.
Block 5: Point processes
Lecture 15. Basics: Poisson processes, non-homogeneous Poisson processes. Maximum likelihood estimation.
Lecture 16. Hawkes processes. Deep Hawkes processes based on RNNs and Transformers.
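As a taste of the material in Block 1, the sketch below (illustrative only, not from the course materials) computes the exact Gaussian posterior over the weights of a Bayesian linear regression: with prior w ~ N(0, α⁻¹I) and noise precision β, the posterior covariance is S = (αI + βXᵀX)⁻¹ and the posterior mean is m = βSXᵀy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + Gaussian noise with standard deviation 0.1.
X = rng.normal(size=(50, 1))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=50)

# Prior precision alpha and noise precision beta (beta = 1 / 0.1^2).
alpha, beta = 1.0, 100.0

# Closed-form Gaussian posterior over the weight vector:
#   S = (alpha * I + beta * X^T X)^{-1},   m = beta * S * X^T y
S = np.linalg.inv(alpha * np.eye(X.shape[1]) + beta * X.T @ X)
m = beta * S @ X.T @ y

print(m)  # posterior mean: close to the true slope 2
print(S)  # posterior covariance: small, since 50 points are informative
```

Unlike the maximum likelihood fit, the posterior covariance S quantifies the remaining uncertainty about the weights, which is the starting point for the uncertainty estimation topics in Block 4.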


Lecturer Intro

Alexey has deep expertise in machine learning and the processing of sequential data. He publishes at top venues, including KDD, ACM Multimedia, and AISTATS. Industrial applications of his results are now in service at companies including Airbus, Porsche, and Saudi Aramco.


Lecturer Email: A.Zaytsev@skoltech.ru

TA: Dr. Miao He, paris.muse7@gmail.com


Date: September 22, 2022