
Bayesian inference I

External Lecturer: Guido Sanguinetti
Course Type: PhD Course
Academic Year: October - December
36 h
CFU (LM):
Each lecture lasts 2 hrs; 12 lectures + 4 labs = 36 hours (7 weeks: 19/10 - 4/12)

  1. The multivariate Gaussian distribution: conditionals, marginals, and conjugate prior (and its problems)
  2. Laplace method and Fisher matrix
  3. Linear/Gaussian models: probabilistic PCA and linear regression. Basis function regression.
  4. Gaussian processes for regression and Bayesian Optimization.
  5. Lab 1: linear regression and Gaussian Processes
  6. Bayesian inference in non-conjugate models: Markov Chain Monte Carlo (MCMC), rejection and importance sampling, Metropolis-Hastings algorithm. Convergence diagnostics and rules of thumb.
  7. Generalised linear models (GLMs) and inference; Gaussian processes for classification.
  8. Lab 2: Bayesian GLMs.
  9. Graphical models and hierarchical Bayesian models. Gibbs sampling.
  10. Mixture models and topic models.
  11. Variable augmentation: probit and logistic regression with auxiliary variables
  12. Lab 3: Gibbs sampling for mixture models.
  13. Variational inference: prelude, the EM algorithm
  14. Mean-field variational inference
  15. Variational inference for general models: black-box variational inference and variational autoencoders, Stein variational inference.
  16. Lab 4: Variational mean field for mixture models.
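As a flavour of the sampling methods introduced in lecture 6, here is a minimal random-walk Metropolis-Hastings sketch (illustrative only, not course material): it draws from a standard normal target using a symmetric Gaussian proposal, so the acceptance ratio reduces to the ratio of target densities.

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal.

    Because the proposal is symmetric, the Hastings correction cancels and
    the acceptance probability is min(1, p(x') / p(x)), computed in log space.
    """
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        x_prop = x + rng.gauss(0.0, step)          # propose a local move
        if math.log(rng.random()) < log_target(x_prop) - log_target(x):
            x = x_prop                             # accept; otherwise keep x
        samples.append(x)
    return samples

# Target: standard normal, log-density up to an additive constant.
log_p = lambda x: -0.5 * x * x
samples = metropolis_hastings(log_p, x0=0.0, n_samples=20000, step=2.0)
burn = samples[5000:]                              # discard burn-in (rule of thumb)
mean = sum(burn) / len(burn)
var = sum((s - mean) ** 2 for s in burn) / len(burn)
```

The step size and burn-in length here are ad hoc choices; convergence diagnostics for making such choices principled are part of the lecture's syllabus.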

Please note that this course belongs to the Data Science Excellence Department programme. MAMA PhD students can take up to 33% of their credits (i.e. 50 hrs) from this programme.
