Information Theory, Spin Glasses and Inference

External Lecturer: 
Jean Barbier
Course Type: 
PhD Course
Academic Year: 
2021-2022
Period: 
Second term
Duration: 
36 h
Description: 

Course description: Information theory, high-dimensional statistics and Bayesian inference form the powerhouse of modern information processing: communications, signal processing, machine learning, etc. This theoretical course will introduce state-of-the-art methods of analysis and algorithms for paradigmatic models of inference in the challenging high-dimensional, or "Big Data", regime. The deep connections between inference and the statistical mechanics of disordered systems will be emphasised and exploited, with particular attention to the notion of phase transition. At the end of the course the students are expected to master some advanced mean-field techniques from physics, mathematically rigorous approaches, as well as message-passing algorithms, which are part of the toolbox required to tackle fundamental questions such as: "When does the data contain enough meaningful information to perform inference?" or "Can we optimally extract this information from the data at low computational cost?"

Syllabus:

1. Bayesian inference, information theory and statistical mechanics: three close cousins

a. Statistical inference, Bayes formula and decision theory

b. Surprise, Shannon entropy and mutual information

c. Statistical mechanics of disordered systems, phase transitions and links with Bayesian inference

d. "Going mean-field", Concentration, replica, cavity and interpolation methods for the Ising model

2. "You’ll never beat me!" Information-theoretic phase transition

a. Gaussian denoising, matrix factorisation, high-dimensional regression and the perceptron

b. Why do ensembles matter? Concentration-of-measure in high-dimensional inference

c. The replica method for matrix factorisation and high-dimensional regression

d. "Let's clean the messup" Rigorous approach 1: the adaptive interpolation method for matrix factorisation

e. Rigorous approach 2: the cavity method for mismatched high-dimensional linear regression
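The simplest instance of the Gaussian denoising problem in item 2a is the scalar channel y = √λ·x + z, with signal x ~ N(0,1) and noise z ~ N(0,1). Here the posterior mean √λ·y/(1+λ) is the Bayes-optimal (MMSE) estimator, and its mean squared error 1/(1+λ) is the information-theoretic limit for this toy channel. A minimal Monte Carlo sketch (parameter values are arbitrary illustrations, not from the course):

```python
import math
import random
import statistics

def posterior_mean(y, lam):
    """Bayes-optimal (MMSE) estimator for the scalar Gaussian channel
    y = sqrt(lam)*x + z, with prior x ~ N(0,1) and noise z ~ N(0,1)."""
    return math.sqrt(lam) * y / (1.0 + lam)

def empirical_mmse(lam, n=200_000, seed=0):
    """Monte Carlo estimate of the mean squared error of the posterior mean."""
    rng = random.Random(seed)
    errs = []
    for _ in range(n):
        x = rng.gauss(0, 1)
        y = math.sqrt(lam) * x + rng.gauss(0, 1)
        errs.append((x - posterior_mean(y, lam)) ** 2)
    return statistics.fmean(errs)

lam = 3.0
print(empirical_mmse(lam))  # concentrates around the MMSE 1/(1+lam) = 0.25
```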

3. "Beat me if you can" Algorithmic phase transition

a. Message-passing: belief-propagation and approximate message-passing

b. "Going big" Asymptotic state evolution analysis, and optimality of approximate messagepassing
