
Information Theory and Inference

External Lecturer: Jean Barbier
Course Type: PhD Course
Period: November - December
Duration: 26 hours

10 x 2-hour lectures + 2 x 3-hour labs = 26 hours (5 weeks: 04/11 - 04/12)

  1. Bayesian inference, information theory and statistical mechanics:
    i. Statistical inference, Bayes formula and decision theory
    ii. Surprise, Shannon entropy and mutual information
    iii. Statistical mechanics of disordered systems 101, and links with Bayesian inference
    iv. Lab 1
  2. Information-theoretic limits
    i. Replica symmetric formula for the mutual information
    ii. A powerful (exact) heuristic: the replica method
    iii. Why do ensembles matter? Concentration of the free energy
    iv. Replica symmetry in inference: overlap concentration
    v. Rigorous approach 1: the (adaptive) interpolation method
    vi. Rigorous approach 2: the cavity method
    vii. Lab 2
  3. Algorithmic limits
    i. Message-passing
    ii. State evolution, and optimality of approximate message-passing
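As a small illustration of the quantities in part 1.ii (Shannon entropy and mutual information), the following is a minimal sketch in Python. The toy joint distributions and function names are hypothetical examples for intuition, not course material:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits, skipping zero-probability entries."""
    p = np.asarray(p, dtype=float).flatten()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_xy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a matrix
    (rows index x, columns index y)."""
    p_x = p_xy.sum(axis=1)   # marginal of X
    p_y = p_xy.sum(axis=0)   # marginal of Y
    return entropy(p_x) + entropy(p_y) - entropy(p_xy)

# Two independent fair bits: the mutual information vanishes.
p_indep = np.array([[0.25, 0.25],
                    [0.25, 0.25]])
print(mutual_information(p_indep))  # -> 0.0

# Two fully correlated fair bits: I(X;Y) = H(X) = 1 bit.
p_corr = np.array([[0.5, 0.0],
                   [0.0, 0.5]])
print(mutual_information(p_corr))   # -> 1.0
```

In Bayesian inference these are the same objects that appear in the replica symmetric formula of part 2: the mutual information between signal and observations quantifies the information-theoretic limit of estimation.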

Please note that this course belongs to the Data Science Excellence Department programme. MAMA PhD students may take up to 33% of their credits (i.e. 50 hours) from this programme.

Next Lectures: 
Wednesday, December 2, 2020 - 14:00 to 16:00
Friday, December 4, 2020 - 14:00 to 16:00
