
Towards a theory of neural networks: a tale of physics, neuroscience and machine learning

Speaker: Sebastian Goldt
Schedule: Monday, January 20, 2020 - 10:00
Location: A-005
Abstract:

The key challenge for any theory of deep learning is to explain how neural networks can generalise well from examples in practice, when classical learning theory would predict that they cannot. In this talk, I will argue that tools and concepts from theoretical physics can help build such a theory. In particular, I will discuss two recent works in which we used methods from statistical physics to analyse the dynamics of neural networks and the impact of structured data sets. But the interaction between machine learning and the sciences goes both ways. I will give two examples of this by discussing how learning raises fundamental questions for the thermodynamics of computation, and how modern machine learning techniques can help analyse the large, noisy data sets that arise in neuroscience. Finally, I will outline some open challenges for both the practice and the theory of neural networks, and sketch how ideas from neuroscience might help address some of them.