Low-rankness in Deep Learning

Speaker: 
Emanuele Zangrando
Institution: 
GSSI
Schedule: 
Friday, April 11, 2025 - 14:00
Abstract: 

Neural networks have revolutionized numerous fields, demonstrating extraordinary success across a wide range of applications. However, their substantial memory footprint and high computational demands can make them impractical for deployment in resource-constrained environments, where hardware and energy limitations pose significant challenges. In recent years, a growing body of empirical evidence has shown that modern neural networks contain a striking degree of redundancy in their parameters. While sparsity typically has to be enforced explicitly in neural network parameters, low-rankness, interestingly, often emerges naturally in deep networks, suggesting it is a more natural structure to impose. Understanding the conditions under which this low-rank implicit bias manifests is of crucial importance, as it enables us to predict a priori when a compressed model can perform as well as a much larger one. Although considerable progress has been made in this direction, a comprehensive theoretical characterization is still missing. In this talk, we will first discuss the implicit bias of deep neural networks toward low-rank structures and its connection with the phenomenon of neural collapse, and then highlight their practical significance in large-scale applications.
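
As a concrete illustration of the redundancy the abstract refers to (not material from the talk itself), the following minimal NumPy sketch shows the standard truncated-SVD compression idea: if a weight matrix has a sharply decaying singular-value spectrum, it can be replaced by two thin factors with far fewer parameters at small approximation error. The matrix W here is a hypothetical stand-in, simulated as low-rank plus noise, since no real trained checkpoint is at hand.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-in for a trained layer's weights:
    # a rank-16 matrix perturbed by small full-rank noise.
    m, n, true_rank = 512, 512, 16
    W = rng.standard_normal((m, true_rank)) @ rng.standard_normal((true_rank, n))
    W += 0.01 * rng.standard_normal((m, n))

    # Inspect the singular-value spectrum: sharp decay signals low-rankness.
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    energy = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(energy, 0.99)) + 1  # rank capturing 99% of the spectral energy
    print(f"rank for 99% spectral energy: {r} of {min(m, n)}")

    # Truncated-SVD compression: replace W by two thin factors A @ B.
    A = U[:, :r] * s[:r]   # shape (m, r)
    B = Vt[:r, :]          # shape (r, n)
    print(f"parameter ratio after factorization: {(A.size + B.size) / W.size:.2%}")
    print(f"relative approximation error: "
          f"{np.linalg.norm(W - A @ B) / np.linalg.norm(W):.2e}")

In a network, the factorized layer computes x @ A @ B instead of x @ W, cutting both storage and multiply-accumulate cost roughly by the factor 2r/min(m, n); the point of the implicit-bias results discussed in the talk is to predict when such a small r suffices without loss of accuracy.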
