Events

Click here for a list of our past events and here for a presentation of the Rome Centre on Mathematics for Modelling and Data Science.

If you want to receive updates on RoMaDS activities, write an email to salvi(at)mat.uniroma2.it to be added to our mailing list.


Upcoming events

25.09.2025 - Workshop: A day on Random Matrix Theory and Deep Learning
09h30-17h00, Department of Mathematics, Aula Gismondi

Here is the link to the webpage of the event.



06.10.2025 - Seminar: Federico Bassetti (Politecnico di Milano), Scaling Limits of Bayesian Neural Networks: Gaussian Processes and Mixtures
14h00-15h00, Department of Mathematics, Aula Dal Passo.
Abstract

In large neural networks, key theoretical insights emerge in the infinite-width limit, where the number of neurons per layer grows while depth stays fixed. In this regime, networks with Gaussian-initialized weights define a mixture of Gaussian processes with random covariance, which converges in the infinite-width limit to a pure Gaussian process with deterministic covariance. However, this Gaussian limit sacrifices descriptive power, as it lacks the ability to learn dependent features and produce output correlations that reflect observed labels. Motivated by these limitations, we explore deep linear networks in the proportional limit, where both depth and width diverge at a fixed ratio. In this setting, the network converges to a nontrivial Gaussian mixture, both at the prior and posterior level. This structure allows the network to capture dependencies in the outputs—an ability lost in the infinite-width limit but retained in finite networks. Our contribution extends previous works by explicitly characterizing, for linear activation functions, the limiting distribution as a nontrivial mixture of Gaussians.
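The infinite-width Gaussian behaviour described above can be checked empirically: as the hidden layer of a Gaussian-initialized network grows, the distribution of its scalar output approaches a Gaussian. The sketch below is illustrative only; the widths, the ReLU activation, and the kurtosis diagnostic are our choices and are not taken from the talk (which concerns linear activations and the proportional depth/width limit).

```python
import numpy as np

def random_network_output(x, width, rng):
    """One draw of f(x) = v . phi(W x) / sqrt(width), with Gaussian W and v."""
    W = rng.standard_normal((width, x.shape[0]))
    v = rng.standard_normal(width)
    h = np.maximum(W @ x, 0.0)        # ReLU features (illustrative choice)
    return v @ h / np.sqrt(width)     # 1/sqrt(width) scaling gives a finite limit

def excess_kurtosis(samples):
    """Excess kurtosis: 0 for a Gaussian, positive for heavier tails."""
    z = (samples - samples.mean()) / samples.std()
    return np.mean(z ** 4) - 3.0

rng = np.random.default_rng(0)
x = np.array([1.0, -0.5])

# Draw many independent networks at two widths: the output distribution of the
# narrow network is visibly non-Gaussian, while the wide one is close to Gaussian.
narrow = np.array([random_network_output(x, 5, rng) for _ in range(20000)])
wide = np.array([random_network_output(x, 500, rng) for _ in range(20000)])
print(excess_kurtosis(narrow), excess_kurtosis(wide))
```

At width 5 the excess kurtosis is of order 1, while at width 500 it is close to 0, consistent with convergence to a Gaussian process at fixed depth.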
The talk is based on:

F. Bassetti, L. Ladelli, P. Rotondo. Proportional infinite-width infinite-depth limit for deep linear neural networks (2024+). https://arxiv.org/abs/2411.15267

F. Bassetti, M. Gherardi, A. Ingrosso, M. Pastore, P. Rotondo. Feature learning in finite-width Bayesian deep linear networks with multiple outputs and convolutional layers. Journal of Machine Learning Research 26 (2025), 1-35.