Date
Fri, 15 Nov 2024
15:00
Location
L5
Speaker
Inés García-Redondo
Organisation
Imperial College
Bounding and predicting the generalization gap of overparameterized neural networks remains a central open problem in theoretical machine learning. A recent and growing body of literature proposes fractal geometry as a framework for modelling the optimization trajectories of neural networks, motivating generalization bounds and measures based on the fractal dimension of the trajectory. Notably, the persistent homology dimension has been proposed to correlate with the generalization gap. In this talk, I will present an empirical evaluation of these persistent homology-based generalization measures, with an in-depth statistical analysis. This study reveals that variation in hyperparameters introduces confounding effects into the observed correlation between generalization and topological measures. We also observe that fractal dimension fails to predict the generalization of models trained from poor initializations, and we reveal an intriguing manifestation of model-wise double descent in these topological generalization measures. This is joint work with Charlie B. Tan, Qiquan Wang, Michael M. Bronstein and Anthea Monod.
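For readers unfamiliar with the measure discussed in the abstract, below is a minimal sketch (not the speaker's code) of one standard way to estimate a persistent homology dimension from a recorded optimization trajectory. It uses the fact that the 0-dimensional persistence intervals of a Euclidean point cloud coincide with the edge lengths of its minimum spanning tree, together with the scaling E^0_alpha(n) ~ n^((d - alpha)/d), so the dimension d can be read off the slope of log E against log n. All function names and the toy random-walk trajectory are illustrative assumptions.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform


def mst_edge_sum(points, alpha=1.0):
    """Alpha-weighted sum of minimum spanning tree edge lengths.

    For a finite point cloud, the 0-dimensional persistence intervals
    of the Vietoris-Rips filtration equal the Euclidean MST edge
    lengths, so this computes the 0-dim persistent homology sum
    E^0_alpha of the cloud.
    """
    dists = squareform(pdist(points))
    mst = minimum_spanning_tree(dists)
    return (mst.data ** alpha).sum()


def ph_dimension(trajectory, alpha=1.0, n_samples=10, seed=0):
    """Estimate the persistent homology dimension of a trajectory.

    Fits log E^0_alpha(n) against log n over random subsamples of
    increasing size n; if the fitted slope is m, the PH dimension is
    estimated as alpha / (1 - m).
    """
    rng = np.random.default_rng(seed)
    n_total = len(trajectory)
    sizes = np.linspace(n_total // 10, n_total, n_samples, dtype=int)
    log_n, log_e = [], []
    for n in sizes:
        idx = rng.choice(n_total, size=n, replace=False)
        log_n.append(np.log(n))
        log_e.append(np.log(mst_edge_sum(trajectory[idx], alpha)))
    slope, _ = np.polyfit(log_n, log_e, 1)
    return alpha / (1.0 - slope)


if __name__ == "__main__":
    # Toy usage: a random walk in a 10-dimensional weight space as a
    # stand-in for an optimization trajectory recorded during training.
    steps = np.random.default_rng(1).normal(size=(2000, 10))
    traj = np.cumsum(steps, axis=0)
    print(f"estimated PH dimension: {ph_dimension(traj):.2f}")
```

In practice the trajectory would be the sequence of flattened network weight vectors saved at each optimization step; the random walk here is only a placeholder for such data.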
 