Authors
Baskerville, N
Keating, J
Mezzadri, F
Najnudel, J
Journal title
Journal of Statistical Mechanics: Theory and Experiment
DOI
10.1088/1742-5468/abfa1e
Volume
2021
Abstract
The loss surfaces of deep neural networks have been the subject of several studies, theoretical and experimental, over the last few years. One strand of work considers the complexity, in the sense of local optima, of high-dimensional random functions with the aim of informing how local optimisation methods may perform in such complicated settings. Prior work of Choromanska et al. (2015) established a direct link between the training loss surfaces of deep multi-layer perceptron networks and spherical multi-spin glass models under some very strong assumptions on the network and its data. In this work, we test the validity of this approach by removing the undesirable restriction to ReLU activation functions. In doing so, we chart a new path through the spin glass complexity calculations using supersymmetric methods in random matrix theory which may prove useful in other contexts. Our results shed new light on both the strengths and the weaknesses of spin glass models in this context.
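For context, the spherical multi-spin glass model referenced above has a standard form (quoted here from the wider literature, not from the paper's own text): the p-spin Hamiltonian is

\[ H_{N,p}(w) \;=\; \frac{1}{N^{(p-1)/2}} \sum_{i_1,\dots,i_p=1}^{N} X_{i_1 \cdots i_p}\, w_{i_1} \cdots w_{i_p}, \qquad \lVert w \rVert_2^2 = N, \]

where the couplings \(X_{i_1 \cdots i_p}\) are i.i.d. standard Gaussians and \(w\) is constrained to the sphere of radius \(\sqrt{N}\). Under the strong assumptions mentioned in the abstract, Choromanska et al. (2015) identify the training loss of an \(H\)-layer ReLU network with such a Hamiltonian with \(p = H\); the present paper revisits that identification for general activation functions.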

Symplectic ID
1171546
Publication type
Journal Article
Publication date
01 Jun 2021