Predicting Feynman periods in $\phi^4$-theory
Balduf, P
Shaban, K
(24 Mar 2024)
http://arxiv.org/abs/2403.16217v1
Combinatorial proof of a non-renormalization theorem
Balduf, P
Gaiotto, D
(06 Aug 2024)
Randomness and early termination: what makes a game exciting?
Guo, G
Howison, S
Possamai, D
Reisinger, C
Probability Theory and Related Fields
(02 Dec 2024)
Dyson–Schwinger equations in minimal subtraction
Balduf, P
Annales de l’Institut Henri Poincaré D
(12 Apr 2023)
Statistics of Feynman amplitudes in $\phi^4$-theory
Balduf, P
Journal of High Energy Physics
volume 2023
issue 11
160
(22 Nov 2023)
Perturbation Theory of Transformed Quantum Fields
Balduf, P
Mathematical Physics, Analysis and Geometry
volume 23
issue 3
33
(10 Sep 2020)
Fri, 29 Nov 2024
12:00 - 13:00
C5
On Lusztig’s local Langlands correspondence and functoriality
Emile Okada
(National University of Singapore)
Abstract
In 1995, Lusztig gave a local Langlands correspondence for unramified representations of inner forms of split adjoint groups, combining many deep results from type theory and geometric representation theory. In this talk, I will present a gentle reformulation of his construction that reveals some interesting new structures, with a view toward proving functoriality results in this framework.
This seminar is organised jointly with the Junior Algebra and Representation Theory Seminar - all are very welcome!
Numerical simulations of laser-driven experiments of ion acceleration in stochastic magnetic fields
Moczulski, K
Campbell, T
Arrowsmith, C
Bott, A
Sarkar, S
Schekochihin, A
Gregori, G
Physics of Plasmas
volume 31
issue 12
(04 Dec 2024)
Fri, 15 Nov 2024
15:00
L5
On the Limitations of Fractal Dimension as a Measure of Generalization
Inés García-Redondo
(Imperial College)
Abstract
Bounding and predicting the generalization gap of overparameterized neural networks remains a central open problem in theoretical machine learning. A recent and growing body of literature models the optimization trajectories of neural networks as fractals, motivating generalization bounds and measures based on the fractal dimension of the trajectory. Notably, the persistent homology dimension has been proposed to correlate with the generalization gap. In this talk, I will present an empirical evaluation of these persistent homology-based generalization measures, together with an in-depth statistical analysis. The study reveals confounding effects, due to variation in hyperparameters, in the observed correlation between generalization and topological measures. We also observe that fractal dimension fails to predict the generalization of models trained from poor initializations, and we find an intriguing manifestation of model-wise double descent in these topological generalization measures. This is joint work with Charlie B. Tan, Qiquan Wang, Michael M. Bronstein and Anthea Monod.
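For readers unfamiliar with the measure discussed in the talk, the following is a minimal sketch (not the speaker's code) of how a 0-dimensional persistent homology dimension is commonly estimated from a cloud of optimization iterates: the alpha-weighted total edge length of the Euclidean minimum spanning tree of an n-point subsample scales like n^(1 - alpha/d), where d is the PH dimension, so d can be recovered from a log-log fit. All function names and parameter values here are illustrative assumptions.

```python
# Minimal sketch of a PH^0 dimension estimator: for subsamples W_n of size n,
# the MST edge-power sum E_alpha(W_n) scales as n^(1 - alpha/d), where d is
# the persistent homology dimension. Names and defaults are illustrative.
import numpy as np
from scipy.spatial import distance_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

def mst_edge_power_sum(points: np.ndarray, alpha: float = 1.0) -> float:
    """Sum of |e|^alpha over the edges of the Euclidean MST of `points`."""
    dists = distance_matrix(points, points)
    mst = minimum_spanning_tree(dists)  # sparse matrix holding the MST edges
    return float(np.sum(mst.data ** alpha))

def ph_dimension(trajectory: np.ndarray, alpha: float = 1.0,
                 sample_sizes=(200, 400, 800, 1600), seed: int = 0) -> float:
    """Estimate the PH^0 dimension of a set of iterates (rows of `trajectory`)
    from the scaling of MST edge-power sums over random subsamples."""
    rng = np.random.default_rng(seed)
    log_n, log_e = [], []
    for n in sample_sizes:
        idx = rng.choice(len(trajectory), size=n, replace=False)
        log_n.append(np.log(n))
        log_e.append(np.log(mst_edge_power_sum(trajectory[idx], alpha)))
    slope, _ = np.polyfit(log_n, log_e, deg=1)  # slope = 1 - alpha/d
    return alpha / (1.0 - slope)

# Sanity check: points drawn uniformly from the unit square should yield an
# estimate close to 2.
print(ph_dimension(np.random.default_rng(1).random((2000, 2))))
```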