Wed, 06 Nov 2024
11:00
L4

Probabilistic Schwarzian Field Theory

Ilya Losev
(Cambridge University)
Abstract

Schwarzian Theory is a quantum field theory which has attracted a lot of attention in the physics literature in the context of two-dimensional quantum gravity, black holes and the AdS/CFT correspondence. It is predicted to be universal and to arise in many systems with emergent conformal symmetry, most notably in the Sachdev--Ye--Kitaev model and in Jackiw--Teitelboim gravity.

In this talk we will discuss our recent progress on developing rigorous mathematical foundations for the Schwarzian Field Theory, including a rigorous construction of the corresponding measure, the calculation of both the partition function and a natural class of correlation functions, and a large deviation principle.
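For orientation, here is the standard form of the action from the physics literature (a sketch; the notation is assumed here rather than taken from the abstract). Writing \(\operatorname{Sch}(f,\tau) = \frac{f'''}{f'} - \frac{3}{2}\big(\frac{f''}{f'}\big)^2\) for the Schwarzian derivative, the finite-temperature action on increasing diffeomorphisms \(\phi\) of the circle is

\[
S[\phi] \;=\; -C \int_0^{2\pi} \operatorname{Sch}\Big(\tan\tfrac{\phi(\tau)}{2},\,\tau\Big)\, d\tau ,
\]

and the measure referred to above is, formally, \(e^{-S[\phi]}\,\mathcal{D}\phi\) modulo the \(\mathrm{PSL}(2,\mathbb{R})\) symmetry; making this formal expression rigorous is the construction problem the talk addresses.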

Mon, 04 Nov 2024
15:30
L3

Statistical Inference for weakly interacting diffusions and their mean field limit

Prof Greg Pavliotis
(Imperial College London)
Abstract

We consider the problem of parametric and nonparametric statistical inference for systems of weakly interacting diffusions and their mean field limit. We present several parametric inference methodologies, based on stochastic gradient descent in continuous time, spectral methods and the method of moments. We also show how one can perform fully nonparametric Bayesian inference for the mean field McKean-Vlasov PDE. Finally, we elucidate the effect that non-uniqueness of stationary states of the mean field dynamics has on the inference problem.
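To fix notation (an illustration assumed here, not spelled out in the abstract), a typical system of \(N\) weakly interacting diffusions is

\[
dX_t^i \;=\; -\nabla V(X_t^i)\,dt \;-\; \frac{1}{N}\sum_{j=1}^N \nabla W\big(X_t^i - X_t^j\big)\,dt \;+\; \sqrt{2\beta^{-1}}\,dB_t^i, \qquad i = 1,\dots,N,
\]

whose empirical measure converges, as \(N \to \infty\), to the solution of the mean field McKean-Vlasov PDE

\[
\partial_t \rho \;=\; \nabla\cdot\Big(\rho\,\nabla V + \rho\,(\nabla W * \rho) + \beta^{-1}\nabla \rho\Big).
\]

The inference problem is then to recover, for example, the confining potential \(V\), the interaction \(W\), or the inverse temperature \(\beta\) from observed trajectories; non-uniqueness of stationary states of this PDE is what complicates the problem.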

Mon, 02 Dec 2024
14:00 - 15:00
Lecture Room 3

Enhancing Accuracy in Deep Learning using Marchenko-Pastur Distribution

Leonid Berlyand
(Penn State University)
Abstract

We begin with a short overview of Random Matrix Theory (RMT), focusing on the Marchenko-Pastur (MP) spectral approach. 
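For reference, the MP law describes the limiting spectrum of pure-noise covariance matrices: if \(W\) is an \(M \times N\) matrix with i.i.d. entries of variance \(\sigma^2\), and \(c = M/N \le 1\) is held fixed as \(M, N \to \infty\), the eigenvalues of \(\tfrac{1}{N} W W^{\top}\) have limiting density

\[
\rho_{\mathrm{MP}}(\lambda) \;=\; \frac{1}{2\pi \sigma^2 c\, \lambda}\,\sqrt{(\lambda_+ - \lambda)(\lambda - \lambda_-)}\;\mathbf{1}_{[\lambda_-,\,\lambda_+]}(\lambda),
\qquad
\lambda_{\pm} \;=\; \sigma^2\big(1 \pm \sqrt{c}\,\big)^2 .
\]

Eigenvalues of a trained weight layer that escape the bulk \([\lambda_-, \lambda_+]\) are the candidates for signal; those inside the bulk are statistically indistinguishable from noise. This dichotomy is what the pruning discussed next exploits.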

Next, we present recent analytical and numerical results on accelerating the training of Deep Neural Networks (DNNs) via MP-based pruning [1]. Furthermore, we show that combining this pruning with L2 regularization allows one to drastically decrease randomness in the weight layers and hence simplify the loss landscape. Moreover, we show that the DNN's weights become deterministic at any local minimum of the loss function.
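A minimal sketch of MP-based pruning in this spirit (the thresholding rule, the planted-spike demo, and the function name mp_prune are illustrative assumptions, not the exact procedure of [1]):

```python
import numpy as np

def mp_prune(W, sigma2):
    """Zero out singular values of W that lie inside the Marchenko-Pastur
    noise bulk for entrywise noise variance sigma2, keeping only outliers.
    Illustrative sketch; in practice sigma2 must itself be estimated,
    e.g. by fitting the MP bulk to the observed spectrum."""
    m, n = W.shape
    big = max(m, n)
    c = min(m, n) / big                            # aspect ratio, in (0, 1]
    lam_plus = sigma2 * (1.0 + np.sqrt(c)) ** 2    # upper edge of the MP bulk
    s_cut = np.sqrt(lam_plus * big)                # edge in singular-value units
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    s = np.where(s > s_cut, s, 0.0)                # discard the noise bulk
    return U @ np.diag(s) @ Vt

# Demo: a rank-one "signal" planted in i.i.d. Gaussian noise.
rng = np.random.default_rng(0)
noise = rng.normal(scale=0.1, size=(256, 512))     # entrywise variance 0.01
signal = 0.05 * np.outer(np.ones(256), np.ones(512))
pruned = mp_prune(noise + signal, sigma2=0.01)
print(np.sum(np.linalg.svd(pruned, compute_uv=False) > 0))  # expect ~1 mode kept
```

In a training loop such spectral surgery would be applied layer by layer; the point of the abstract is that, combined with L2 regularization, it removes most of the randomness from the weight layers.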

Finally, we discuss our most recent results (in progress) on the generalization of the MP law to the input-output Jacobian matrix of the DNN. Here our focus is on the existence of fixed points. Numerical examples are given for several types of DNNs: fully connected networks, CNNs and ViTs. This work is joint with PSU PhD students M. Kiyashko, Y. Shmalo and L. Zelong, and with E. Afanasiev and V. Slavin (Kharkiv, Ukraine).


[1] Berlyand, Leonid, et al. "Enhancing accuracy in deep learning using random matrix theory." Journal of Machine Learning (2024).
