The impact of confinement on the deformation of an elastic particle under axisymmetric tube flow
Finney, S., Hennessy, M., Muench, A., Waters, S. IMA Journal of Applied Mathematics, volume 89, issue 3, 498-532 (18 Sep 2024)
Philip lecturing
Philip Maini has been awarded the Sylvester Medal by the Royal Society for his contributions to mathematical biology, especially the interdisciplinary modelling of biomedical phenomena and systems. He says of the award: "Receiving this prize is a truly humbling experience when I look at past winners. It is recognition of the important role that mathematical biology is now playing both in mathematics and in the life sciences."
Mon, 21 Oct 2024
14:15
L4

Machine learning detects terminal singularities

Sara Veneziale
(Imperial College London)
Abstract

In this talk, I will describe recent work in the application of machine learning to explore questions in algebraic geometry, specifically in the context of the study of Q-Fano varieties. These are Q-factorial terminal Fano varieties, and they are the key players in the Minimal Model Program. In this work, we ask, and answer, whether machine learning can determine whether a toric Fano variety has terminal singularities. We build a high-accuracy neural network that detects terminal singularities, which has two consequences. Firstly, it inspires the formulation and proof of a new global, combinatorial criterion to determine whether a toric variety of Picard rank two has terminal singularities. Secondly, the machine learning model is used directly to give the first sketch of the landscape of Q-Fano varieties in dimension eight. This is joint work with Tom Coates and Al Kasprzyk.
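
As a purely illustrative aside (not the speaker's code or data): the model described above is, in spirit, a small feed-forward binary classifier trained on a vector encoding of the toric data. In the sketch below, the input dimension, the architecture and the randomly generated features and labels are all placeholder assumptions.

import torch
import torch.nn as nn

# Hypothetical encoding: each toric Fano variety is represented by a fixed-length
# feature vector (here 16 placeholder features); the label is 1 if the variety
# has terminal singularities and 0 otherwise.
class TerminalityClassifier(nn.Module):
    def __init__(self, in_dim=16, hidden=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # single logit for the binary decision
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

model = TerminalityClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Placeholder data standing in for the real labelled dataset of toric varieties.
X, y = torch.randn(1024, 16), torch.randint(0, 2, (1024,)).float()
for epoch in range(20):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()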

Wed, 06 Nov 2024
11:00
L4

Probabilistic Schwarzian Field Theory

Ilya Losev
(Cambridge University)
Abstract

Schwarzian Theory is a quantum field theory which has attracted a lot of attention in the physics literature in the context of two-dimensional quantum gravity, black holes and the AdS/CFT correspondence. It is predicted to be universal and to arise in many systems with emergent conformal symmetry, most notably in the Sachdev--Ye--Kitaev random matrix model and Jackiw--Teitelboim gravity.

In this talk we will discuss our recent progress on developing rigorous mathematical foundations of the Schwarzian Field Theory, including rigorous construction of the corresponding measure, calculation of both the partition function and a natural class of correlation functions, and a large deviation principle.
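
For orientation only (standard background, not a claim from the abstract): the theory is built around the Schwarzian derivative, and a common formal presentation of the action is

\[
\operatorname{Sch}(f)(t) = \frac{f'''(t)}{f'(t)} - \frac{3}{2}\left(\frac{f''(t)}{f'(t)}\right)^{2},
\qquad
S[f] = -C \int_{0}^{T} \operatorname{Sch}\!\left(\tan\tfrac{f(t)}{2}\right)(t)\,\mathrm{d}t,
\]

where the path integral is taken over reparametrisations f modulo an SL(2,R) symmetry; giving rigorous meaning to the corresponding measure is part of the progress described above.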

Mon, 04 Nov 2024
15:30
L3

Statistical Inference for weakly interacting diffusions and their mean field limit

Prof Greg Pavliotis
(Imperial College London)
Abstract

We consider the problem of parametric and nonparametric statistical inference for systems of weakly interacting diffusions and for their mean field limit. We present several parametric inference methodologies, based on stochastic gradient descent in continuous time, spectral methods and the method of moments. We also show how one can perform fully nonparametric Bayesian inference for the mean field McKean-Vlasov PDE. We elucidate the effect of non-uniqueness of stationary states of the mean field dynamics on the inference problem.
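
A minimal illustrative sketch (not the speaker's methodology), assuming the simplest linear mean-field model dX_i = -theta (X_i - Xbar) dt + sigma dW_i: simulate the particle system with Euler-Maruyama and recover the interaction parameter theta from the continuous-time likelihood, which for this drift reduces to a ratio of discretised stochastic integrals.

import numpy as np

rng = np.random.default_rng(0)
N, T, dt = 200, 50.0, 1e-3           # particles, time horizon, Euler-Maruyama step
theta_true, sigma = 1.5, 0.7
X = rng.standard_normal(N)           # initial condition

num, den = 0.0, 0.0                  # accumulators for the MLE of theta
for _ in range(int(T / dt)):
    centred = X - X.mean()
    dW = rng.standard_normal(N) * np.sqrt(dt)
    dX = -theta_true * centred * dt + sigma * dW
    num += np.sum(-centred * dX)     # sum over i of the integral of -(X_i - Xbar) dX_i
    den += np.sum(centred**2) * dt   # sum over i of the integral of (X_i - Xbar)^2 dt
    X = X + dX

theta_hat = num / den                # drift estimator; close to theta_true for large N*T
print(f"true theta = {theta_true:.3f}, estimated theta = {theta_hat:.3f}")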

Make the Most of Your Society Journal.
Simpson, M., Laubenbacher, R., Baker, R. Bulletin of Mathematical Biology, volume 86, issue 10, 120-120 (20 Aug 2024)
Mon, 02 Dec 2024
14:00 - 15:00
Lecture Room 3

Enhancing Accuracy in Deep Learning using Marchenko-Pastur Distribution

Leonid Berlyand
(Penn State University)
Abstract

We begin with a short overview of Random Matrix Theory (RMT), focusing on the Marchenko-Pastur (MP) spectral approach. 

Next, we present recent analytical and numerical results on accelerating the training of Deep Neural Networks (DNNs) via MP-based pruning [1]. Furthermore, we show that combining this pruning with L2 regularization allows one to drastically decrease the randomness in the weight layers and, hence, simplify the loss landscape. Moreover, we show that the DNN's weights become deterministic at any local minimum of the loss function.

Finally, we discuss our most recent results (in progress) on the generalization of the MP law to the input-output Jacobian matrix of the DNN. Here, our focus is on the existence of fixed points. The numerical examples cover several types of DNNs: fully connected networks, CNNs and ViTs. This is joint work with PSU PhD students M. Kiyashko, Y. Shmalo, L. Zelong and with E. Afanasiev and V. Slavin (Kharkiv, Ukraine).
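
A rough illustrative sketch of MP-based pruning in the spirit of [1] (not the authors' implementation): treat the singular values of a weight matrix that fall inside the Marchenko-Pastur bulk as noise and rebuild the layer from the singular values above the bulk edge. The noise-variance estimate and the toy matrix below are placeholder assumptions.

import numpy as np

def mp_prune(W, sigma2=None):
    n, p = W.shape
    c = min(n, p) / max(n, p)                     # aspect ratio of the layer
    if sigma2 is None:
        sigma2 = np.var(W)                        # crude estimate of the entry variance
    lam_plus = sigma2 * (1 + np.sqrt(c)) ** 2     # MP upper edge for eigenvalues of W^T W / max(n, p)
    s_edge = np.sqrt(lam_plus * max(n, p))        # corresponding singular-value threshold
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    s_pruned = np.where(s > s_edge, s, 0.0)       # zero out singular values inside the bulk
    return (U * s_pruned) @ Vt, int(np.sum(s_pruned > 0))

W = 0.05 * np.random.randn(512, 256)              # toy "weight layer": mostly noise
W[:, :5] += 1.0                                   # plus a low-rank signal component
W_pruned, kept = mp_prune(W)
print(f"kept {kept} of {min(W.shape)} singular values")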

[1] Berlyand, Leonid, et al. "Enhancing accuracy in deep learning using random matrix theory." Journal of Machine Learning (2024).
