Mon, 19 Jan 2026
15:30
L5

Complete classification of the Dehn functions of Bestvina–Brady groups

Jerónimo García-Mejía
(University of Warwick)
Abstract

Introduced by Bestvina and Brady in 1997, Bestvina–Brady groups form an important class of examples in geometric group theory and topology, known for exhibiting unusual finiteness properties. In this talk, I will focus on the Dehn functions of finitely presented Bestvina–Brady groups. Very roughly speaking, the Dehn function of a group measures how difficult it is to fill loops by discs in spaces associated to the group, and captures geometric information that is invariant under coarse equivalence. After reviewing known results, I will present a classification of the Dehn functions of Bestvina–Brady groups. This talk is based on joint work with Yu-Chan Chang and Matteo Migliorini.
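For reference, the rough description above can be made precise. The following is the standard definition of the Dehn function (the notation is mine, not taken from the talk):

```latex
% Dehn function of a finite presentation P = <A | R> of a group G
\[
  \delta_P(n) \;=\; \max\bigl\{\operatorname{Area}(w) \;:\; w \in F(A),\; w =_G 1,\; |w| \le n \bigr\},
\]
% where Area(w) is the least N for which w can be written in the free
% group F(A) as a product of N conjugates of relators r^{\pm 1}, r \in R.
% Up to a standard equivalence relation, \delta_P is independent of the
% chosen finite presentation, so it is an invariant of the group G.
```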

Tue, 10 Mar 2026
12:30
C4

Quantifying Spatial Relationships in Labelled Data with Topology

Abhinav Natarajan
(OCIAM Oxford)
Abstract

Topological data analysis (TDA) deals with quantifying the "shape of data" using tools from algebraic topology and computational geometry. In many contexts, data comes equipped with a labelling (for example, cell type annotations in spatial biology), and one is interested in quantifying not just the global structure of the data but the spatial relationships between labelled subsets of the data. I will give a brief introduction to TDA and then talk about chromatic Delaunay filtrations, a recently developed family of computational methods in TDA that can address the problem of quantifying spatial relationships in labelled point cloud datasets.
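As a toy illustration of quantifying spatial relationships in labelled point clouds (this is not the chromatic Delaunay filtration itself, just a crude sketch in the same spirit; the data and threshold are invented for the example): build the Delaunay triangulation of a two-label 2D point cloud and count how many Delaunay edges join points with different labels, a rough proxy for how interleaved the two labelled subsets are.

```python
import numpy as np
from scipy.spatial import Delaunay

# Two labelled clusters of 50 points each in the plane.
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 1, (50, 2)),    # label 0: cluster at origin
                 rng.normal(3, 1, (50, 2))])   # label 1: shifted cluster
labels = np.repeat([0, 1], 50)

# Delaunay triangulation of the full (unlabelled) point cloud.
tri = Delaunay(pts)
edges = set()
for simplex in tri.simplices:                  # each triangle -> 3 edges
    for i in range(3):
        a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
        edges.add((a, b))

# Edges whose endpoints carry different labels measure spatial mixing.
mixed = sum(labels[a] != labels[b] for a, b in edges)
print(f"{mixed} of {len(edges)} Delaunay edges cross the two labels")
```

Chromatic Delaunay filtrations refine this idea considerably, tracking label interactions across all scales via persistent homology rather than at a single triangulation.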

To start Hilary term, join us in N4.01 on Friday 23rd at 12:30 pm for free pizza and a fun quiz competition. This is the perfect Mathematrix event to come to if you’ve been wanting to swing by for a while and haven’t had the opportunity. 


Tue, 10 Feb 2026
12:30
C4

Models for subglacial floods during surface lake drainage events

Harry Stuart
(OCIAM Oxford)
Abstract

As temperatures rise, so does the number of meltwater lakes sitting on the surface of the Greenland Ice Sheet. Such lakes can drain through cracks in the ice to the bedrock. Observations have found that these lakes can drain at up to three times the flow rate of Niagara Falls. Current models of subglacial drainage systems are unable to cope with such a large and sudden volume of water. This motivates the idea of a 'subglacial blister' which propagates and slowly dissipates underneath the ice sheet. We present a basic hydrofracture model for understanding this process, before developing a number of extensions to examine the effects of turbulence, topography, leak-off and finite ice thickness.

AI-assisted triage of UK patients in mental health care services: a qualitative focus group study of patients' attitudes
Smith, K., Hamer-Hunt, J., Kormilitzin, A., Page, H., Joyce, D., Cipriani, A. BMC Psychiatry, volume 26, issue 1 (13 Jan 2026)
Tue, 03 Feb 2026
15:30

Foundations for derived analytic and differential geometry

Kobi Kremnitzer
(Mathematical Institute, University of Oxford)
Abstract

In this talk I will describe how bornological spaces give a foundation for derived geometries. This works over any Banach ring, allowing one to define analytic and differential geometry over the integers. I will discuss applications of this approach, such as the representability of certain moduli spaces and Galois actions on the cohomology of differentiable manifolds admitting a $\mathbb{Q}$-form.

Causal transport on path space
Cont, R., Lim, F. Annals of Probability
Thu, 26 Feb 2026

12:00 - 12:30
Lecture Room 4, Mathematical Institute

IterativeCUR: One small sketch for big matrix approximations

Nathaniel Pritchard
(Mathematical Institute, University of Oxford)
Abstract

The computation of accurate low-rank matrix approximations is central to improving the scalability of various techniques in machine learning, uncertainty quantification, and control. Traditionally, low-rank approximations are constructed using SVD-based approaches such as truncated SVD or RandomizedSVD. Although these SVD approaches---especially RandomizedSVD---have proven to be very computationally efficient, other low-rank approximation methods can offer even greater performance. One such approach is the CUR decomposition, which forms a low-rank approximation using direct row and column subsets of a matrix. Because CUR uses direct matrix subsets, it is also often better able to preserve native matrix structures like sparsity or non-negativity than SVD-based approaches and can facilitate data interpretation in many contexts. This paper introduces IterativeCUR, which draws on previous work in randomized numerical linear algebra to build a new algorithm that is highly competitive compared to prior work: (1) It is adaptive in the sense that it takes as an input parameter the desired tolerance, rather than an a priori guess of the numerical rank. (2) It typically runs significantly faster than both existing CUR algorithms and techniques such as RandomizedSVD, in particular when these methods are run in an adaptive rank mode. Its asymptotic complexity is  $\mathcal{O}(mn + (m+n)r^2 + r^3)$ for an $m\times n$ matrix of numerical rank $r$. (3) It relies on a single small sketch from the matrix that is successively downdated as the algorithm proceeds.
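The CUR idea described above can be sketched in a few lines of NumPy. This is a plain illustration of a CUR decomposition, not the IterativeCUR algorithm from the talk: it picks rows and columns by a naive norm criterion (real algorithms use pivoted decompositions or sampling) and solves for the small core matrix.

```python
import numpy as np

def cur_decomposition(A, r):
    """Basic CUR sketch: pick the r largest-norm columns and rows of A,
    then solve for the small core matrix U so that C @ U @ R
    approximates A. Naive selection, for illustration only."""
    col_idx = np.argsort(np.linalg.norm(A, axis=0))[-r:]   # r columns of A
    row_idx = np.argsort(np.linalg.norm(A, axis=1))[-r:]   # r rows of A
    C, R = A[:, col_idx], A[row_idx, :]
    # U = C^+ A R^+ minimises ||A - C U R||_F for the chosen C and R.
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)
    return C, U, R

# Demo on a random 200 x 100 matrix of exact rank 5: the approximation
# built from direct row/column subsets recovers A essentially exactly.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 100))
C, U, R = cur_decomposition(A, 5)
err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
print(f"relative error: {err:.2e}")
```

Because C and R are literal submatrices of A, they inherit properties such as sparsity or non-negativity, which is the interpretability advantage the abstract mentions over SVD-based factors.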
