A pale imitation of its former glories, MTV will no doubt claim a large chapter in music history. At its peak in the 90s and early 00s, you got the impression that writing a song was merely an excuse for making a video.

Some songs, such as this one, are perhaps better known for the video than for the song itself, which would be a shame, as this is one of Blur's best: Graham Coxon's lyrics chart his recovery from alcoholism and how, after giving up drinking, he would relax by watching television, drinking coffee and writing songs.

Wed, 27 May 2026

11:00 - 13:00
Lecture Room 4, Mathematical Institute

Extreme Diffusion (CDT Workshop)

Ivan Corwin
Abstract

Two hundred years ago, Robert Brown observed the statistics of the motion of grains of pollen in water. It took almost one hundred years for Einstein and others to develop an effective theory describing this motion as that of a random walker. In this talk, I will challenge a key implication of this well-established theory. When studying systems with very large numbers of particles diffusing together, I will argue that the Einstein random walk theory breaks down when it comes to predicting the statistical behavior of extreme particles—those that move the fastest and furthest in the system. In its place, I will describe a new theory of extreme diffusion which captures the effect of the hidden environment in which particles diffuse together and allows us to interrogate that environment by studying extreme particles. I will highlight one piece of mathematics that led us to develop this theory—a non-commutative binomial theorem—and hint at other connections to integrable probability, quantum integrable systems and stochastic PDEs.
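The contrast the abstract draws — independent Einstein walkers versus walkers sharing a hidden random environment — is easy to experiment with numerically. The toy simulation below is my own illustration, not the speaker's model: both pictures use nearest-neighbour walks on the integers, and in the shared-environment case every walker at site x at time t steps right with the same random probability p(t, x), drawn here (an arbitrary modelling choice) as Uniform(0, 1).

```python
import numpy as np

def max_final_position(n_walkers, n_steps, shared_env, rng):
    """Maximum final position of n_walkers nearest-neighbour walks on Z.

    shared_env=False: each walker independently steps right with
    probability 1/2 (the Einstein picture).
    shared_env=True: all walkers at site x at time t step right with the
    same random probability p(t, x) ~ Uniform(0, 1) -- a common random
    environment, redrawn at every time step.
    """
    pos = np.zeros(n_walkers, dtype=np.int64)
    for _ in range(n_steps):
        if shared_env:
            # p(t, x) for every reachable site x in [-n_steps, n_steps]
            p_env = rng.uniform(size=2 * n_steps + 1)
            p = p_env[pos + n_steps]
        else:
            p = 0.5
        pos += np.where(rng.uniform(size=n_walkers) < p, 1, -1)
    return int(pos.max())
```

Comparing the trial-to-trial spread of this maximum in the two pictures probes exactly the kind of extreme statistic the talk addresses.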

Thu, 18 Jun 2026

16:00 - 17:00
Lecture Room 5, Mathematical Institute

TBA

Adam Jones
(Mathematical Institute, University of Oxford)
Abstract

TBA

Thu, 11 Jun 2026
12:00
Lecture Room 4, Mathematical Institute

TBA

Katherine Pearce
(University of Texas at Austin)
Abstract

TBA

Thu, 04 Jun 2026
12:00
Lecture Room 4, Mathematical Institute

TBA

Lorenzo Lazzarino
(Mathematical Institute, University of Oxford)
Abstract

TBA

Thu, 14 May 2026

12:00 - 12:30
Lecture Room 4, Mathematical Institute

Regularization Methods for Hierarchical Programming

Daniel Cortild
(Mathematical Institute, University of Oxford)
Abstract

We consider hierarchical variational inequality problems, or more generally, variational inequalities defined over the set of zeros of a monotone operator. This framework includes convex optimization over equilibrium constraints and equilibrium selection problems. In a real Hilbert space setting, we combine a Tikhonov regularization and a proximal penalization to develop a flexible double-loop method for which we prove asymptotic convergence and provide rate statements in terms of gap functions. Our method is flexible, and effectively accommodates a large class of structured operator splitting formulations for which fixed-point encodings are available. 
Joint work with Meggie Marschner and Mathias Staudigl (University of Mannheim).
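To make the Tikhonov-plus-penalization idea concrete, here is a deliberately simple sketch, not the authors' method: equilibrium selection in its most elementary form, picking the minimal-norm solution of a (possibly degenerate) least-squares problem by running inner gradient steps on f(x) + (eps_k/2)||x||^2 while an outer loop drives eps_k to zero. The function name, loop counts and decay factor are all illustrative choices.

```python
import numpy as np

def tikhonov_double_loop(A, b, eps0=1.0, decay=0.5, outer=20, inner=2000):
    """Double-loop Tikhonov scheme for minimal-norm least squares.

    Outer loop: shrink the regularization parameter eps_k.
    Inner loop: gradient steps on f(x) + (eps_k/2)||x||^2,
    with f(x) = (1/2)||Ax - b||^2.
    As eps_k -> 0 the iterates track the Tikhonov path towards the
    minimal-norm minimizer of f (the pseudoinverse solution).
    """
    m, n = A.shape
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
    eps = eps0
    for _ in range(outer):
        step = 1.0 / (L + eps)             # safe step for the regularized problem
        for _ in range(inner):
            grad = A.T @ (A @ x - b) + eps * x
            x = x - step * grad
        eps *= decay
    return x
```

On a rank-deficient system, the iterates converge to the pseudoinverse solution rather than an arbitrary minimizer — the simplest instance of selecting among equilibria via vanishing regularization.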

Thu, 07 May 2026

12:00 - 12:30
Lecture Room 4, Mathematical Institute

Adaptive preconditioning for linear least-squares problems via iterative CUR

Jung Eun Huh
(Mathematical Institute, University of Oxford)
Abstract

Large-scale linear least-squares problems arise in many areas of computational science and data analysis, where efficiency and scalability are crucial. In this talk, we introduce a randomized preconditioning framework for iterative solvers based on low-rank approximations of small sketches of the original problem. The key idea is to iteratively construct low-rank preconditioners that reshape the singular value distribution in a favourable way. By tightly coupling the preconditioning and Krylov solving phases within an iterative CUR decomposition -- a low-rank approximation built from selected columns and rows of the original matrix -- the proposed algorithm achieves faster and earlier convergence than existing methods. The algorithm performs particularly well on problems that are large in both dimensions, as well as on sparse and ill-conditioned systems.

This is joint work with Coralia Cartis and Yuji Nakatsukasa.
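As a point of comparison for the method in the talk, the classic sketch-and-precondition idea can be written in a few lines: a Gaussian sketch S A is QR-factorised and R serves as a right preconditioner, clustering the singular values of A R^{-1} near 1 so that a Krylov method (CGLS below) converges quickly. This is a stand-in, not the speaker's iterative CUR algorithm; the function name, sketch size and iteration cap are arbitrary choices.

```python
import numpy as np

def sketch_precondition_lsqr(A, b, sketch_factor=4, iters=50, rng=None):
    """Sketch-and-precondition for min ||Ax - b||_2.

    A Gaussian sketch S A (with s = sketch_factor * n rows) is
    QR-factorised and R is used as a right preconditioner: the singular
    values of A R^{-1} cluster near 1, so CGLS on the preconditioned
    system converges in few iterations.
    """
    m, n = A.shape
    if rng is None:
        rng = np.random.default_rng(0)
    s = sketch_factor * n
    S = rng.standard_normal((s, m)) / np.sqrt(s)
    _, R = np.linalg.qr(S @ A)              # small s-by-n QR factorisation
    # CGLS on min ||(A R^{-1}) y - b||, recovering x = R^{-1} y on the fly
    x = np.zeros(n)
    r = b.astype(float).copy()              # residual b - A x with x = 0
    g = np.linalg.solve(R.T, A.T @ r)       # preconditioned gradient
    p = g.copy()
    gg = g @ g
    for _ in range(iters):
        if gg < 1e-28 * (b @ b):            # converged to machine precision
            break
        z = np.linalg.solve(R, p)
        q = A @ z
        alpha = gg / (q @ q)
        x += alpha * z
        r -= alpha * q
        g = np.linalg.solve(R.T, A.T @ r)
        gg_new = g @ g
        p = g + (gg_new / gg) * p
        gg = gg_new
    return x
```

The talk's contribution, by contrast, couples the preconditioner construction and the Krylov phase through an iterative CUR decomposition rather than a one-shot sketch.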


Thu, 30 Apr 2026

12:00 - 12:30
Lecture Room 4, Mathematical Institute

Structure-preserving finite elements and the convergence of augmented Lagrangian methods

Charles Parker II
(U.S. Naval Research Laboratory)
Abstract

Problems with physical constraints, such as the incompressibility constraint for mass conservation in fluids or Gauss's laws for electric and magnetic fields, result in generalized saddle point systems. So-called structure-preserving finite elements respect the constraints pointwise, resulting in more physically accurate solutions that are typically robust with respect to some problem parameters. However, constructing these finite elements may involve complicated spaces for the Lagrange multiplier variables. Augmented Lagrangian methods (ALMs) provide one way to compute the solution without the need for an explicit basis for the Lagrange multiplier space. In this talk, we present new convergence estimates for a standard ALM, sometimes called the iterated penalty method, applied to structure-preserving discretizations of linear saddle point systems.
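In plain linear-algebra terms, the iterated penalty iteration for a saddle point system alternates an augmented primal solve with a multiplier update. The sketch below is a generic matrix version, not the finite-element setting of the talk; the penalty parameter gamma and the iteration count are illustrative.

```python
import numpy as np

def iterated_penalty(A, B, f, g, gamma=1e4, iters=10):
    """Iterated penalty / augmented Lagrangian iteration for the linear
    saddle point system
        [A  B^T] [u]   [f]
        [B  0  ] [p] = [g].
    Each sweep solves the augmented primal system
        (A + gamma B^T B) u = f - B^T p + gamma B^T g
    and then updates the multiplier  p <- p + gamma (B u - g).
    The multiplier error contracts roughly like O(1/gamma) per sweep,
    so large gamma gives fast convergence at the cost of conditioning.
    """
    Aug = A + gamma * B.T @ B
    u = np.zeros(A.shape[0])
    p = np.zeros(B.shape[0])
    for _ in range(iters):
        u = np.linalg.solve(Aug, f - B.T @ p + gamma * B.T @ g)
        p = p + gamma * (B @ u - g)
    return u, p
```

Note that the iteration never needs a basis for the multiplier space beyond the action of B and B^T, which is the practical appeal the abstract mentions.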
