AI assisted triage of UK patients in mental health care services: a qualitative focus group study of patients’ attitudes
Smith, K., Hamer-Hunt, J., Kormilitzin, A., Page, H., Joyce, D., Cipriani, A. BMC Psychiatry, volume 26, issue 1 (13 Jan 2026)
Tue, 03 Feb 2026
15:30

Foundations for derived analytic and differential geometry

Kobi Kremnitzer
(Mathematical Institute, University of Oxford)
Abstract

In this talk I will describe how bornological spaces give a foundation for derived geometries. This works over any Banach ring, allowing one to define analytic and differential geometry over the integers. I will discuss applications of this approach, such as the representability of certain moduli spaces and Galois actions on the cohomology of differentiable manifolds admitting a $\mathbb{Q}$-form.

Causal transport on path space
Cont, R., Lim, F. Annals of Probability
Thu, 26 Feb 2026

12:00 - 12:30
Lecture Room 4, Mathematical Institute

IterativeCUR: One small sketch for big matrix approximations

Nathaniel Pritchard
(Mathematical Institute, University of Oxford)
Abstract

The computation of accurate low-rank matrix approximations is central to improving the scalability of various techniques in machine learning, uncertainty quantification, and control. Traditionally, low-rank approximations are constructed using SVD-based approaches such as truncated SVD or RandomizedSVD. Although these SVD approaches---especially RandomizedSVD---have proven to be very computationally efficient, other low-rank approximation methods can offer even greater performance. One such approach is the CUR decomposition, which forms a low-rank approximation using direct row and column subsets of a matrix. Because CUR uses direct matrix subsets, it is also often better able to preserve native matrix structures like sparsity or non-negativity than SVD-based approaches and can facilitate data interpretation in many contexts. This paper introduces IterativeCUR, which draws on previous work in randomized numerical linear algebra to build a new algorithm that is highly competitive compared to prior work: (1) It is adaptive in the sense that it takes as an input parameter the desired tolerance, rather than an a priori guess of the numerical rank. (2) It typically runs significantly faster than both existing CUR algorithms and techniques such as RandomizedSVD, in particular when these methods are run in an adaptive rank mode. Its asymptotic complexity is $\mathcal{O}(mn + (m+n)r^2 + r^3)$ for an $m\times n$ matrix of numerical rank $r$. (3) It relies on a single small sketch from the matrix that is successively downdated as the algorithm proceeds.
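As a toy illustration of the CUR idea (not the IterativeCUR algorithm itself), the sketch below forms a low-rank approximation from row and column subsets chosen uniformly at random; the function name and sampling scheme are my own simplifications.

```python
import numpy as np

def cur_sketch(A, r, seed=0):
    """Toy CUR: pick r rows and r columns uniformly at random and set
    A ~ C @ U @ R, with U the pseudoinverse of the row/column intersection.
    A simplified illustration only -- not the adaptive IterativeCUR scheme."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    cols = rng.choice(n, size=r, replace=False)
    rows = rng.choice(m, size=r, replace=False)
    C = A[:, cols]                              # column subset of A
    R = A[rows, :]                              # row subset of A
    U = np.linalg.pinv(A[np.ix_(rows, cols)])   # linking "core" factor
    return C, U, R

# An exactly rank-3 matrix is (generically) recovered exactly with r = 3,
# and C, R inherit structure (sparsity, signs) directly from A.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))
C, U, R = cur_sketch(A, 3)
rel_err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
```

In the setting of the talk, the subsets are instead chosen adaptively from a single, successively downdated sketch, which is what gives the stated $\mathcal{O}(mn + (m+n)r^2 + r^3)$ complexity; uniform sampling here is only for brevity.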

Thu, 05 Mar 2026

12:00 - 12:30
Lecture Room 4, Mathematical Institute

Random Embeddings for Global Optimization: Convergence Results Beyond Low Effective Dimension

Roy Makhlouf
(UC Louvain)
Abstract


Many optimization problems of current interest are high-dimensional, calling for dimensionality reduction techniques to solve them efficiently. The random embedding strategy, which optimizes the objective along a low-dimensional subspace of the search space, is arguably the simplest possible dimensionality reduction method. Recent works quantify the probability that this strategy solves the original problem by lower bounding the probability that a random subspace intersects the set of approximate global minimizers. These works showed that, when the objective has low effective dimension (i.e., varies only along a low-dimensional subspace of the search space), random embeddings of sufficiently large dimension solve the original high-dimensional problem with probability one. In this work, we relax the low effective dimension assumption by considering objectives with anisotropic variability, namely, Lipschitz continuous functions whose Lipschitz constant is small (though nonzero) when the function is restricted to a high-dimensional subspace. Exploiting tools from stochastic geometry, we lower bound the probability that a random subspace intersects the set of approximate global minimizers of these objectives, and hence the probability that random embeddings succeed in solving (approximately) the original global optimization problem. Our findings offer deeper insights into the role of the dimension of the optimization problem in this probability of success.
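The basic random-embedding strategy described above can be sketched in a few lines. The objective below is a hypothetical toy with effective dimension two, and the closed-form inner solve stands in for a generic low-dimensional global optimizer.

```python
import numpy as np

# Toy random-embedding strategy: minimise f over R^D by solving the
# d-dimensional problem min_y f(A y) for a random embedding A in R^{D x d}.

def f(x):
    # hypothetical objective: varies only along x[0] and x[1]
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

D, d = 100, 5
rng = np.random.default_rng(0)
A = rng.standard_normal((D, d))          # random embedding

# Here min_y f(A y) reduces to the linear system A[:2] @ y = (1, -2);
# a real application would run a d-dimensional global optimizer instead.
y = np.linalg.lstsq(A[:2, :], np.array([1.0, -2.0]), rcond=None)[0]
x_star = A @ y                           # candidate high-dimensional solution
low_dim_min = f(x_star)
```

With the embedding dimension $d$ larger than the effective dimension, the embedded problem attains the global minimum with probability one, matching the low-effective-dimension results that this work extends.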

Thu, 05 Feb 2026

12:00 - 12:30
Lecture Room 4, Mathematical Institute

A Very Short Introduction to Ptychographic Image Reconstruction

Dr Jaroslav Fowkes
(Mathematical Institute, University of Oxford)
Abstract


I will present a very short introduction to the mathematics behind the scientific imaging technique known as ptychography, starting with a brief overview of the physics model and the various simplifications required, before moving on to the main ptychography inverse problem and the three principal classes of optimization algorithms currently being used in practice. 
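As a minimal sketch of the kind of physics model involved (my own 1D simplification, not the speaker's formulation): ptychography records far-field intensities of a known probe illuminating overlapping patches of an unknown object, so each measurement keeps magnitudes and discards phase.

```python
import numpy as np

rng = np.random.default_rng(0)
n, w = 32, 8                                      # object length, probe width
obj = np.exp(1j * rng.uniform(0, 2 * np.pi, n))   # unknown phase object
probe = np.hanning(w)                             # known illumination profile
positions = list(range(0, n - w + 1, w // 2))     # scan grid, 50% overlap

def forward(o):
    # far-field diffraction intensity at each scan position:
    # I_j = |FFT(probe * o restricted to window j)|^2
    return [np.abs(np.fft.fft(probe * o[p:p + w])) ** 2 for p in positions]

data = forward(obj)
# The inverse problem: recover obj from data, the phases having been lost
# at the detector -- this is what the optimization algorithms attack.
```

The overlap between adjacent windows is what makes the inverse problem well posed enough for the alternating-projection and gradient-based methods used in practice.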

Tue, 03 Mar 2026
16:00
L6

The hyperbolic lattice point problem (joint with number theory)

Stephen Lester
Abstract
In this talk I will discuss the hyperbolic circle problem for $SL_2(\mathbb Z)$. Given two points $z, w$ that lie in the hyperbolic upper half-plane, the problem is to determine the number of $SL_2(\mathbb Z)$ translates of $w$ that lie in the hyperbolic disk centred at $z$ with radius $\mathrm{arccosh}(R/2)$ for large $R$. Selberg proved that the error term in this problem is $O(R^{2/3})$. I will describe some recent work in which we improve the error term to $o(R^{2/3})$ as $R$ tends to infinity, for $z, w$ that are CM-points of different, square-free discriminants. This is joint work with Dimitrios Chatzakos, Giacomo Cherubini, and Morten Risager.
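The counting function in question can be explored numerically by brute force. This is my own illustration: the entry bound `M` means translates coming from matrices with larger entries are missed, so the counts are only indicative.

```python
def hyperbolic_count(X, M=12, z=1j, w=1j):
    """Count gamma in SL_2(Z) with entries bounded by M such that
    cosh d(z, gamma w) <= X, using the standard identity
    cosh d(u, v) = 1 + |u - v|^2 / (2 Im u Im v)."""
    count = 0
    span = range(-M, M + 1)
    for a in span:
        for b in span:
            for c in span:
                for d in span:
                    if a * d - b * c != 1:       # require gamma in SL_2(Z)
                        continue
                    gw = (a * w + b) / (c * w + d)   # Moebius action on w
                    cosh_d = 1 + abs(z - gw) ** 2 / (2 * z.imag * gw.imag)
                    if cosh_d <= X:
                        count += 1
    return count

small, large = hyperbolic_count(5), hyperbolic_count(20)
```

Selberg's result and the improvement discussed in the talk concern precisely how the deviation of such counts from the main term behaves as the radius grows.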

Tue, 24 Feb 2026
16:00
L6

Random Matrices and Free Cumulants

Roland Speicher
Abstract

The asymptotic large N limit of random matrices often transforms classical concepts (independence, cumulants, partitions of sets) into their free counterparts (free independence, free cumulants, non-crossing partitions), and the limit of random matrices gives rise to interesting operator algebras. I will explain these relations, with a particular emphasis on the effect of non-linear functions on the entries of random matrices.
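A quick numerical illustration of this classical-to-free transition (my own example): for a large GUE matrix normalised so that the second free cumulant is 1, the moments $\frac{1}{N}\mathrm{tr}\,H^{2k}$ approach the Catalan numbers, which count the non-crossing pairings of $2k$ points.

```python
import numpy as np

N = 1000
rng = np.random.default_rng(0)
G = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
H = (G + G.conj().T) / (2 * np.sqrt(N))    # GUE, normalised: E|H_ij|^2 = 1/N

ev = np.linalg.eigvalsh(H)                 # real spectrum of the Hermitian H
moments = [np.mean(ev ** (2 * k)) for k in (1, 2, 3)]   # (1/N) tr H^{2k}
catalan = [1, 2, 5]                        # non-crossing pairings of 2k points
```

The limiting law here is Wigner's semicircle, the free analogue of the Gaussian: its only non-vanishing free cumulant is $\kappa_2 = 1$, just as the Gaussian is characterised by its second classical cumulant.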

Tue, 17 Feb 2026
16:00
L6

Graph and Chaos Theories Combined to Address Scrambling of Quantum Information (with Arkady Kurnosov and Sven Gnutzmann)

Uzi Smilansky
Abstract

Given a quantum Hamiltonian, represented as an $N \times N$ Hermitian matrix $H$, we derive an expression for the largest Lyapunov exponent of the classical trajectories in the phase space appropriate for the dynamics induced by $H$. To this end we associate to $H$ a graph with $N$ vertices and derive a quantum map on functions defined on the directed edges of the graph. Using the semiclassical approach in the reverse direction we obtain the corresponding classical evolution (Liouvillian) operator. Using ergodic theory methods (Sinai, Ruelle, Bowen, Pollicott\ldots) we obtain closed expressions for the Lyapunov exponent, as well as for its variance. Applications for random matrix models will be presented.
