Please note that the list below only shows forthcoming events, which may not include regular events that have not yet been entered for the forthcoming term. Please see the past events page for a list of all seminar series that the department has on offer.

Forthcoming events in this series


Thu, 30 Apr 2026

14:00 - 15:00
(This talk is hosted by Rutherford Appleton Laboratory)

Modern tasking approaches to simulate black holes (and other interesting phenomena): How can we make them fit to modern hardware?

Tobias Weinzierl
(Durham University)
Abstract

Over the past decade, my team has developed a simulation code for binary black hole mergers that runs on dynamically adaptive Cartesian meshes. 
Its dynamic adaptivity, coupled with multiple numerical schemes operating at different scales and non-deterministic loads from puncture sources, makes task-based parallelisation a natural choice:
Task stealing across fine-grained work units balances the load across many CPU cores, while treating tasks as atomic compute units should---in theory---allow us to deploy seamlessly to accelerators.
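The work-stealing idea can be illustrated with a toy scheduler (a minimal Python sketch for intuition, not the group's actual C++ runtime; all class and function names here are hypothetical):

```python
import threading
from collections import deque

class WorkStealingPool:
    """Toy work-stealing scheduler: each worker owns a deque of tasks and
    pops from its LIFO end; idle workers steal from the FIFO end of a
    victim's deque. Illustrative only -- production tasking runtimes
    (e.g. TBB, OpenMP tasking) are far more sophisticated."""

    def __init__(self, n_workers, tasks):
        self.n = n_workers
        self.deques = [deque() for _ in range(n_workers)]
        self.locks = [threading.Lock() for _ in range(n_workers)]
        self.results = []
        self.results_lock = threading.Lock()
        # Deliberately imbalanced start: all tasks land on worker 0,
        # so completion of the full task set demands stealing.
        for t in tasks:
            self.deques[0].append(t)

    def _pop_local(self, i):
        with self.locks[i]:
            if self.deques[i]:
                return self.deques[i].pop()      # own LIFO end
        return None

    def _steal(self, i):
        for j in range(self.n):
            if j != i:
                with self.locks[j]:
                    if self.deques[j]:
                        return self.deques[j].popleft()  # steal oldest task
        return None

    def _worker(self, i):
        while True:
            task = self._pop_local(i) or self._steal(i)
            if task is None:
                return                           # no work left anywhere
            value = task()
            with self.results_lock:
                self.results.append(value)

    def run(self):
        threads = [threading.Thread(target=self._worker, args=(i,))
                   for i in range(self.n)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return self.results

# Usage: 100 small "compute" tasks, initially all on one worker's deque.
pool = WorkStealingPool(4, [lambda k=k: k * k for k in range(100)])
out = pool.run()
```

Popping locally from the LIFO end favours cache reuse, while stealing from the FIFO end grabs the oldest (typically largest-granularity) work, which is the standard design choice in work-stealing schedulers.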

In practice, it is far from straightforward.

Fine-grained tasks clash with accelerators, which thrive on large, homogeneous data access patterns;
task bursts on the CPU overwhelm tasking systems and produce suboptimal execution schedules;
and when tasks span address spaces, expensive memory movements kill performance.
Surprisingly, many mainstream tasking frameworks lack even the features our domain demands, i.e. the ability to express key task concepts.


Our application serves as a powerful lens for examining these challenges. 
While our code base extends to other wave phenomena, Lagrangian techniques, and multigrid solvers, these applications all reveal the same fundamental tension: 
modern hardware increasingly struggles to accommodate modern HPC concepts, and it even challenges the notion that one solution fits all hardware components.

The talk proposes practical workarounds and solutions to these shortcomings. Wherever possible, the solutions are designed to be upstreamed into mainstream software building blocks, or at least decoupled from our particular PDE solver, making them broadly applicable to the community.

This talk is hosted by Rutherford Appleton Laboratory and will take place at Harwell Campus, Didcot, OX11 0QX

Thu, 07 May 2026

14:00 - 15:00
Lecture Room 3

Private estimation in stochastic block models

Prof Po-Ling Loh
(Cambridge)
Abstract

We study the problem of private estimation for stochastic block models, where the observation comes in the form of an undirected graph, and the goal is to partition the nodes into unknown, underlying communities. We consider a notion of differential privacy known as node differential privacy, meaning that two graphs are treated as neighbors if one can be transformed into the other by changing the edges connected to exactly one node. The goal is to develop algorithms with optimal misclassification error rates, subject to a certain level of differential privacy.
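As a concrete toy instance of node differential privacy (a hedged sketch: the neighbour check and the edge-count statistic are my own illustration, not the speaker's algorithms):

```python
import random

def node_neighbors(edges_a, edges_b, n):
    """Node-DP adjacency check: two graphs on n nodes are neighbours if
    every edge in the symmetric difference of their edge sets touches
    one common node (i.e. one node was rewired)."""
    diff = set(map(frozenset, edges_a)) ^ set(map(frozenset, edges_b))
    return any(all(v in e for e in diff) for v in range(n))

def private_edge_count(edges, n, eps):
    """Laplace mechanism for the edge count under node-DP: rewiring one
    node changes the count by at most n - 1, so the node sensitivity is
    n - 1 and the noise scale is (n - 1) / eps. Laplace noise is sampled
    as the difference of two independent exponentials."""
    scale = (n - 1) / eps
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return len(edges) + noise
```

Note how the node sensitivity n - 1 grows with the graph, so naive node-DP mechanisms become very noisy on large graphs; this is precisely why reductions to degree-bounded graphs, as discussed below, are valuable.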

We present several algorithms based on private eigenvector extraction, private low-rank matrix estimation, and private SDP optimization. A key contribution of our work is a method for converting a procedure which is differentially private and has low statistical error on degree-bounded graphs to one that is differentially private on arbitrary graph inputs, while maintaining good accuracy (with high probability) on typical inputs. This is achieved by considering a certain smooth version of a map from the space of all undirected graphs to the space of bounded-degree graphs, which can be appropriately leveraged for privacy. We discuss the relative advantages of the algorithms we introduce and also provide some lower-bounds for the performance of any private community estimation algorithm.


This is joint work with Laurentiu Marchis, Ethan D'souza, and Tomas Flidr.


Thu, 14 May 2026

14:00 - 15:00
Lecture Room 3

Numerical analysis of oscillatory solutions of compressible flows

Prof Dr Maria Lukacova
(Johannes Gutenberg University Mainz)
Abstract

Oscillatory solutions of compressible flows arise in many practical situations. An iconic example is the Kelvin-Helmholtz problem, where standard numerical methods yield oscillatory solutions. In such situations, the standard tools of numerical analysis for partial differential equations are not applicable.

We will show that structure-preserving numerical methods converge in general to generalised solutions, the so-called dissipative solutions, which describe the limits of oscillatory sequences. We will concentrate on inviscid flows, governed by the Euler equations of gas dynamics, and also mention relevant results obtained for viscous compressible flows, governed by the Navier-Stokes equations.

We discuss the concept of K-convergence, which turns weak convergence of numerical solutions into strong convergence of their empirical means to a dissipative solution. The latter satisfies a weak formulation of the Euler equations modulo the Reynolds turbulent stress. We will also discuss suitable selection criteria to recover well-posedness of the Euler equations of gas dynamics. Theoretical results will be illustrated by a series of numerical simulations.
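Schematically (a standard statement of the idea in generic notation, not the speaker's precise theorem): if the numerical solutions $(u_n)$ converge only weakly, K-convergence yields strong convergence of their empirical (Cesàro) averages,

```latex
\[
  \widetilde{u}_N \;=\; \frac{1}{N} \sum_{n=1}^{N} u_n
  \;\longrightarrow\; u \quad \text{strongly as } N \to \infty,
\]
```

where the limit $u$ is a dissipative solution.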

 

 

Thu, 21 May 2026

14:00 - 15:00
Lecture Room 3

TBA

Prof Matthew J. Colbrook
(Cambridge)
Abstract

TBA 

Thu, 28 May 2026

14:00 - 15:00
Lecture Room 3

Reducing Sample Complexity in Stochastic Derivative-Free Optimization via Tail Bounds and Hypothesis Testing

Prof Luis Nunes Vicente
(Lehigh University)
Abstract

We introduce and analyze new probabilistic strategies for enforcing sufficient decrease conditions in stochastic derivative-free optimization, with the goal of reducing sample complexity and simplifying convergence analysis. First, we develop a new tail bound condition imposed on the estimated reduction in function value, which permits flexible selection of the power used in the sufficient decrease test, q in (1,2]. This approach allows us to reduce the number of samples per iteration from the standard O(delta^{-4}) to O(delta^{-2q}), assuming that the noise moment of order q/(q-1) is bounded.

Second, we formulate the sufficient decrease condition as a sequential hypothesis testing problem, in which the algorithm adaptively collects samples until the evidence suffices to accept or reject a candidate step. This test provides statistical guarantees on decision errors and can further reduce the required sample size, particularly in the Gaussian noise setting, where it can approach O(delta^{-2-r}) when the decrease is of the order of delta^r.

We incorporate both techniques into stochastic direct-search and trust-region methods for potentially non-smooth, noisy objective functions, and establish their global convergence rates and properties.
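The sequential-testing idea can be sketched as follows (a hedged illustration with hypothetical names and a crude normal approximation, not the authors' actual test or guarantees):

```python
import math
import random
import statistics

def sequential_decrease_test(noisy_diff, threshold, alpha=0.05,
                             n0=10, n_max=100_000):
    """Adaptively sample noisy estimates of the decrease f(x) - f(x + s)
    until a normal-approximation confidence bound places the sample mean
    clearly above (accept) or below (reject) `threshold`.
    Returns (decision, number_of_samples_used)."""
    z = 1.96 if alpha == 0.05 else 2.576   # crude two-sided normal quantile
    samples = [noisy_diff() for _ in range(n0)]
    while len(samples) < n_max:
        mean = statistics.fmean(samples)
        half = z * statistics.stdev(samples) / math.sqrt(len(samples))
        if mean - half > threshold:
            return True, len(samples)       # accept the candidate step
        if mean + half < threshold:
            return False, len(samples)      # reject the candidate step
        samples.append(noisy_diff())        # evidence inconclusive: sample more
    return statistics.fmean(samples) > threshold, len(samples)

# Usage: true decrease 1.0 with Gaussian noise, tested against threshold 0.5.
accept, n_used = sequential_decrease_test(
    lambda: 1.0 + random.gauss(0.0, 0.5), threshold=0.5)
```

The appeal of the sequential formulation is visible even in this sketch: when the true decrease is far from the threshold, the test stops after very few samples, and effort concentrates on the genuinely ambiguous steps.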

This is joint work with Anjie Ding, Francesco Rinaldi, and Damiano Zeffiro.

Thu, 04 Jun 2026

14:00 - 15:00
Lecture Room 3

TBA

Prof Fernando De Teran
(University of Madrid Carlos III)
Abstract

TBA

Thu, 11 Jun 2026

14:00 - 15:00
Lecture Room 3

Optimization Algorithms for Bilevel Learning with Applications to Imaging

Dr Lindon Roberts
(Melbourne University)
Abstract

Many imaging problems, such as denoising or inpainting, can be expressed as variational regularization problems. These are optimization problems for which many suitable algorithms exist. We consider the problem of learning suitable regularizers for imaging problems from example (training) data, which can be formulated as a large-scale bilevel optimization problem. 
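Schematically (a generic bilevel formulation with illustrative symbols, not necessarily the authors' exact setup), learning a regulariser $R_\theta$ from training pairs $(y_i, x_i^*)$ reads:

```latex
\[
  \min_{\theta}\; \sum_{i} \bigl\| \hat{x}_i(\theta) - x_i^{*} \bigr\|^2
  \quad \text{s.t.} \quad
  \hat{x}_i(\theta) \in \arg\min_{x}\; \tfrac{1}{2} \| A x - y_i \|^2
    + R_{\theta}(x),
\]
```

where the lower level is the variational reconstruction (with forward operator $A$ and data $y_i$) and the upper level measures reconstruction quality against the ground truth $x_i^*$.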

In this talk, I will introduce new deterministic and stochastic algorithms for bilevel optimization, which require no or minimal hyperparameter tuning while retaining convergence guarantees. 

This is joint work with Mohammad Sadegh Salehi and Matthias Ehrhardt (University of Bath), and Subhadip Mukherjee (IIT Kharagpur).

Thu, 18 Jun 2026

14:00 - 15:00
Lecture Room 3

TBA

Prof Daniele Boffi
(King Abdullah University of Science and Technology (KAUST))
Abstract

TBA

Thu, 12 Nov 2026

14:00 - 15:00

TBA

Dr Peter Braam
(Oxford Physics)
Abstract

TBA