Please note that the list below shows only forthcoming events and may not include regular events that have not yet been entered for the forthcoming term. Please see the past events page for a list of all seminar series that the department has on offer.

 



Thu, 12 Feb 2026

14:00 - 15:00
Lecture Room 3

The Dean–Kawasaki Equation: Theory, Numerics, and Applications

Prof Ana Djurdjevac
(Mathematical Institute - University of Oxford)
Abstract

Professor Ana Djurdjevac will talk about 'The Dean–Kawasaki Equation: Theory, Numerics, and Applications'

 

The Dean–Kawasaki equation provides a stochastic partial differential equation description of interacting particle systems at the level of empirical densities and has attracted considerable interest in statistical physics, stochastic analysis, and applied modeling. In this work, we study analytical and numerical aspects of the Dean–Kawasaki equation, with a particular focus on well-posedness, structure preservation, and possible discretization strategies. In addition, we extend the framework to the Dean–Kawasaki equation posed on smooth hypersurfaces. We discuss applications of the Dean–Kawasaki framework to particle-based models arising in biological systems and in the modeling of social dynamics.
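For orientation only (a schematic form not taken from the speaker's abstract; the scaling and the presence of an interaction potential $V$ are assumptions of this note): for the empirical density $\rho$ of $N$ interacting Brownian particles, the Dean–Kawasaki equation reads, formally,

$$ \partial_t \rho \;=\; \tfrac{1}{2}\Delta \rho \;+\; \nabla \cdot \big(\rho\, \nabla (V * \rho)\big) \;+\; \tfrac{1}{\sqrt{N}}\, \nabla \cdot \big(\sqrt{\rho}\,\xi\big), $$

where $\xi$ denotes vector-valued space-time white noise; the square-root multiplicative noise is what makes well-posedness and structure-preserving discretization delicate.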

Thu, 19 Feb 2026

14:00 - 15:00
Lecture Room 3

Subspace Correction Methods for Convex Optimization: Algorithms, Theory, and Applications

Jongho Park
(King Abdullah University of Science and Technology (KAUST))
Abstract

Jongho Park will talk about 'Subspace Correction Methods for Convex Optimization: Algorithms, Theory, and Applications'

This talk considers a framework of subspace correction methods for convex optimization, which provides a unified perspective for the design and analysis of a wide range of iterative methods, including advanced domain decomposition and multigrid methods. We first develop a convergence theory for parallel subspace correction methods based on the observation that these methods can be interpreted as nonlinearly preconditioned gradient descent methods. This viewpoint leads to a simpler and sharper analysis compared with existing approaches. We further show how the theory can be extended to semicoercive and nearly semicoercive problems. In addition, we explore connections between subspace correction methods and other classes of iterative algorithms, such as alternating projection methods, through the lens of convex duality, thereby enabling a unified treatment. Several applications are presented, including nonlinear partial differential equations, variational inequalities, and mathematical imaging problems. The talk concludes with a discussion of relevant and emerging research directions.
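As a schematic illustration (notation assumed by this note, not quoted from the abstract): for a convex function $F$ on a space decomposed as $V = V_1 + \dots + V_J$, one parallel subspace correction step computes local corrections in parallel and combines them with a relaxation parameter $\tau$,

$$ w_j^{(k)} \in \operatorname*{arg\,min}_{w \in V_j} F\big(x^{(k)} + w\big), \quad j = 1, \dots, J, \qquad x^{(k+1)} = x^{(k)} + \tau \sum_{j=1}^{J} w_j^{(k)}, $$

so that the combined correction plays the role of a (nonlinearly) preconditioned descent direction, which is the viewpoint underlying the convergence theory described above.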

Thu, 26 Feb 2026

14:00 - 15:00
Lecture Room 3

Paving the way to a T-coercive method for the wave equation

Carolina Urzua Torres
(TU Delft)
Abstract

Dr Carolina Urzua Torres will talk about 'Paving the way to a T-coercive method for the wave equation'

Space-time Galerkin methods are gradually becoming popular, since they allow adaptivity and parallelization in space and time simultaneously. A lot of progress has been made for parabolic problems, and this success has motivated an increased interest in finding space-time formulations for the wave equation that lead to unconditionally stable discretizations. In this talk I will discuss some of the challenges that arise and some recent work in this direction.

In particular, I will present what we see as a first step toward introducing a space-time transformation operator $\mathcal{T}$ that establishes $\mathcal{T}$-coercivity for the weak variational formulation of the wave equation in space and time on bounded Lipschitz domains. As a model problem, we study the ordinary differential equation (ODE) $u'' + \mu u = f$ for $\mu>0$, which is linked to the wave equation via a Fourier expansion in space. For its weak formulation, we introduce a transformation operator $\mathcal{T}_\mu$ that establishes $\mathcal{T}_\mu$-coercivity of the bilinear form, yielding an unconditionally stable Galerkin-Bubnov formulation with error estimates independent of $\mu$. The novelty of the current approach is the explicit dependence of the transformation on $\mu$, which, when extended to the framework of partial differential equations, yields an operator acting in both time and space. We pay particular attention to keeping the trial space as a standard Sobolev space, simplifying the error analysis, while only the test space is modified.
The theoretical results are complemented by numerical examples.  
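For readers unfamiliar with the terminology, the standard definition (not specific to this work) is: a bounded bilinear form $b$ on a Hilbert space $V$ is $\mathcal{T}$-coercive if there exist a bounded bijective operator $\mathcal{T} : V \to V$ and a constant $c > 0$ such that

$$ |b(v, \mathcal{T} v)| \;\ge\; c\, \|v\|_V^2 \qquad \text{for all } v \in V, $$

which, together with the boundedness of $b$ and $\mathcal{T}$, yields an inf-sup condition for $b$ and hence well-posedness of the variational problem.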

Thu, 05 Mar 2026

14:00 - 15:00
Lecture Room 3

Resonances as a computational tool

Katharina Schratz
(Sorbonne University)
Abstract

Katharina Schratz will talk about 'Resonances as a computational tool'

 

A large toolbox of numerical schemes for dispersive equations has been established, based on different discretization techniques such as discretizing the variation-of-constants formula (e.g., exponential integrators) or splitting the full equation into a series of simpler subproblems (e.g., splitting methods). In many situations these classical schemes allow a precise and efficient approximation. This, however, changes drastically whenever non-smooth phenomena enter the scene, such as in problems at low regularity and with high oscillations. Classical schemes fail to capture the oscillatory nature of the solution, and this may lead to severe instabilities and loss of convergence. In this talk I present a new class of resonance-based schemes. The key idea in the construction of the new schemes is to tackle and deeply embed the underlying nonlinear structure of resonances into the numerical discretization. As in the continuous case, these terms are central to structure preservation and offer the new schemes strong geometric properties at low regularity.
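To make the two classical strategies concrete, consider (purely as an illustrative example chosen by this note, not taken from the abstract) the cubic Schrödinger equation $i\partial_t u = -\partial_x^2 u + |u|^2 u$. Both approaches start from Duhamel's formula

$$ u(t_n + \tau) \;=\; e^{i\tau \partial_x^2} u(t_n) \;-\; i \int_0^{\tau} e^{i(\tau - s)\partial_x^2} \big(|u|^2 u\big)(t_n + s)\, \mathrm{d}s : $$

exponential integrators approximate the oscillatory integral directly (for instance by freezing the nonlinearity at $s = 0$), while splitting methods alternate between the free flow $i\partial_t u = -\partial_x^2 u$ and the nonlinear flow $i\partial_t u = |u|^2 u$. Resonance-based schemes instead resolve the dominant resonant parts of this integral exactly, which is what allows convergence at lower regularity.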

Thu, 12 Mar 2026

14:00 - 15:00
Lecture Room 3

TBA

Dr Anna Lisa Varri
(University of Edinburgh)
Abstract

TBA

Thu, 19 Mar 2026

14:00 - 15:00
(This talk is hosted by Rutherford Appleton Laboratory)

TBA

Dr Steph Folds
(University of Strathclyde)
Abstract

TBA. This talk is hosted at RAL.

Thu, 30 Apr 2026

14:00 - 15:00
(This talk is hosted by Rutherford Appleton Laboratory)

TBA

Tobias Weinzierl
(Durham University)
Thu, 07 May 2026

14:00 - 15:00
Lecture Room 3

TBA

Po-Ling Loh
(Cambridge)
Abstract

TBA

Thu, 14 May 2026

14:00 - 15:00
Lecture Room 3

Numerical analysis of oscillatory solutions of compressible flows

Prof Dr Maria Lukacova
(Johannes Gutenberg University Mainz)
Abstract

Prof Dr Maria Lukacova will talk about 'Numerical analysis of oscillatory solutions of compressible flows'

 

Oscillatory solutions of compressible flows arise in many practical situations. An iconic example is the Kelvin-Helmholtz problem, where standard numerical methods yield oscillatory solutions. In such a situation, standard tools of numerical analysis for partial differential equations are not applicable.

We will show that structure-preserving numerical methods converge in general to generalised solutions, the so-called dissipative solutions. The latter describe the limits of oscillatory sequences. We will concentrate on inviscid flows, the Euler equations of gas dynamics, and also mention relevant results obtained for viscous compressible flows governed by the Navier-Stokes equations.

We discuss the concept of K-convergence, which turns the weak convergence of numerical solutions into the strong convergence of their empirical means to a dissipative solution. The latter satisfies a weak formulation of the Euler equations modulo the Reynolds turbulent stress. We will also discuss suitable selection criteria to recover well-posedness of the Euler equations of gas dynamics. Theoretical results will be illustrated by a series of numerical simulations.
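Schematically (notation assumed by this note): if the numerical solutions $u_{h_n}$ converge only weakly as the discretization parameter $h_n \to 0$, K-convergence asserts strong convergence of their empirical (Cesàro) means,

$$ \frac{1}{N} \sum_{n=1}^{N} u_{h_n} \;\longrightarrow\; u \qquad \text{strongly (in an appropriate } L^q \text{ space) as } N \to \infty, $$

where the limit $u$ is a dissipative solution in the sense described above.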

 

 

Thu, 21 May 2026

14:00 - 15:00
Lecture Room 3

TBA

Fernando De Teran
(University of Madrid Carlos III)
Abstract

TBA

Thu, 28 May 2026

14:00 - 15:00
Lecture Room 3

Reducing Sample Complexity in Stochastic Derivative-Free Optimization via Tail Bounds and Hypothesis Testing

Prof Luis Nunes Vicente
(Lehigh University)
Abstract

Professor Luis Nunes Vicente will talk about 'Reducing Sample Complexity in Stochastic Derivative-Free Optimization via Tail Bounds and Hypothesis Testing'

We introduce and analyze new probabilistic strategies for enforcing sufficient decrease conditions in stochastic derivative-free optimization, with the goal of reducing sample complexity and simplifying convergence analysis. First, we develop a new tail bound condition imposed on the estimated reduction in function value, which permits flexible selection of the power $q \in (1,2]$ used in the sufficient decrease test. This approach allows us to reduce the number of samples per iteration from the standard $\mathcal{O}(\delta^{-4})$ to $\mathcal{O}(\delta^{-2q})$, assuming that the noise moment of order $q/(q-1)$ is bounded. Second, we formulate the sufficient decrease condition as a sequential hypothesis testing problem, in which the algorithm adaptively collects samples until the evidence suffices to accept or reject a candidate step. This test provides statistical guarantees on decision errors and can further reduce the required sample size, particularly in the Gaussian noise setting, where it can approach $\mathcal{O}(\delta^{-2-r})$ when the decrease is of the order of $\delta^{r}$. We incorporate both techniques into stochastic direct-search and trust-region methods for potentially non-smooth, noisy objective functions, and establish their global convergence rates and properties.
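To fix ideas (a schematic rendering with assumed notation, not the authors' precise conditions): classical stochastic direct-search and trust-region methods accept a trial step $s_k$ at step size or radius $\delta_k$ only when a sample-average estimate $\tilde f$ of the objective certifies sufficient decrease,

$$ \tilde f(x_k) - \tilde f(x_k + s_k) \;\ge\; c\, \delta_k^{2}, $$

and guaranteeing this test with bounded noise variance typically requires $\mathcal{O}(\delta_k^{-4})$ samples. Replacing the exponent $2$ by $q \in (1,2]$ and controlling the estimated reduction through a tail bound, under a bounded noise moment of order $q/(q-1)$, is what lowers the per-iteration cost to $\mathcal{O}(\delta_k^{-2q})$ as stated above.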

This is joint work with Anjie Ding, Francesco Rinaldi, and Damiano Zeffiro.

 

Thu, 11 Jun 2026

14:00 - 15:00
Lecture Room 3

Optimization Algorithms for Bilevel Learning with Applications to Imaging

Dr Lindon Roberts
(Melbourne University)
Abstract

Dr Lindon Roberts will talk about: 'Optimization Algorithms for Bilevel Learning with Applications to Imaging'

Many imaging problems, such as denoising or inpainting, can be expressed as variational regularization problems. These are optimization problems for which many effective algorithms exist. We consider the problem of learning suitable regularizers for imaging problems from example (training) data, which can be formulated as a large-scale bilevel optimization problem.
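A typical instance of such a bilevel formulation (a sketch with notation assumed by this note, not the speakers' exact setup): given training pairs $(y_i, x_i^\star)$, a forward operator $A$, and a regularizer $R_\theta$ parametrized by $\theta$,

$$ \min_{\theta} \; \frac{1}{n} \sum_{i=1}^{n} \big\| \hat{x}_i(\theta) - x_i^\star \big\|^2 \qquad \text{subject to} \qquad \hat{x}_i(\theta) \in \operatorname*{arg\,min}_{x} \; \tfrac{1}{2} \|A x - y_i\|^2 + R_\theta(x), $$

where each lower-level problem is a variational reconstruction (e.g. denoising with $A = I$) and the upper level measures reconstruction quality on the training data.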

In this talk, I will introduce new deterministic and stochastic algorithms for bilevel optimization, which require no or minimal hyperparameter tuning while retaining convergence guarantees. 

This is joint work with Mohammad Sadegh Salehi and Matthias Ehrhardt (University of Bath), and Subhadip Mukherjee (IIT Kharagpur).

 

 

Thu, 18 Jun 2026

14:00 - 15:00
Lecture Room 3

TBA

Daniele Boffi
(King Abdullah University of Science and Technology (KAUST))
Abstract

TBA