Oxford Mathematics, in partnership with the Science Museum, is delighted to announce its first Public Lecture in London. World-renowned mathematician Andrew Wiles will be our speaker. Andrew will talk about his current work and will also be in conversation with mathematician and broadcaster Hannah Fry after the lecture. Attendance is free.
28th November, 6.30pm, Science Museum, London, SW7 2DD
Dual Acceleration for Nonconvex Optimisation
Abstract
The phenomenon of poor algorithmic scalability is a critical problem in large-scale machine learning and data science. This has led to a resurgence in the use of first-order (Hessian-free) algorithms from classical optimisation. One major drawback is that first-order methods tend to converge extremely slowly. However, there exist techniques for efficiently accelerating them.
The topic of this talk is the Dual Regularisation Nonlinear Acceleration algorithm (DRNA) (Geleta, 2017) for nonconvex optimisation. Numerical studies on the CUTEst optimisation problem set show that the method accelerates several nonconvex optimisation algorithms, including the quasi-Newton BFGS and steepest descent methods. DRNA compares favourably with a number of existing accelerators in these studies.
DRNA extends to the nonconvex setting a recent acceleration algorithm due to Scieur et al. (Advances in Neural Information Processing Systems 29, 2016). We have proven theorems relating DRNA to the Krylov subspace method GMRES, as well as to Anderson acceleration and the family of multi-secant quasi-Newton methods.
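To give a feel for the kind of extrapolation involved, here is a minimal sketch of the regularised nonlinear acceleration step of Scieur et al. (2016), the convex method that DRNA builds on. It is not the DRNA algorithm itself: the function name, the storage convention (iterates as columns of an array) and the regularisation value are illustrative assumptions only.

import numpy as np

def rna_extrapolate(X, lam=1e-8):
    """Regularised nonlinear acceleration step (after Scieur et al., 2016).

    X   : array of shape (d, k+1) whose columns are successive iterates
          x_0, ..., x_k produced by some base optimisation method.
    lam : Tikhonov regularisation weight (value chosen for illustration).

    Returns an extrapolated point formed as an affine combination of the
    iterates, with coefficients chosen to minimise the norm of the combined
    residuals plus a regularisation term.
    """
    R = np.diff(X, axis=1)                  # residuals r_i = x_{i+1} - x_i
    k = R.shape[1]
    RtR = R.T @ R
    RtR /= np.linalg.norm(RtR, 2)           # normalise for numerical stability
    # Solve (R^T R + lam*I) z = 1, then rescale so the coefficients sum to 1.
    z = np.linalg.solve(RtR + lam * np.eye(k), np.ones(k))
    c = z / z.sum()
    return X[:, :k] @ c                     # affine combination of the iterates

In practice such a step is wrapped around a base optimiser: every few iterations the stored iterates are extrapolated and, if the extrapolated point has a lower objective value, it replaces the current iterate.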
White Noise Coupling for Multilevel Monte Carlo
Abstract
In this talk we describe a new approach that enables the use of elliptic PDEs with white noise forcing to sample Matérn fields within the multilevel Monte Carlo (MLMC) framework.
When MLMC is used to quantify the uncertainty in the solution of PDEs with random coefficients, two key ingredients are needed: 1) a sampling technique for the coefficients that satisfies the MLMC telescopic sum and 2) a numerical solver for the forward PDE problem.
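For orientation, the telescopic sum in point 1) is the standard MLMC identity: the expectation of the finest-level quantity of interest is written as a sum of level corrections, each estimated with independent samples, while within each correction the two terms are computed from the same (coupled) random input. With the convention P_{-1} := 0,

\[
\mathbb{E}[P_L] \;=\; \mathbb{E}[P_0] \;+\; \sum_{\ell=1}^{L} \mathbb{E}\bigl[P_\ell - P_{\ell-1}\bigr],
\qquad
\widehat{P}^{\mathrm{MLMC}} \;=\; \sum_{\ell=0}^{L} \frac{1}{N_\ell} \sum_{n=1}^{N_\ell}
\bigl(P_\ell^{(\ell,n)} - P_{\ell-1}^{(\ell,n)}\bigr).
\]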
When the dimensionality of the uncertainty in the problem is infinite (i.e. the coefficients are random fields), the sampling techniques commonly used in the literature are Karhunen–Loève expansions or circulant embeddings. In the specific case in which the coefficients are Gaussian fields with Matérn covariance structure, a further sampling technique relies on the solution of a linear elliptic PDE with white noise forcing.
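The elliptic-PDE route rests on the classical Whittle–Matérn link, stated here only for orientation: up to boundary effects, a Gaussian field with Matérn covariance arises as the solution of a fractional elliptic SPDE driven by spatial white noise,

\[
(\kappa^2 - \Delta)^{\alpha/2}\, u(x) \;=\; \dot{W}(x), \qquad x \in D \subseteq \mathbb{R}^d,
\qquad \nu = \alpha - \tfrac{d}{2},
\]

where \(\dot{W}\) is white noise, \(\kappa\) sets the correlation length and \(\nu\) the smoothness of the resulting Matérn field.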
When the finite element method (FEM) is used for the forward problem, the latter option can become advantageous: elliptic PDEs can be solved quickly and efficiently with the FEM, the sampling can be performed in parallel, and the same FEM software can be used without the need for external packages. However, it is unclear how to enforce a good stochastic coupling of the white noise between MLMC levels so as to respect the MLMC telescopic sum. In this talk we show how this coupling can be enforced both in theory and in practice.
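The coupling requirement can be made concrete with a schematic MLMC loop: on each level, the same white noise realisation must drive both the fine and the coarse solve of the correction term, while realisations across samples and across levels remain independent. The sketch below only illustrates this structure; sample_white_noise, solve_matern_spde and quantity_of_interest are hypothetical placeholders, not the solver interface used in the talk.

import numpy as np

def mlmc_estimate(L, N, sample_white_noise, solve_matern_spde, quantity_of_interest):
    """Schematic MLMC estimator with white noise coupling between levels.

    L : finest level index.
    N : list of sample counts, one per level.
    For each level l and sample n, a single white noise realisation is drawn
    and reused for both the level-l and level-(l-1) solves, so the correction
    P_l - P_{l-1} has small variance while each term of the telescopic sum
    keeps the correct marginal distribution.
    """
    total = 0.0
    for l in range(L + 1):
        corrections = np.empty(N[l])
        for n in range(N[l]):
            xi = sample_white_noise(l)                 # one realisation per sample
            u_fine = solve_matern_spde(xi, level=l)    # fine FEM solve
            P_fine = quantity_of_interest(u_fine)
            if l == 0:
                corrections[n] = P_fine
            else:
                # Same noise realisation, coarser discretisation.
                u_coarse = solve_matern_spde(xi, level=l - 1)
                corrections[n] = P_fine - quantity_of_interest(u_coarse)
        total += corrections.mean()
    return total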
Oxford Mathematician Per-Gunnar Martinsson has been awarded the 2017 Germund Dahlquist Prize by the Society for Industrial and Applied Mathematics. The Germund Dahlquist Prize is awarded for original contributions to fields associated with Germund Dahlquist, especially the numerical solution of differential equations and numerical methods for scientific computing.