Wed, 26 Oct 2022
16:00
L4

$\ell^2$ and profinite invariants

Ismael Morales
(University of Oxford)
Abstract

We review a few instances in which the first $\ell^2$ Betti number of a group is a profinite invariant and we discuss some applications and open problems.

Wed, 19 Oct 2022
16:00
L4

$\ell^2$-invariants and generalisations in positive characteristic

Sam Fisher
(University of Oxford)
Abstract

We survey the theory of $\ell^2$-invariants, their applications in group theory and topology, and introduce a positive characteristic version of $\ell^2$-theory. We also discuss the Atiyah and Lück approximation conjectures, two of the central problems in this area.

Wed, 12 Oct 2022
16:00
L4

Profinite Rigidity

Paweł Piwek
(University of Oxford)
Abstract

Profinite rigidity is essentially the study of which groups can be distinguished from each other by their finite quotients. This talk is meant to give a gentle introduction to the area - I will explain which questions are the right ones to ask and give an overview of some of the main results in the field. I will assume knowledge of what a group presentation is.

Mon, 14 Nov 2022
14:00
L4

A dynamical system perspective of optimization in data science

Jalal Fadili
(CNRS-ENSICAEN-Université Caen)
Abstract

In this talk, I will use the dynamical system perspective to shed light on the convergence guarantees of first-order algorithms involving inertial features for convex optimization in a Hilbert space setting.

Such algorithms are widely popular in various areas of data science (data processing, machine learning, inverse problems, etc.).
They can be viewed as discrete-time versions of an inertial second-order dynamical system involving different types of damping (viscous damping, Hessian-driven geometric damping).

The dynamical system perspective not only offers a powerful way to understand the geometry underlying the dynamic, but also provides a versatile framework for obtaining fast, scalable and new algorithms enjoying nice convergence guarantees (including fast rates). In addition, this framework encompasses known algorithms and dynamics such as the Nesterov-type accelerated gradient methods, and the introduction of time scale factors makes it possible to further accelerate these algorithms. The framework is versatile enough to handle non-smooth and non-convex objectives that are ubiquitous in various applications.
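For background (this is not part of the abstract), a prototypical system of this kind in the literature on inertial dynamics, combining vanishing viscous damping with Hessian-driven geometric damping for a smooth convex objective $f$, can be written as

```latex
\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t)
  + \beta\,\nabla^{2} f\bigl(x(t)\bigr)\,\dot{x}(t)
  + \nabla f\bigl(x(t)\bigr) = 0,
```

where the viscous term $\alpha/t$ (with $\alpha \ge 3$) corresponds, upon discretisation, to Nesterov-type acceleration, and $\beta \ge 0$ controls the Hessian-driven damping that attenuates oscillations of the trajectory.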

Mon, 31 Oct 2022
14:00
L4

Stochastic methods for derivative free optimization

Stephen Becker
(University of Colorado Boulder)
Abstract

Numerical optimization is an indispensable tool of modern data analysis, and there are many optimization problems where it is difficult or impossible to compute the full gradient of the objective function. The field of derivative free optimization (DFO) addresses these cases by using only function evaluations, and has wide-ranging applications from hyper-parameter tuning in machine learning to PDE-constrained optimization.

We present two projects that attempt to scale DFO techniques to higher dimensions. The first method converges slowly but works in very high dimensions, while the second method converges quickly but doesn't scale quite as well with dimension. The first method is a family of algorithms called "stochastic subspace descent" that uses a few directional derivatives at every step (i.e. projections of the gradient onto a random subspace). In special cases it is related to Spall's SPSA, Gaussian smoothing of Nesterov, and block-coordinate descent. We provide convergence analysis and discuss Johnson-Lindenstrauss style concentration. The second method uses conventional interpolation-based trust region methods which require large ill-conditioned linear algebra operations. We use randomized linear algebra techniques to ameliorate the issues and scale to larger dimensions; we also use a matrix-free approach that reduces memory issues. These projects are in collaboration with David Kozak, Luis Tenorio, Alireza Doostan, Kevin Doherty and Katya Scheinberg.
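To make the "projection of the gradient onto a random subspace" idea concrete, here is a minimal illustrative sketch (not the authors' implementation; all names and parameter choices are assumptions). For clarity the projected gradient is formed from the full gradient; in a genuine DFO setting the directional derivatives would be estimated by finite differences of function values.

```python
import numpy as np

def stochastic_subspace_descent(grad, x0, ell=5, step=0.1, iters=200, seed=0):
    """At each step, move along P P^T grad(x), the projection of the
    gradient onto a random ell-dimensional subspace of R^n."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(iters):
        # Random subspace: orthonormalise a Gaussian n x ell matrix.
        P, _ = np.linalg.qr(rng.standard_normal((n, ell)))
        x = x - step * (P @ (P.T @ grad(x)))
    return x

# Usage: minimise a simple strongly convex quadratic in R^50,
# using only ell = 5 directional derivatives per iteration.
n = 50
A = np.diag(np.linspace(1.0, 5.0, n))
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x = stochastic_subspace_descent(grad, np.ones(n))
```

In expectation the projected direction equals (ell/n) times the full gradient, which is why the method tolerates very high dimensions at the price of slower convergence.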

Mon, 10 Oct 2022
14:00
L4

Partitioned and multirate training of neural networks

Ben Leimkuhler
(Edinburgh University)
Abstract

I will discuss the use of partitioned schemes for neural networks. This work is in the tradition of multirate numerical ODE methods, in which different components of a system are evolved using different numerical methods or with different timesteps. The setting is training tasks in deep learning, in which parameters of a hierarchical model must be found to describe a given data set. By choosing appropriate partitionings of the parameters, some redundant computation can be avoided and we can obtain substantial computational speed-up. I will demonstrate the use of the procedure in transfer learning applications from image analysis and natural language processing, showing a reduction of around 50% in training time, without impairing the generalization performance of the resulting models. This talk describes joint work with Tiffany Vlaar.
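As a caricature of the multirate idea (an illustrative sketch, not the speaker's scheme; the split into "fast" and "slow" blocks and all parameter choices are assumptions), one can update one parameter block every iteration and the other only every K-th iteration:

```python
import numpy as np

def partitioned_gd(grad, params, fast_idx, slow_idx, lr=0.05, K=5, iters=300):
    """Gradient descent in which the 'fast' block is updated every
    step while the 'slow' block is updated only every K-th step,
    saving the gradient work for the slow block in between."""
    x = params.copy()
    for k in range(iters):
        g = grad(x)
        x[fast_idx] -= lr * g[fast_idx]
        if k % K == 0:  # slow block: one update per K steps
            x[slow_idx] -= lr * g[slow_idx]
    return x

# Usage: a consistent least-squares problem, treating the first half
# of the weights as the slowly-updated block.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 10))
w_true = rng.standard_normal(10)
b = A @ w_true
loss = lambda w: 0.5 * np.sum((A @ w - b) ** 2)
grad = lambda w: A.T @ (A @ w - b)
w0 = np.zeros(10)
w = partitioned_gd(grad, w0, fast_idx=slice(5, 10), slow_idx=slice(0, 5),
                   lr=0.01, iters=300)
```

In a real network the gradient for the frozen-ish slow block would simply not be computed on the skipped steps, which is where the training-time saving comes from.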

Mon, 20 Jun 2022

12:45 - 13:45
L4

Large N Partition Functions, Holography, and Black Holes

Nikolay Bobev
Abstract

I will discuss the large N behavior of partition functions of the ABJM theory on compact Euclidean manifolds. I will pay particular attention to the S^3 free energy and the topologically twisted index, for which I will present closed form expressions valid to all orders in the large N expansion. These results have important implications for holography and the microscopic entropy counting of AdS_4 black holes, which I will discuss. I will also briefly discuss generalizations to other SCFTs arising from M2-branes.

Thu, 16 Jun 2022

16:00 - 17:00
L4

Ax-Schanuel and exceptional integrability

Jonathan Pila
(University of Oxford)
Abstract

In joint work with Jacob Tsimerman we study when the primitive of a given algebraic function can be constructed using primitives from some given finite set of algebraic functions, their inverses, algebraic functions, and composition. When the given finite set is just {1/x} this is the classical problem of "elementary integrability". We establish some results, including a decision procedure for this problem.

Fri, 17 Jun 2022

10:00 - 11:00
L4

Silt build up at Peel Ports locks

David Porter (Carbon Limiting Technologies), Chris Breward, Daniel Alty (Peel Ports; joining remotely)
Abstract

Peel Ports operate a number of locks that allow ships to enter and leave the port. The lock gates comprise a single caisson structure which blocks the waterway when closed and retracts into the dockside as the gate opens. Build up of silt ahead of the opening lock gate can prevent it from fully opening or require excessive power to move. If the lock is not able to fully open, ships are unable to enter the port, leading to significant operational impacts for the whole port. Peel Ports are interested in understanding, and mitigating, this silt build up.

Wed, 18 May 2022

12:45 - 14:00
L4

A pedestrian introduction to the geometry of 3d twisted indices

Andrea Ferrari
(Durham)
Abstract

3d N=4 gauge theories can be studied on a circle times a closed Riemann surface. Their partition functions on this geometry, known as twisted indices, were computed some time ago using supersymmetric localisation on the Coulomb branch. An alternative perspective is to consider the theory as a supersymmetric quantum mechanics on S^1. In this talk I will review this point of view, which unveils interesting connections to topics in geometry such as wall-crossing and symplectic duality of quasi-maps.

Further Information

Please note the unusual time.
