Fri, 08 Mar 2024

15:00 - 16:00
L6

Topological Perspectives to Characterizing Generalization in Deep Neural Networks

Tolga Birdal
(Imperial College)
Further Information

 

Dr. Tolga Birdal is an Assistant Professor in the Department of Computing at Imperial College London, with prior experience as a Senior Postdoctoral Research Fellow in Prof. Leonidas Guibas's Geometric Computing Group at Stanford University. Tolga defended his master's and Ph.D. theses in the Computer Vision Group at the Chair for Computer Aided Medical Procedures, Technical University of Munich, led by Prof. Nassir Navab. He was also a Doktorand at Siemens AG under the supervision of Dr. Slobodan Ilic, working on “Geometric Methods for 3D Reconstruction from Large Point Clouds”. His research interests center on geometric machine learning and 3D computer vision, with a theoretical focus on exploring the boundaries of geometric computing, non-Euclidean inference, and the foundations of deep learning. Dr. Birdal has published extensively in leading academic journals and conference proceedings, including NeurIPS, CVPR, ICLR, ICCV, ECCV, T-PAMI, and IJCV. Aside from his academic life, Tolga has co-founded multiple companies, including BeFunky, a widely used web-based image-editing platform.

Abstract

 

Training deep learning models involves searching for a good model over the space of possible architectures and their parameters. Discovering models that generalize robustly to unseen data and tasks is of paramount importance for accurate and reliable machine learning. Generalization, a hallmark of model efficacy, is conventionally gauged by a model's performance on data beyond its training set. Yet the reliance on vast training datasets raises a pivotal question: how can deep learning models transcend the notorious hurdle of 'memorization' to generalize effectively? Is it feasible to assess and guarantee the generalization prowess of deep neural networks in advance of empirical testing, and notably, without any recourse to test data? This inquiry is not merely theoretical; it underpins the practical utility of deep learning across myriad applications. In this talk, I will show that scrutinizing the training dynamics of neural networks through the lens of topology, specifically using the 'persistent-homology dimension', leads to novel bounds on the generalization gap and can help demystify the inner workings of neural networks. Our work bridges deep learning with the abstract realms of topology and learning theory, while relating to information theory through compression.
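As a rough illustration of the kind of quantity involved: the 0-dimensional persistent homology of a finite point cloud (for instance, samples of a training trajectory in weight space) is captured by a Euclidean minimum spanning tree, and an intrinsic-dimension estimate can be extracted from how the total persistence scales with sample size. The sketch below is a minimal illustration under my own assumptions (the function names, the choice α = 1, and the subset sizes are mine, not the speaker's implementation):

```python
# Minimal sketch of a PH^0 intrinsic-dimension estimator.
# For 0-dimensional persistent homology, total persistence equals the
# total edge length of the Euclidean minimum spanning tree (MST).
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

def mst_total_length(points, alpha=1.0):
    """E^0_alpha: sum of MST edge lengths raised to the power alpha."""
    dists = squareform(pdist(points))
    mst = minimum_spanning_tree(dists)
    return np.power(mst.data, alpha).sum()

def ph0_dimension(points, alpha=1.0, sizes=(100, 200, 400, 800), seed=0):
    """Estimate the PH^0 dimension from the scaling E(n) ~ n^(1 - alpha/d):
    fit the slope of log E(n) against log n and invert for d."""
    rng = np.random.default_rng(seed)
    log_n, log_e = [], []
    for n in sizes:
        idx = rng.choice(len(points), size=n, replace=False)
        log_n.append(np.log(n))
        log_e.append(np.log(mst_total_length(points[idx], alpha)))
    slope = np.polyfit(log_n, log_e, 1)[0]
    return alpha / (1.0 - slope)

# Sanity check on a uniform 2-D cloud: the estimate should be close to 2.
pts = np.random.default_rng(1).uniform(size=(2000, 2))
print(round(ph0_dimension(pts), 1))
```

In the generalization-bound setting the point cloud would be iterates of the optimizer rather than synthetic samples; the scaling-and-fit step is the same.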

 

Thu, 07 Mar 2024
12:00
L6

Well-posedness of nonlocal aggregation-diffusion equations and systems with irregular kernels

Yurij Salmaniw
(Mathematical Institute, University of Oxford)
Abstract

Aggregation-diffusion equations and systems have garnered much attention in the last few decades. More recently, models featuring nonlocal interactions through spatial convolution have been applied to several areas, including the physical, chemical, and biological sciences. Typically, one can establish the well-posedness of such models via regularity assumptions on the kernels themselves; however, more effort is required for many scenarios of interest as the nonlocal kernel is often discontinuous.

 

In this talk, I will present recent progress in establishing a robust well-posedness theory for a class of nonlocal aggregation-diffusion models with minimal regularity requirements on the interaction kernel in any spatial dimension on either the whole space or the torus. Starting with the scalar equation, we first establish the existence of a global weak solution in a small mass regime for merely bounded kernels. Under some additional hypotheses, we show the existence of a global weak solution for any initial mass. In typical cases of interest, these solutions are unique and classical. I will then discuss the generalisation to the $n$-species system for the regimes of small mass and arbitrary mass. We will conclude with some consequences of these theorems for several models typically found in ecological applications.
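For orientation, a representative member of the scalar class in question can be written (in my notation, which may differ from the speakers') as:

```latex
% Prototypical nonlocal aggregation-diffusion equation: nonlinear
% diffusion plus drift through an interaction kernel K, where K need
% only be bounded (and possibly discontinuous).
\partial_t \rho = \Delta \rho^m
  + \nabla \cdot \big( \rho \, \nabla (K \ast \rho) \big),
\qquad \rho(0,\cdot) = \rho_0 \ge 0,
```

with the $n$-species generalisation obtained by coupling the equations through a matrix of kernels, $\partial_t \rho_i = \Delta \rho_i^{m_i} + \nabla \cdot \big( \rho_i \, \nabla \sum_j K_{ij} \ast \rho_j \big)$.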

 

This is joint work with Dr. Jakub Skrzeczkowski and Prof. Jose Carrillo.

Wed, 07 Feb 2024
12:00
L6

Pressure jump in the Cahn-Hilliard equation

Charles Elbar
(Laboratoire Jacques Louis Lions, Sorbonne Université)
Abstract

We model a tumor as an incompressible flow, considering two antagonistic effects: repulsion of cells when the tumor grows (they push each other when they divide) and cell-cell adhesion, which creates surface tension. To take these two effects into account, we use a fourth-order parabolic equation: the Cahn-Hilliard equation. The combination of these two effects creates a discontinuity at the boundary of the tumor that we call the pressure jump. To compute this pressure jump, we include an external force and consider stationary radial solutions of the Cahn-Hilliard equation. We also completely characterize the stationary solutions in the incompressible case, prove the incompressible limit, and prove convergence of the parabolic problems to stationary states.
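For reference, the Cahn-Hilliard equation in the degenerate-mobility form commonly used in such tumor models reads (my normalisation, not necessarily the speaker's):

```latex
% Cahn-Hilliard equation with mobility n and chemical potential mu:
% the -gamma*Laplacian term encodes surface tension (cell-cell adhesion),
% psi'(n) the repulsion of dividing cells.
\partial_t n = \nabla \cdot \big( n \, \nabla \mu \big),
\qquad
\mu = -\gamma \Delta n + \psi'(n).
```

The pressure jump of the title is the discontinuity this fourth-order structure produces at the tumor boundary.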

Wed, 17 Jan 2024
12:00
L6

A new understanding of the grazing limit

Prof Tong Yang
(Department of Applied Mathematics, The Hong Kong Polytechnic University)
Abstract

The grazing limit of the Boltzmann equation to the Landau equation is well known and has been justified by using a cutoff near the grazing angle together with a suitable scaling. In this talk, we will present a new approach based on a natural scaling of the Boltzmann equation. The proof relies on an improved well-posedness theory for the Boltzmann equation without angular cutoff, valid for an optimal range of parameters, so that the grazing limit can be justified directly, including the Coulomb potential. With this new understanding, the scaled Boltzmann operator can in fact be decomposed into two parts: the first converges to the Landau operator as the deviation-angle parameter tends to its singular value, and the second vanishes in the limit. Hence, the scaling and limiting process exactly capture the grazing collisions. The talk is based on a recent joint work with Yu-Long Zhou.
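For context, the Landau operator that arises in the grazing limit can be written in its standard form (the notation here is mine):

```latex
% Landau collision operator; a(z) is |z|^(gamma+2) times the projection
% orthogonal to z. The Coulomb potential corresponds to gamma = -3.
Q_L(f,f)(v) = \nabla_v \cdot \int_{\mathbb{R}^3}
  a(v - v_*) \big[ f(v_*) \, \nabla_v f(v) - f(v) \, \nabla_{v_*} f(v_*) \big]
  \, dv_*,
\qquad
a(z) = |z|^{\gamma+2} \left( \mathrm{Id} - \frac{z \otimes z}{|z|^2} \right).
```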

Wed, 28 Feb 2024
12:00
L6

Non-regular spacetime geometry, curvature and analysis

Clemens Saemann
(Mathematical Institute, University of Oxford)
Abstract

I present an approach to Lorentzian geometry and General Relativity that relies neither on smoothness nor on manifolds, thereby leaving the framework of classical differential geometry. This opens up the possibility of studying curvature (bounds) for spacetimes of low regularity, or even more general spaces. An analogous shift in perspective proved extremely fruitful in the Riemannian case (Alexandrov and CAT(k) spaces). After introducing the basics of our approach, we report on recent progress in developing a Sobolev calculus for time functions on such non-smooth Lorentzian spaces. This seminar talk can also be viewed as a primer and advertisement for my mini course in May: Current topics in Lorentzian geometric analysis: Non-regular spacetimes.

Tue, 30 Jan 2024

16:00 - 17:00
L6

Characteristic polynomials, the Hybrid model, and the Ratios Conjecture

Andrew Pearce-Crump
(University of York)
Abstract

In the 1960s, Shanks conjectured that ζ'(ρ), where ρ is a non-trivial zero of zeta, is both real and positive in the mean. Conjecturing and proving this result has a rich history, but efforts to generalise it to higher moments have so far failed. Building on the work of Keating and Snaith using characteristic polynomials from Random Matrix Theory, the Hybrid model of Gonek, Hughes and Keating, and the Ratios Conjecture of Conrey, Farmer, and Zirnbauer, we have been able to produce new conjectures for the full asymptotics of higher moments of the derivatives of zeta. This is joint work with Chris Hughes.
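One common way to phrase Shanks' conjecture, writing $\rho = \tfrac{1}{2} + i\gamma$ for the non-trivial zeros and $N(T)$ for the number of zeros up to height $T$ (my notation; the normalisation in the talk may differ):

```latex
% Shanks' conjecture: the average of zeta'(rho) over the zeros up to
% height T is asymptotically real and positive.
\frac{1}{N(T)} \sum_{0 < \gamma \le T} \zeta'(\rho)
\quad \text{is real and positive in the limit } T \to \infty.
```

The higher moments of the talk replace $\zeta'(\rho)$ by powers and higher derivatives in this average.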

Tue, 16 Jan 2024

16:00 - 17:00
L6

Branching selection particle systems and the selection principle

Julien Berestycki
(Department of Statistics, University of Oxford)
Abstract
The $N$-branching Brownian motion with selection ($N$-BBM) is a particle system consisting of $N$ independent particles that diffuse as Brownian motions in $\mathbb{R}$, branch at rate one, and whose size is kept constant by removing the leftmost particle at each branching event. It is a very simple model for the evolution of a population under selection that has generated some fascinating research since its introduction by Brunet and Derrida in the early 2000s.
 
If one recentres the positions by the position of the leftmost particle, this system has a stationary distribution. I will show that, as $N\to\infty$, the stationary empirical measure of the $N$-particle system converges to the minimal travelling wave of an associated free boundary PDE. This resolves an open question going back at least to the work of Maillard in 2012. It follows a recent related result by Oliver Tough (with whom this is joint work) establishing a similar selection principle for the so-called Fleming-Viot particle system.
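To fix ideas, in one common normalisation (diffusivity $1/2$, branching rate $1$; my notation, not necessarily that of the talk) a travelling-wave profile $w$ of the associated free boundary problem, seen from the front and moving at speed $c$, satisfies:

```latex
% Travelling-wave profile for the free boundary problem: the density
% vanishes at the front and carries unit mass.
\tfrac{1}{2} w'' + c \, w' + w = 0 \quad \text{on } (0,\infty),
\qquad w(0) = 0,
\qquad \int_0^\infty w(x) \, dx = 1.
```

Real decaying solutions require $c^2 \ge 2$, and the selection principle of the title singles out the minimal admissible speed $c = \sqrt{2}$.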
 
Tue, 05 Mar 2024
16:00
L6

Hybrid Statistics of the Maxima of a Random Model of the Zeta Function over Short Intervals

Christine Chang
(CUNY Graduate Center)
Abstract

We will present a matching upper and lower bound for the right tail probability of the maximum of a random model of the Riemann zeta function over short intervals.  In particular, we show that the right tail interpolates between that of log-correlated and IID random variables as the interval varies in length. We will also discuss a new normalization for the moments over short intervals. This result follows the recent work of Arguin-Dubach-Hartung and is inspired by a conjecture by Fyodorov-Hiary-Keating on the local maximum over short intervals.


Tue, 27 Feb 2024
16:00
L6

Dynamics in interlacing arrays, conditioned walks and the Aztec diamond

Theodoros Assiotis
(University of Edinburgh)
Abstract

I will discuss certain dynamics of interacting particles in interlacing arrays with inhomogeneous, in space and time, jump probabilities and their relations to conditioned random walks and random tilings of the Aztec diamond.
