Wed, 25 Nov 2020
10:00
Virtual

Veering Triangulations, the Teichmüller Polynomial and the Alexander Polynomial

Anna Parlak
(University of Warwick)
Abstract

Veering triangulations are a special class of ideal triangulations with a rather mysterious combinatorial definition. Their importance follows from a deep connection with pseudo-Anosov flows on 3-manifolds. Recently, Landry, Minsky and Taylor introduced a polynomial invariant of veering triangulations called the taut polynomial. It is a generalisation of an older invariant, the Teichmüller polynomial, defined by McMullen in 2002.

The aim of my talk is to demonstrate that veering triangulations provide a convenient setup for computations. More precisely, I will use fairly easy arguments to obtain a fairly strong statement which generalises the results of McMullen relating the Teichmüller polynomial to the Alexander polynomial.

I will not assume any prior knowledge of the Alexander polynomial, the Teichmüller polynomial or veering triangulations.

Wed, 18 Nov 2020
16:00
Virtual

Introduction to Left-Orderable Groups and Formal Languages

Hang Lu Su
(ICMAT Madrid)
Abstract

I will introduce left-orderable groups and discuss constructions and examples of such groups. I will then motivate studying left-orders under the framework of formal languages and discuss some recent results.
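For orientation (standard background, not specific to the talk's results), the definition is as follows:

```latex
\text{A \emph{left-order} on a group } G \text{ is a strict total order } <
\text{ invariant under left multiplication:}
\qquad g < h \;\Longrightarrow\; fg < fh \quad \text{for all } f, g, h \in G.
```

For example, $(\mathbb{R}, +)$ with its usual order is left-orderable; since any left-orderable group is torsion-free, no nontrivial finite group admits a left-order.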

Wed, 11 Nov 2020
10:00
Virtual

Extending Leighton's Graph Covering Theorem

Sam Shepherd
(University of Oxford)
Abstract

Leighton's Theorem states that if two finite graphs have a common universal cover then they have a common finite cover. I will explore various ways in which this result can and can't be extended.
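As background (this is context of our own, not the content of the talk): whether two finite connected graphs have a common universal cover can be decided by Angluin's degree-refinement criterion, which is computed by colour refinement. A minimal sketch:

```python
def color_refinement(adj):
    """Iteratively refine vertex colours by neighbour-colour multisets
    until the number of colour classes stabilises (1-WL / degree refinement)."""
    colors = {v: 0 for v in adj}
    while True:
        sig = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v]))) for v in adj}
        palette = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        new = {v: palette[sig[v]] for v in adj}
        if len(set(new.values())) == len(set(colors.values())):
            return new
        colors = new

def same_universal_cover(adj1, adj2):
    """Two finite connected graphs share a universal cover iff colour refinement
    on their disjoint union gives both graphs the same set of stable colours."""
    union = {("a", v): [("a", u) for u in ns] for v, ns in adj1.items()}
    union.update({("b", v): [("b", u) for u in ns] for v, ns in adj2.items()})
    col = color_refinement(union)
    return {col[v] for v in union if v[0] == "a"} == {col[v] for v in union if v[0] == "b"}

def cycle(n):
    return {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

# A 4-cycle and a 6-cycle are both covered by the bi-infinite path, so by
# Leighton's Theorem they also have a common finite cover (e.g. a 12-cycle).
```
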

Fri, 30 Oct 2020
14:00
Virtual

Classifying Superconformal Defects in Diverse Dimensions

Yifan Wang
(Harvard)
Abstract

We explore general constraints from unitarity, defect superconformal symmetry and locality of bulk-defect couplings to classify possible superconformal defects in superconformal field theories (SCFTs) of spacetime dimension d > 2. Despite the general absence of locally conserved currents, the defect CFT contains new distinguished operators with protected quantum numbers that account for the broken bulk symmetries. Consistency with the preserved superconformal symmetry and unitarity requires that such operators arrange into unitary multiplets of the defect superconformal algebra, which in turn leads to nontrivial constraints on what kinds of defects are admissible in a given SCFT. We will focus on the case of superconformal lines in this talk and comment on several interesting implications of our analysis, such as symmetry-enforced defect conformal manifolds, defect RG flows and possible nontrivial one-form symmetries in various SCFTs.

Fri, 20 Nov 2020

12:00 - 13:00

Selection Dynamics for Deep Neural Networks

Peter Markowich
(KAUST)
Abstract

We present a partial differential equation framework for deep residual neural networks and for the associated learning problem. This is done by carrying out the continuum limits of neural networks with respect to width and depth. We study the well-posedness, the large-time solution behavior, and the characterization of the steady states of the forward problem. Several useful time-uniform estimates and stability/instability conditions are presented. We state and prove optimality conditions for the inverse deep learning problem, using standard variational calculus, the Hamilton-Jacobi-Bellman equation and the Pontryagin maximum principle. This serves to establish a mathematical foundation for investigating the algorithmic and theoretical connections between neural networks, PDE theory, variational analysis, optimal control, and deep learning.

This is based on joint work with Hailiang Liu.
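The depth-continuum limit mentioned above is commonly motivated by viewing a residual network as a forward-Euler discretisation of an ODE. A minimal sketch of that correspondence (our own illustration, not the authors' code):

```python
def resnet_forward(x0, f, depth):
    """Residual network x_{k+1} = x_k + (1/depth) * f(x_k): this is exactly
    forward Euler for the ODE x'(t) = f(x(t)) on [0, 1] with step 1/depth."""
    x = x0
    h = 1.0 / depth
    for _ in range(depth):
        x = x + h * f(x)
    return x

# For the linear residual block f(x) = x, the depth -> infinity limit is
# x(1) = e * x0; with depth 1000 the Euler iterate is (1 + 1/1000)**1000 ≈ 2.7169.
out = resnet_forward(1.0, lambda x: x, depth=1000)
```
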

Fri, 13 Nov 2020

12:00 - 13:00

Computational Hardness of Hypothesis Testing and Quiet Plantings

Afonso Bandeira
(ETH Zurich)
Abstract

When faced with a data analysis, learning, or statistical inference problem, the amount and quality of data available fundamentally determines whether such tasks can be performed with certain levels of accuracy. With the growing size of datasets, however, it is crucial not only that the underlying statistical task is possible, but also that it is doable by means of efficient algorithms. In this talk we will discuss methods aiming to establish limits on when statistical tasks are possible with computationally efficient methods, or when there is a fundamental "statistical-to-computational gap" in which an inference task is statistically possible but inherently computationally hard. We will focus on Hypothesis Testing and the "Low Degree Method", and also address hardness of certification via "quiet plantings". Guiding examples will include Sparse PCA, bounds on the Sherrington-Kirkpatrick Hamiltonian, and lower bounds on Chromatic Numbers of random graphs.
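As context for the hypothesis-testing setting (a toy sketch of our own, not the talk's method): a canonical planted problem is detecting a rank-one spike in a Wigner matrix, where the top eigenvalue separates the two hypotheses above the BBP threshold:

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam = 300, 2.0  # dimension and signal strength (above the BBP threshold lam = 1)

def top_eig(M):
    # eigvalsh returns eigenvalues in ascending order
    return np.linalg.eigvalsh(M)[-1]

# Null hypothesis: a Wigner matrix with entries of variance 1/n;
# its top eigenvalue concentrates near 2.
G = rng.standard_normal((n, n))
W = (G + G.T) / np.sqrt(2 * n)

# Planted hypothesis: add a rank-one spike lam * v v^T;
# the top eigenvalue now concentrates near lam + 1/lam = 2.5.
v = rng.standard_normal(n)
v /= np.linalg.norm(v)
W_planted = W + lam * np.outer(v, v)

print(top_eig(W), top_eig(W_planted))
```

The interesting regimes in the talk's setting are those where no such simple, efficiently computable statistic separates the hypotheses even though they are statistically distinguishable.
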

Fri, 06 Nov 2020

12:00 - 13:00

Bridging GANs and Stochastic Analysis

Haoyang Cao
(Alan Turing Institute)
Abstract

Generative adversarial networks (GANs) have enjoyed tremendous success in image generation and processing, and have recently attracted growing interest in other fields of application. In this talk we will start by analyzing the connection between GANs and mean field games (MFGs) as well as optimal transport (OT). We will first show a conceptual connection between GANs and MFGs: MFGs have the structure of GANs, and GANs are MFGs under the Pareto Optimality criterion. Interpreting MFGs as GANs, on one hand, will enable a GANs-based algorithm (MFGANs) to solve MFGs: one neural network (NN) for the backward Hamilton-Jacobi-Bellman (HJB) equation and one NN for the Fokker-Planck (FP) equation, with the two NNs trained in an adversarial way. Viewing GANs as MFGs, on the other hand, will reveal a new and probabilistic aspect of GANs. This new perspective, moreover, will lead to an analytical connection between GANs and OT problems, and sufficient conditions for the minimax games of GANs to be reformulated in the framework of OT. Building up from the probabilistic views of GANs, we will then establish the approximation of GANs training via stochastic differential equations and demonstrate the convergence of GANs training via invariant measures of SDEs under proper conditions. This stochastic analysis for GANs training can serve as an analytical tool to study its evolution and stability.
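To illustrate the dynamical-systems viewpoint on adversarial training (a toy example of ours, not the speaker's construction): simultaneous gradient descent-ascent on the bilinear game f(x, y) = xy has a pure rotation as its continuous-time limit, while the discrete Euler iteration spirals away from the equilibrium — the kind of stability question the ODE/SDE analysis above makes precise.

```python
import math

def gda(x, y, lr, steps):
    """Simultaneous gradient descent-ascent on f(x, y) = x*y:
    x descends (step -lr*y), y ascends (step +lr*x).
    Each step multiplies (x, y) by [[1, -lr], [lr, 1]], which scales the
    radius by sqrt(1 + lr**2), so the discrete dynamics spiral outward
    even though the continuous-time flow is a rotation."""
    for _ in range(steps):
        x, y = x - lr * y, y + lr * x
    return x, y

x, y = gda(1.0, 0.0, lr=0.1, steps=100)
radius = math.hypot(x, y)
# radius = (1 + 0.1**2) ** 50 ≈ 1.64 > 1: the iterates drift away from (0, 0)
```
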

 
Thu, 19 Nov 2020

17:00 - 18:00
Virtual

Oxford Mathematics Online Public Lecture: Anna Seigal - Ideas for a Complex World

Anna Seigal
(University of Oxford)
Further Information

Humans have been processing information in the world for a long time, finding patterns and learning from our surroundings to solve problems. Today, scientists make sense of complex problems by gathering vast amounts of data, and analysing them with quantitative methods. These methods are important tools to understand the issues facing us: the spread of disease, climate change, or even political movements. But this quantitative toolbox can seem far removed from our individual approaches for processing information in our day-to-day lives. This disconnect and inaccessibility lead to the scientific tools becoming entangled in politics and questions of trust.

In this talk, Anna will describe how some of the ideas at the heart of science’s quantitative tools are familiar to us all. We’ll see how mathematics enables us to turn the ideas into tools. As a society, if we can better connect with the ideas driving this toolbox, we can see when to use (and not to use) the available tools, what’s missing from the toolbox, and how we might come up with new ideas to drive our future understanding of the world around us.

Anna Seigal is a Hooke Research Fellow in the Mathematical Institute at the University of Oxford and a Junior Research Fellow at The Queen's College.

Watch live (no need to register):
Oxford Mathematics Twitter
Oxford Mathematics Facebook
Oxford Mathematics Livestream
Oxford Mathematics YouTube

The Oxford Mathematics Public Lectures are generously supported by XTX Markets.

Tue, 26 Jan 2021

14:00 - 15:00
Virtual

Core-Periphery Structure in Directed Networks

Gesine Reinert
(University of Oxford)
Abstract

Empirical networks often exhibit different meso-scale structures, such as community and core-periphery structure. Core-periphery structure typically consists of a well-connected core, and a periphery that is well-connected to the core but sparsely connected internally. Most core-periphery studies focus on undirected networks. In this talk we discuss a generalisation of core-periphery structure to directed networks which yields a family of core-periphery blockmodel formulations in which, contrary to many existing approaches, core and periphery sets are edge-direction dependent. We shall then focus on a particular structure consisting of two core sets and two periphery sets, and introduce two measures to assess the statistical significance and quality of this structure in empirical data, where one often has no ground truth. The ideas will be illustrated on three empirical networks: faculty hiring, a world trade dataset, and political blogs.

 

This is based on joint work with Andrew Elliott, Angus Chiu, Marya Bazzi and Mihai Cucuringu, available at https://royalsocietypublishing.org/doi/pdf/10.1098/rspa.2019.0783
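As a toy illustration of the basic core-periphery pattern (our own sketch; the direction-dependent blockmodels in the paper are considerably richer): sample a directed network from a simple blockmodel and compare block densities.

```python
import numpy as np

rng = np.random.default_rng(1)
n_core, n_per = 20, 80
n = n_core + n_per
core = np.arange(n) < n_core  # first 20 nodes form the core

# Edge probabilities for a minimal directed blockmodel: dense within the core,
# moderate between core and periphery, sparse within the periphery.
P = np.where(np.outer(core, core), 0.8,
    np.where(np.outer(core, ~core) | np.outer(~core, core), 0.3,
             0.02))
A = (rng.random((n, n)) < P).astype(int)  # directed adjacency matrix
np.fill_diagonal(A, 0)                    # no self-loops

def density(A, rows, cols):
    """Fraction of present edges in the block from `rows` to `cols`."""
    return A[np.ix_(rows, cols)].mean()

core_idx = np.where(core)[0]
per_idx = np.where(~core)[0]
print(density(A, core_idx, core_idx), density(A, per_idx, per_idx))
```

The statistical question addressed in the talk is how to certify that such a block structure found in empirical data is significant, given that no ground truth is available.
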
