Wed, 11 Nov 2020
10:00
Virtual

Extending Leighton's Graph Covering Theorem

Sam Shepherd
(University of Oxford)
Abstract

Leighton's Theorem states that if two finite graphs have a common universal cover, then they have a common finite cover. I will explore various ways in which this result can and can't be extended.
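
For context, an illustrative aside (not part of the abstract): by a classical fact going back to Angluin and Leighton, two finite connected graphs have a common universal cover exactly when they have the same degree refinement, which colour refinement computes. The sketch below refines two graphs jointly and compares their stable colour sets; the test graphs are my own choice.

```python
# Illustrative aside, not part of the abstract. Classical fact
# (Angluin, Leighton): two finite connected graphs share a universal
# cover iff they have the same degree refinement, which colour
# refinement computes. Graphs below are adjacency dicts.

def colour_refinement(adj):
    """Stable colouring of {vertex: [neighbours]} by iterated refinement."""
    colour = {v: 0 for v in adj}  # start with a uniform colouring
    while True:
        # a vertex's signature = its colour + multiset of neighbour colours
        sig = {v: (colour[v], tuple(sorted(colour[u] for u in adj[v])))
               for v in adj}
        palette = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        new = {v: palette[sig[v]] for v in adj}
        if new == colour:
            return colour
        colour = new

def same_universal_cover(g, h):
    """Refine g and h jointly and compare their stable colour sets."""
    union = {('g', v): [('g', u) for u in g[v]] for v in g}
    union.update({('h', v): [('h', u) for u in h[v]] for v in h})
    col = colour_refinement(union)
    side = lambda s: {c for (t, _), c in col.items() if t == s}
    return side('g') == side('h')

# K_4 and K_{3,3}: non-isomorphic, but both connected and 3-regular,
# so both are covered by the 3-regular tree -- this prints True.
k4 = {i: [j for j in range(4) if j != i] for i in range(4)}
k33 = {i: [(i + 1) % 6, (i - 1) % 6, (i + 3) % 6] for i in range(6)}
print(same_universal_cover(k4, k33))
```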

Fri, 30 Oct 2020
14:00
Virtual

Classifying Superconformal Defects in Diverse Dimensions

Yifan Wang
(Harvard)
Abstract

We explore general constraints from unitarity, defect superconformal symmetry and locality of bulk-defect couplings to classify possible superconformal defects in superconformal field theories (SCFTs) of spacetime dimension d>2. Despite the general absence of locally conserved currents, the defect CFT contains new distinguished operators with protected quantum numbers that account for the broken bulk symmetries. Consistency with the preserved superconformal symmetry and unitarity requires that such operators arrange into unitary multiplets of the defect superconformal algebra, which in turn leads to nontrivial constraints on what kinds of defects are admissible in a given SCFT. We will focus on the case of superconformal lines in this talk and comment on several interesting implications of our analysis, such as symmetry-enforced defect conformal manifolds, defect RG flows and possible nontrivial one-form symmetries in various SCFTs.
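
A standard example of such a distinguished operator, included here for concreteness (textbook defect-CFT material rather than a result of the talk), is the displacement operator: broken transverse translations turn stress-tensor conservation into a Ward identity with a delta-function source on the defect, and that identity protects the operator's dimension.

```latex
% Displacement operator of a p-dimensional conformal defect in d bulk
% dimensions (signs and normalisations are convention dependent):
\begin{align}
  \partial_\mu T^{\mu i}(x) &= \delta^{(d-p)}(x_\perp)\,\mathrm{D}^i(\hat{x}),
  % broken transverse translations source the conservation law
  \\
  \Delta_{\mathrm{D}} &= p + 1.
  % protected by the Ward identity; a superconformal line (p = 1)
  % therefore carries a dimension-2 displacement operator
\end{align}
```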

Optimal growth of counter-rotating vortex pairs interacting with walls
Dehtyriov, D., Hourigan, K., Thompson, M. Journal of Fluid Mechanics, volume 904, A10 (10 Dec 2020)
Cross-device cross-anatomy adaptation network for ultrasound video analysis
Chen, Q., Liu, Y., Hu, Y., Self, A., Papageorghiou, A., Noble, J. Medical Ultrasound, and Preterm, Perinatal and Paediatric Image Analysis, 42-51 (01 Oct 2020)
Fri, 20 Nov 2020
12:00 - 13:00

Selection Dynamics for Deep Neural Networks

Peter Markowich
(KAUST)
Abstract

We present a partial differential equation framework for deep residual neural networks and for the associated learning problem. This is done by carrying out the continuum limits of neural networks with respect to width and depth. We study the well-posedness, the large-time behavior of solutions, and the characterization of the steady states of the forward problem. Several useful time-uniform estimates and stability/instability conditions are presented. We state and prove optimality conditions for the inverse deep learning problem, using standard variational calculus, the Hamilton-Jacobi-Bellman equation and the Pontryagin maximum principle. This serves to establish a mathematical foundation for investigating the algorithmic and theoretical connections between neural networks, PDE theory, variational analysis, optimal control, and deep learning.

This is based on joint work with Hailiang Liu.
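
The depth continuum can be made concrete in a few lines: a residual block x_{k+1} = x_k + h f(x_k, theta_k) is exactly a forward Euler step for the ODE dx/dt = f(x, theta(t)), which is the starting point for the PDE and control-theoretic viewpoint. A minimal sketch (my own illustrative parameterization with tanh blocks and random weights, not the authors' construction):

```python
import numpy as np

# A residual network read as the forward-Euler discretisation of
# x'(t) = f(x(t), theta(t)) on [0, T]: one residual block per time step.
# The tanh layer and random weights are illustrative choices only.

rng = np.random.default_rng(0)
dim, depth, T = 4, 50, 1.0
h = T / depth  # step size = layer spacing in the depth continuum

# theta(t) sampled on the grid: one (W, b) per layer
weights = [(rng.normal(size=(dim, dim)) / np.sqrt(dim),
            rng.normal(size=dim)) for _ in range(depth)]

def f(x, W, b):
    """Velocity field of the continuum network (a tanh residual block)."""
    return np.tanh(W @ x + b)

def resnet_forward(x):
    """x_{k+1} = x_k + h * f(x_k, theta_k): forward Euler in depth."""
    for W, b in weights:
        x = x + h * f(x, W, b)
    return x

x0 = rng.normal(size=dim)
print(resnet_forward(x0))  # network output = approximate flow map at time T
```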

Fri, 13 Nov 2020
12:00 - 13:00

Computational Hardness of Hypothesis Testing and Quiet Plantings

Afonso Bandeira
(ETH Zurich)
Abstract

When faced with a data analysis, learning, or statistical inference problem, the amount and quality of data available fundamentally determine whether such tasks can be performed with certain levels of accuracy. With the growing size of datasets, however, it is crucial not only that the underlying statistical task is possible, but also that it is doable by means of efficient algorithms. In this talk we will discuss methods aiming to establish limits on when statistical tasks are possible with computationally efficient methods, or when there is a fundamental "Statistical-to-Computational gap" in which an inference task is statistically possible but inherently computationally hard. We will focus on Hypothesis Testing and the "Low Degree Method" and also address hardness of certification via "quiet plantings". Guiding examples will include Sparse PCA, bounds on the Sherrington-Kirkpatrick Hamiltonian, and lower bounds on Chromatic Numbers of random graphs.
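
The flavour of these detection questions shows up already in a toy problem (my illustration, not material from the talk): distinguishing a Wigner matrix from one with a planted rank-one spike. The top eigenvalue separates from the bulk only when the signal strength exceeds the BBP threshold lambda = 1; below it, this spectral test sees pure noise.

```python
import numpy as np

# Toy hypothesis test (illustrative, not from the talk): observe
# Y = (lambda/n) x x^T + W with W a Wigner matrix and ||x||^2 = n;
# H0 is lambda = 0. Above the BBP threshold lambda > 1, the top
# eigenvalue of Y pops out of the bulk, whose edge sits near 2.

rng = np.random.default_rng(1)

def top_eigenvalue(lam, n=400):
    G = rng.normal(size=(n, n))
    W = (G + G.T) / np.sqrt(2 * n)        # Wigner matrix, bulk edge ~ 2
    x = rng.normal(size=n)
    x *= np.sqrt(n) / np.linalg.norm(x)   # normalise so ||x||^2 = n
    Y = W + (lam / n) * np.outer(x, x)
    return np.linalg.eigvalsh(Y)[-1]      # eigvalsh is ascending

for lam in [0.0, 0.5, 1.5, 2.0]:
    # below threshold the statistic stays near 2; above it, near lam + 1/lam
    print(lam, round(top_eigenvalue(lam), 3))
```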

Fri, 06 Nov 2020
12:00 - 13:00

Bridging GANs and Stochastic Analysis

Haoyang Cao
(Alan Turing Institute)
Abstract

Generative adversarial networks (GANs) have enjoyed tremendous success in image generation and processing, and have recently attracted growing interest in other fields of application. In this talk we will start by analyzing the connection between GANs and mean field games (MFGs) as well as optimal transport (OT). We will first show a conceptual connection between GANs and MFGs: MFGs have the structure of GANs, and GANs are MFGs under the Pareto optimality criterion. Interpreting MFGs as GANs, on one hand, will enable a GANs-based algorithm (MFGANs) to solve MFGs: one neural network (NN) for the backward Hamilton-Jacobi-Bellman (HJB) equation and one NN for the Fokker-Planck (FP) equation, with the two NNs trained in an adversarial way. Viewing GANs as MFGs, on the other hand, will reveal a new and probabilistic aspect of GANs. This new perspective, moreover, will lead to an analytical connection between GANs and OT problems, and to sufficient conditions for the minimax games of GANs to be reformulated in the framework of OT. Building on the probabilistic view of GANs, we will then establish an approximation of GAN training via stochastic differential equations and demonstrate the convergence of GAN training via invariant measures of SDEs under suitable conditions. This stochastic analysis of GAN training can serve as an analytical tool for studying its evolution and stability.
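
The SDE approximation can be illustrated on a standard toy minimax problem (my example, not the speaker's): noisy gradient descent-ascent on f(theta, phi) = theta * phi, read as an Euler-Maruyama discretisation of two coupled SDEs. The drift just rotates around the saddle at the origin, and the discretisation plus noise push the iterates slowly outward, so training never settles.

```python
import numpy as np

# Illustrative sketch (not from the talk): noisy gradient
# descent-ascent on the bilinear toy objective f(theta, phi) =
# theta * phi, i.e. the Euler-Maruyama scheme for the coupled SDEs
#   d theta = -phi dt + sigma dB_1,   d phi = theta dt + sigma dB_2.
# The drift is purely rotational, so the iterates circle the saddle
# at the origin; explicit discretisation and the noise drive them
# slowly outward -- a caricature of unstable GAN training dynamics.

rng = np.random.default_rng(2)
theta, phi = 1.0, 0.0
dt, sigma, steps = 0.01, 0.05, 5000

radii = []
for _ in range(steps):
    noise = sigma * np.sqrt(dt) * rng.normal(size=2)
    theta, phi = (theta - phi * dt + noise[0],   # generator: descent in theta
                  phi + theta * dt + noise[1])   # discriminator: ascent in phi
    radii.append(np.hypot(theta, phi))

print(radii[0], radii[-1])  # the radius drifts outward: no convergence
```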

Higher-Order QCD Corrections to Higgs Boson Transverse-Momentum Distribution
Caola, F., Kudashkin, K., Lindert, J., Melnikov, K., Monni, P., Tancredi, L., Wever, C. Frascati Physics Series, volume 70, 27-31 (01 Jan 2017)
The distance between the two BBM leaders
Berestycki, J., Brunet, É., Graham, C., Mytnik, L., Roquejoffre, J., Ryzhik, L. (20 Oct 2020)