Lie Polynomials and a Twistorial Correspondence for Amplitudes
Frost, H., Mason, L. (09 Dec 2019)
Recursion and worldsheet formulae for 6d superamplitudes
Albonico, G., Geyer, Y., Mason, L. (16 Jan 2020)
Gluon scattering on self-dual radiative gauge fields
Adamo, T., Mason, L., Sharma, A. (28 Oct 2020)
A Lie bracket for the momentum kernel
Frost, H., Mafra, C., Mason, L. (01 Dec 2020)
Ambitwistor Strings in Six and Five Dimensions
Geyer, Y., Mason, L., Skinner, D. (30 Dec 2020)
Twistor sigma models for quaternionic geometry and graviton scattering
Adamo, T., Mason, L., Sharma, A. (31 Mar 2021)
Tue, 07 May 2024
15:00
L6

Oka manifolds and their role in complex analysis and geometry

Franc Forstneric
Abstract

Oka theory is about the validity of the h-principle in complex analysis and geometry. In this expository lecture, I will trace its main developments, from the classical results of Kiyoshi Oka (1939) and Hans Grauert (1958), through the seminal work of Mikhail Gromov (1989), to the introduction of Oka manifolds (2009) and the present state of knowledge. The lecture does not assume any prior exposure to this theory.

Model Integration in Computational Biology: The Role of Reproducibility, Credibility and Utility.
Karr, J., Malik-Sheriff, R., Osborne, J., Gonzalez-Parra, G., Forgoston, E., Bowness, R., Liu, Y., Thompson, R., Garira, W., Barhak, J., Rice, J., Torres, M., Dobrovolny, H., Tang, T., Waites, W., Glazier, J., Faeder, J., Kulesza, A. Frontiers in Systems Biology, volume 2, 822606 (07 Jan 2022)
Mon, 06 May 2024

14:00 - 15:00
Lecture Room 3

Bayesian Interpolation with Linear and Shaped Neural Networks

Boris Hanin
(Princeton University)
Abstract

This talk, based on joint work with Alexander Zlokapa, concerns Bayesian inference with neural networks. 

I will begin by presenting a result giving exact non-asymptotic formulas for Bayesian posteriors in deep linear networks. A key takeaway is the appearance of a novel scaling parameter, given by (# data × depth) / width, which controls the effective depth of the posterior in the limit of large model and dataset size. 

Additionally, I will explain some more recent results on the role of this effective depth parameter in Bayesian inference with deep non-linear neural networks that have shaped activations.
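
As a rough illustration of the scaling parameter mentioned above, the short sketch below (my own notation and numbers, not taken from the talk) evaluates (# data × depth) / width for a few network sizes; on the reading suggested by the abstract, small values correspond to a posterior that behaves like that of a shallow model, while large values make the depth felt.

```python
# Illustrative only: the effective-depth parameter lambda = (# data * depth) / width
# mentioned in the abstract, evaluated for a few assumed network configurations.
def effective_depth(n_data: int, depth: int, width: int) -> float:
    """Scaling parameter (# data * depth) / width."""
    return n_data * depth / width

# Hypothetical sizes chosen only to span small and large values of lambda.
for n_data, depth, width in [(1_000, 10, 100_000), (1_000, 10, 100), (100_000, 50, 1_000)]:
    lam = effective_depth(n_data, depth, width)
    print(f"N={n_data}, L={depth}, W={width} -> lambda = {lam:g}")
```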

Mon, 04 Mar 2024

14:00 - 15:00
Lecture Room 3

On transport methods for simulation-based inference and data assimilation

Prof Youssef Marzouk
(MIT)
Abstract

Many practical Bayesian inference problems fall into the simulation-based or "likelihood-free" setting, where evaluations of the likelihood function or prior density are unavailable or intractable; instead one can only draw samples from the joint parameter-data prior. Learning conditional distributions is essential to the solution of these problems. 
To this end, I will discuss a powerful class of methods for conditional density estimation and conditional simulation based on transportation of measure. An important application for these methods lies in data assimilation for dynamical systems, where transport enables new approaches to nonlinear filtering and smoothing. 
To illuminate some of the theoretical underpinnings of these methods, I will discuss recent work on monotone map representations, optimization guarantees for learning maps from data, and the statistical convergence of transport-based density estimators.
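
To make conditional simulation by transport concrete, here is a minimal sketch under strong simplifying assumptions: the transport map is restricted to a linear, lower-triangular (data-then-parameter) form fitted by least squares, which amounts to an ensemble-Kalman-style update rather than the nonlinear monotone maps discussed in the talk. The toy model, variable names, and observed value y_star are illustrative assumptions, not part of the speaker's method.

```python
# Minimal sketch: linear triangular transport for conditional simulation.
# Assumes only joint samples (theta, y) from the parameter-data prior are available,
# as in the likelihood-free setting described above.
import numpy as np

rng = np.random.default_rng(0)

# Toy joint prior: theta ~ N(0, 1), y = theta^3 + Gaussian noise (assumed example).
n = 5000
theta = rng.normal(size=n)
y = theta**3 + 0.5 * rng.normal(size=n)

# Fit the linear component theta ≈ a + b*y by least squares; the prior residuals
# play the role of the reference samples pushed forward by the map.
A = np.column_stack([np.ones(n), y])
coef, *_ = np.linalg.lstsq(A, theta, rcond=None)
resid = theta - A @ coef

# Conditional simulation at an observed value y_star: compose the fitted map with
# the prior residuals to get approximate samples from p(theta | y = y_star).
y_star = 1.5
theta_given_y = coef[0] + coef[1] * y_star + resid

print("approximate conditional mean:", theta_given_y.mean())
print("approximate conditional std: ", theta_given_y.std())
```

In practice the linear regression step would be replaced by a monotone nonlinear map learned by optimisation, which is where the map representations and convergence guarantees mentioned in the abstract enter.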
 
