Thu, 16 Jun 2022

14:00 - 15:00
L2

Factorization in AdS/CFT

Carmen Jorge Diaz
(Oxford University)
Abstract
Junior Strings is a seminar series where DPhil students present topics of common interest that do not necessarily overlap with their own research area. This is primarily aimed at PhD students and post-docs, but everyone is welcome.
Wed, 01 Jun 2022

10:30 - 17:30
L2

One-Day Meeting in Combinatorics

Multiple
Further Information

The speakers are Gabor Lugosi (Barcelona), Gal Kronenberg (Oxford), Paul Balister (Oxford), Julia Wolf (Cambridge), and David Wood (Monash). Please see the event website for further details including titles, abstracts, and timings. Anyone interested is welcome to attend, and no registration is required.

Mon, 23 May 2022

15:30 - 16:30
L2

"Constructing global solutions to energy supercritical PDEs"

Mouhamadou Sy
(Imperial College, London)
Abstract

 "In this talk, we will discuss invariant measures techniques to establish probabilistic global well-posedness for PDEs. We will go over the limitations that the Gibbs measures and the so-called fluctuation-dissipation measures encounter in the context of energy-supercritical PDEs. Then, we will present a new approach combining the two aforementioned methods and apply it to the energy supercritical Schrödinger equations. We will point out other applications as well."

Mon, 16 May 2022

15:30 - 16:30
L2

Mean field games with common noise and arbitrary utilities

Thaleia Zariphopoulou
(University of Texas at Austin)
Abstract

I will introduce a class of mean-field games under forward performance and for general risk preferences. Players interact through competition in fund management, driven by relative performance concerns in an asset diversification setting. This results in a common-noise mean-field game. I will present the value and the optimal policies of such games, as well as some concrete examples. I will also discuss the partial information case, i.e., when the risk premium is not directly observed.

Fri, 27 May 2022

15:00 - 16:00
L2

The nonlinear stability of Kerr for small angular momentum

Sergiu Klainerman
(Princeton)
Abstract

I will report on my most recent results with Jeremie Szeftel and Elena Giorgi, which conclude the proof of the nonlinear, unconditional stability of slowly rotating Kerr metrics. The main part of the proof, announced last year, was conditional on results concerning boundedness and decay estimates for nonlinear wave equations. I will review the old results and discuss how the conditional results can now be fully established.

Thu, 16 Jun 2022

12:00 - 13:00
L2

Repulsive Geometry

Keenan Crane
(Carnegie Mellon University, School of Computer Science)
Further Information


Keenan Crane is the Michael B. Donohue Associate Professor in the School of Computer Science at Carnegie Mellon University, and a member of the Center for Nonlinear Analysis in the Department of Mathematical Sciences.  He is a Packard Fellow and recipient of the NSF CAREER Award, was a Google PhD Fellow in the Department of Computing and Mathematical Sciences at Caltech, and was an NSF Mathematical Postdoctoral Research Fellow at Columbia University.  His work applies insights from differential geometry and computer science to develop fundamental algorithms for working with real-world geometric data.  This work has been used in production at Fortune 500 companies, and featured in venues such as Communications of the ACM and Notices of the AMS, as well as in the popular press through outlets such as WIRED, Popular Mechanics, National Public Radio, and Scientific American.

Abstract

Numerous applications in geometric, visual, and scientific computing rely on the ability to nicely distribute points in space according to a repulsive potential. In contrast, there has been relatively little work on equidistribution of higher-dimensional geometry like curves and surfaces, which in many contexts must not pass through themselves or each other. This talk explores methods for optimization of curve and surface geometry while avoiding (self-)collision. The starting point is the tangent-point energy of Buck & Orloff, which penalizes pairs of points that are close in space but distant with respect to geodesic distance. We develop a discretization of this energy and introduce a novel preconditioning scheme based on a fractional Sobolev inner product. We further accelerate this scheme via hierarchical approximation, and describe how to incorporate it into a constrained optimization framework. Finally, we explore how this machinery can be applied to problems in mathematical visualization, geometric modeling, and geometry processing.
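
To make the tangent-point idea concrete, here is a minimal, naive O(n^2) sketch of a discretized tangent-point energy for a closed polyline, assuming the standard kernel |P_perp(y - x)|^alpha / |y - x|^beta evaluated at edge midpoints; the preconditioning and hierarchical acceleration described in the talk are not attempted here.

```python
import numpy as np

def tangent_point_energy(pts, alpha=2.0, beta=4.0):
    """Naive O(n^2) discretization of the tangent-point energy of a closed
    polyline.  pts : (n, 3) array of vertex positions of a closed curve."""
    nxt = np.roll(pts, -1, axis=0)
    mids = 0.5 * (pts + nxt)            # edge midpoints
    edges = nxt - pts
    lens = np.linalg.norm(edges, axis=1)
    tangents = edges / lens[:, None]    # unit edge tangents

    energy = 0.0
    n = len(pts)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = mids[j] - mids[i]
            r = np.linalg.norm(d)
            # component of d orthogonal to the tangent at edge i
            d_perp = d - np.dot(d, tangents[i]) * tangents[i]
            # kernel |P_perp d|^alpha / |d|^beta, weighted by edge lengths
            energy += (np.linalg.norm(d_perp) ** alpha / r ** beta) * lens[i] * lens[j]
    return energy

# example: energy of a discretized circle
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
circle = np.stack([np.cos(theta), np.sin(theta), np.zeros_like(theta)], axis=1)
print(tangent_point_energy(circle))
```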


Fri, 20 May 2022

16:00 - 17:00
L2

New perspectives for higher-order methods in convex optimisation

Yurii Nesterov
(Université catholique de Louvain)
Further Information

This colloquium is the annual Maths-Stats colloquium, held jointly with the Statistics department.

Abstract
In recent years, the most important developments in optimization have been related to clarifying the capabilities of higher-order methods. These schemes have a potentially much higher rate of convergence than lower-order methods. However, the possibility of implementing them as practically efficient algorithms was questionable for decades. In this talk, we discuss different possibilities for advancing in this direction, which avoid the standard concerns about tensor methods (memory requirements, the complexity of computing the tensor components, etc.). Moreover, in this way we obtain new second-order methods with memory, which converge provably faster than the conventional upper limits provided by complexity theory.
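
As an illustration of the kind of higher-order scheme being discussed, the sketch below implements one step of the classical cubically regularized Newton method (a regularized second-order Taylor model); it is a generic textbook-style example offered for context, not the new second-order methods with memory announced in the talk.

```python
import numpy as np

def cubic_newton_step(grad, hess, M, iters=60):
    """One step of cubically regularized Newton: minimize the model
    <g, h> + 0.5 h^T H h + (M/6) ||h||^3 for convex H.
    The subproblem is solved by bisection on r = ||h||, using
    h(r) = -(H + (M r / 2) I)^{-1} g and seeking ||h(r)|| = r."""
    n = len(grad)
    lo, hi = 0.0, 1.0
    # grow the bracket until ||h(hi)|| <= hi
    while True:
        h = np.linalg.solve(hess + 0.5 * M * hi * np.eye(n), -grad)
        if np.linalg.norm(h) <= hi:
            break
        hi *= 2.0
    # bisection on the scalar r
    for _ in range(iters):
        r = 0.5 * (lo + hi)
        h = np.linalg.solve(hess + 0.5 * M * r * np.eye(n), -grad)
        if np.linalg.norm(h) > r:
            lo = r
        else:
            hi = r
    return h

# toy example: one step on the convex function f(x) = ||x||^4 + 0.5 ||x||^2
x = np.array([1.0, -2.0])
g = 4 * np.dot(x, x) * x + x
H = 8 * np.outer(x, x) + (4 * np.dot(x, x) + 1) * np.eye(2)
print(x + cubic_newton_step(g, H, M=10.0))
```
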
Fri, 13 May 2022

15:00 - 16:00
L2

Non-Euclidean Data Analysis (and a lot of questions)

John Aston
(University of Cambridge)
Abstract

The statistical analysis of data which lies in a non-Euclidean space has become increasingly common over the last decade, starting from the point of view of shape analysis, but also being driven by a number of novel application areas. However, while there are a number of interesting avenues this analysis has taken, particularly around positive definite matrix data and data which lies in function spaces, it has increasingly raised more questions than answers. In this talk, I'll introduce some non-Euclidean data from applications in brain imaging and in linguistics, but spend considerable time asking questions, where I hope the interaction of statistics and topological data analysis (understood broadly) could potentially start to bring understanding into the applications themselves.
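
As one concrete example of a non-Euclidean summary for positive definite matrix data, here is a small sketch of a Fréchet mean under the log-Euclidean metric; this is a standard construction offered purely for illustration, not material from the talk.

```python
import numpy as np

def spd_log(S):
    # matrix logarithm of a symmetric positive-definite matrix via eigendecomposition
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T

def spd_exp(S):
    # matrix exponential of a symmetric matrix via eigendecomposition
    w, V = np.linalg.eigh(S)
    return (V * np.exp(w)) @ V.T

def log_euclidean_mean(mats):
    """Fréchet mean of SPD matrices under the log-Euclidean metric:
    average the matrix logarithms, then exponentiate back."""
    return spd_exp(np.mean([spd_log(S) for S in mats], axis=0))

# toy example: mean of two covariance-like matrices
A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.0, -0.2], [-0.2, 3.0]])
print(log_euclidean_mean([A, B]))
```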

Fri, 13 May 2022

10:00 - 11:00
L2

Generalizing the fast Fourier transform to handle missing input data

Keith Briggs
(BT)
Abstract

The discrete Fourier transform is fundamental in modern communication systems. It is used to generate and process (i.e. modulate and demodulate) the signals transmitted in 4G, 5G, and Wi-Fi systems, and is always implemented by one of the fast Fourier transform (FFT) algorithms. It is possible to generalize the FFT to work correctly on input vectors with periodic missing values. I will consider whether this has applications, such as more general transmitted signal waveforms, or further applications such as spectral density estimation for time series with missing data. More speculatively, can we generalize to "recursive" missing values, where the non-missing blocks themselves have gaps? If so, how do we optimally recognize such a pattern in a given time series?
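
For orientation, the following is a naive least-squares sketch of recovering Fourier coefficients from a signal with a periodic pattern of missing samples. It illustrates what "working correctly" with missing values could mean, but at O(n^2) cost rather than the FFT-like cost the talk is concerned with; it is not the speaker's algorithm.

```python
import numpy as np

def fourier_coeffs_with_gaps(x, missing_period, missing_offset, n_freq):
    """Fit coefficients c_k with x_j ~= sum_k c_k exp(2*pi*i*k*j/n), k < n_freq,
    to a length-n signal whose samples at index == missing_offset (mod
    missing_period) are missing.  Naive O(n^2) least-squares sketch."""
    n = len(x)
    observed = np.array([j for j in range(n) if j % missing_period != missing_offset])
    # synthesis (inverse-DFT) basis restricted to observed rows and the first n_freq modes
    F = np.exp(2j * np.pi * np.outer(observed, np.arange(n_freq)) / n)
    coeffs, *_ = np.linalg.lstsq(F, x[observed], rcond=None)
    return coeffs

# example: a band-limited two-tone signal with every 4th sample missing
n, t = 64, np.arange(64)
x = np.exp(2j * np.pi * 3 * t / n) + 0.5 * np.exp(2j * np.pi * 7 * t / n)
c = fourier_coeffs_with_gaps(x, missing_period=4, missing_offset=0, n_freq=16)
print(np.round(np.abs(c), 3))   # ~1.0 at mode 3 and ~0.5 at mode 7
```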

Fri, 10 Jun 2022

16:00 - 17:00
L2

Maths Meets Stats

Melanie Weber and Francesca Panero
Abstract

Melanie Weber 

Title: Geometric Methods for Machine Learning and Optimization

Abstract: A key challenge in machine learning and optimization is the identification of geometric structure in high-dimensional data. Such structural understanding is of great value for the design of efficient algorithms and for developing fundamental guarantees for their performance. Motivated by the observation that many applications involve non-Euclidean data, such as graphs, strings, or matrices, we discuss how Riemannian geometry can be exploited in machine learning and optimization. First, we consider the task of learning a classifier in hyperbolic space. Such spaces have received a surge of interest for representing large-scale, hierarchical data, since they achieve better representation accuracy with fewer dimensions. Second, we consider the problem of optimizing a function on a Riemannian manifold. Specifically, we will consider classes of optimization problems where exploiting Riemannian geometry can deliver algorithms that are computationally superior to standard (Euclidean) approaches.
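
As a generic illustration of optimizing a function on a Riemannian manifold, here is a small sketch of Riemannian gradient descent, using the unit sphere (rather than hyperbolic space) for simplicity; it is not code from the talk.

```python
import numpy as np

def riemannian_gd_sphere(egrad, x0, steps=500, lr=0.05):
    """Riemannian gradient descent on the unit sphere: project the Euclidean
    gradient onto the tangent space, take a step, then retract by renormalizing."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        g = egrad(x)
        rgrad = g - np.dot(g, x) * x         # tangent-space projection
        x = x - lr * rgrad                   # step in the tangent direction
        x = x / np.linalg.norm(x)            # retraction back onto the sphere
    return x

# example: leading eigenvector of a symmetric matrix by minimizing -x^T A x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
x = riemannian_gd_sphere(lambda x: -2 * A @ x, np.array([1.0, 0.3]))
print(x, x @ A @ x)   # converges to the top eigenvector / largest eigenvalue
```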


Francesca Panero

Title: A general overview of the different projects explored during my DPhil in Statistics.

Abstract: In the first half of the talk, I will present my work on statistical models for complex networks. I will propose a model for sparse spatial random graphs, underpinned by Bayesian nonparametric theory, and present asymptotic properties of a more general class of these models regarding sparsity, degree distribution, and clustering coefficients.

The second half will be devoted to the statistical quantification of the risk of disclosure, a quantity used to evaluate the level of privacy that can be achieved by publishing a microdata file without modifications. I propose two ways to estimate the risk of disclosure, using both frequentist and Bayesian nonparametric statistics.
