Thu, 04 Nov 2021
14:00
L4

Rational approximation and beyond, or, What I did during the pandemic

Nick Trefethen
(Mathematical Institute (University of Oxford))
Abstract

The past few years have been an exciting time for my work related to rational approximation.  This talk will present four developments:

1. AAA approximation (2016, with Nakatsukasa & Sète)
2. Root-exponential convergence and tapered exponential clustering (2020, with Nakatsukasa & Weideman)
3. Lightning (2017-2020, with Gopal & Brubeck)
4. Log-lightning (2020-21, with Nakatsukasa & Baddoo)

Two other topics will not be discussed:

X. AAA-Lawson approximation (2018, with Nakatsukasa)
Y. AAA-LS approximation (2021, with Costa)
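
For readers who have not seen item 1 above: the AAA algorithm constructs a rational approximant in barycentric form, r(z) = n(z)/d(z) with n(z) = sum_k w_k f_k/(z - z_k) and d(z) = sum_k w_k/(z - z_k), greedily adding the support point where the current error is largest and choosing the weights w_k as the minimal right singular vector of a Loewner least-squares matrix. The following is a minimal NumPy sketch of that loop, intended only as an illustration (the reference implementation is aaa in Chebfun); the function name, tolerance and sample grid here are illustrative choices.

import numpy as np

def aaa(Z, F, tol=1e-13, mmax=100):
    """Minimal AAA sketch: greedy barycentric rational approximation of F on Z.
    Returns support points zj, values fj and weights wj defining
    r(z) = sum(wj*fj/(z-zj)) / sum(wj/(z-zj))."""
    Z = np.asarray(Z, dtype=complex)
    F = np.asarray(F, dtype=complex)
    mask = np.ones(len(Z), dtype=bool)          # True = not yet a support point
    zj, fj = [], []
    wj = np.array([1.0 + 0j])
    R = np.full(len(Z), F.mean())               # current approximation on the sample set
    for _ in range(mmax):
        j = np.argmax(np.abs(F - R) * mask)     # greedy step: largest residual
        zj.append(Z[j]); fj.append(F[j]); mask[j] = False
        zs, fs = np.array(zj), np.array(fj)
        # Loewner matrix over the remaining sample points
        A = (F[mask, None] - fs[None, :]) / (Z[mask, None] - zs[None, :])
        # weights = right singular vector for the smallest singular value
        wj = np.linalg.svd(A, full_matrices=False)[2][-1].conj()
        # evaluate the barycentric form at the remaining points
        C = 1.0 / (Z[mask, None] - zs[None, :])
        R = F.copy()
        R[mask] = (C @ (wj * fs)) / (C @ wj)
        if np.max(np.abs(F - R)) <= tol * np.max(np.abs(F)):
            break
    return np.array(zj), np.array(fj), wj

# Illustration: approximate exp on [-1, 1], then evaluate at a new point.
Z = np.linspace(-1, 1, 1000)
zj, fj, wj = aaa(Z, np.exp(Z))
z = 0.123456
r = np.sum(wj * fj / (z - zj)) / np.sum(wj / (z - zj))
print(len(zj), abs(r - np.exp(z)))              # few support points, error near machine precision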

Tue, 01 Jun 2021

12:45 - 13:30

Neural Controlled Differential Equations for Online Prediction Tasks

James Morrill
(Mathematical Institute (University of Oxford))
Abstract

Neural controlled differential equations (Neural CDEs) are a continuous-time extension of recurrent neural networks (RNNs). They are considered state-of-the-art (SOTA) for modelling functions on irregular time series, outperforming other ODE benchmarks (ODE-RNN, GRU-ODE-Bayes) in offline prediction tasks. However, current implementations are not suitable for online prediction tasks, which severely restricts the domains in which this powerful modelling framework can be applied. We identify the limitations of previous implementations and show how they can be addressed, most notably to allow for online predictions. We benchmark our online Neural CDE model on three continuous monitoring tasks from the MIMIC-IV ICU database, demonstrating improved performance against SOTA non-ODE benchmarks on two of the three tasks, and improved performance against our ODE benchmark on all three tasks.


Joint work with Patrick Kidger, Lingyi Yang, and Terry Lyons.
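
For context, a Neural CDE evolves a hidden state z by the controlled differential equation dz = f_theta(z) dX(t), where X is a continuous path interpolating the observed time series and f_theta maps the hidden state to a matrix with one column per input channel; in practice this is rewritten as dz/dt = f_theta(z(t)) dX/dt and handed to an ODE solver (libraries such as torchcde implement this with learned vector fields, interpolation schemes suitable for online use, and adaptive solvers). The sketch below is only a toy, fixed-step Euler illustration of that equation with an untrained vector field and hypothetical shapes; it is not the model benchmarked above.

import numpy as np

def neural_cde_euler(ts, xs, f_theta, z0, substeps=10):
    """Toy Euler solve of a neural CDE  dz = f_theta(z) dX(t).
    ts: (n,) observation times (possibly irregular); xs: (n, d) observations;
    X is the linear interpolant of (ts, xs), so dX/dt is piecewise constant;
    f_theta: maps z in R^h to a matrix in R^{h x d}; z0: initial hidden state."""
    z = np.array(z0, dtype=float)
    for i in range(len(ts) - 1):
        dt = (ts[i + 1] - ts[i]) / substeps
        dXdt = (xs[i + 1] - xs[i]) / (ts[i + 1] - ts[i])   # derivative of the linear interpolant
        for _ in range(substeps):
            z = z + dt * (f_theta(z) @ dXdt)               # dz/dt = f_theta(z) dX/dt
    return z

# Hypothetical, untrained vector field (in a real model W is learned and z0 = g_theta(t0, x0)).
rng = np.random.default_rng(0)
h, d = 8, 3
W = 0.1 * rng.standard_normal((h * d, h))
f_theta = lambda z: np.tanh(W @ z).reshape(h, d)

ts = np.array([0.0, 0.3, 1.1, 2.0])                        # irregularly spaced observation times
xs = rng.standard_normal((len(ts), d))                     # d observed channels
z_final = neural_cde_euler(ts, xs, f_theta, z0=0.1 * np.ones(h))
print(z_final.shape)                                       # (8,)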

Thu, 17 Jun 2021

14:00 - 15:00
Virtual

Wilson Loops, Cusps and Holography

Pietro Ferrero
(Mathematical Institute (University of Oxford))
Further Information

Contact organisers (Carmen Jorge-Diaz, Sujay Nair or Connor Behan) to obtain the link. 

Thu, 10 Jun 2021

14:00 - 15:00
Virtual

Random Matrices and JT Gravity

Carmen Jorge-Diaz
(Mathematical Institute (University of Oxford))
Further Information

Contact organisers (Carmen Jorge-Diaz, Sujay Nair or Connor Behan) to obtain the link. 

Thu, 03 Jun 2021

14:00 - 15:00
Virtual

Topological QFTs (Part II)

Marieke Van Beest and Sujay Nair
(Mathematical Institute (University of Oxford))
Further Information

Contact organisers (Carmen Jorge-Diaz, Sujay Nair or Connor Behan) to obtain the link. 

Thu, 27 May 2021

14:00 - 15:00
Virtual

Topological QFTs (Part I)

Marieke Van Beest and Sujay Nair
(Mathematical Institute (University of Oxford))
Further Information

Contact organisers (Carmen Jorge-Diaz, Sujay Nair or Connor Behan) to obtain the link. 

Thu, 20 May 2021

14:00 - 15:00
Virtual

Invariants of 4-Manifolds

Horia Magureanu
(Mathematical Institute (University of Oxford))
Further Information

Contact organisers (Carmen Jorge-Diaz, Sujay Nair or Connor Behan) to obtain the link. 

Thu, 06 May 2021

14:00 - 15:00
Virtual

Constructor Theory

Maria Violaris
(Mathematical Institute (University of Oxford))
Further Information

Contact organisers (Carmen Jorge-Diaz, Sujay Nair or Connor Behan) to obtain the link. 

Tue, 04 May 2021

12:45 - 13:30

Computing the Index of Saddle Points without Second Derivatives

Ambrose Yim
(Mathematical Institute (University of Oxford))
Abstract

The index of a saddle point of a smooth function is the number of descending directions at the saddle. While the index can usually be retrieved by counting the negative eigenvalues of the Hessian at the critical point, we may not have the luxury of second derivatives when working with data arising from practical applications. To address this problem, we develop a computational pipeline for estimating the index of a non-degenerate saddle point without explicitly computing the Hessian. Our framework requires only a sufficiently dense sample of level sets of the function near the saddle point. Using techniques from Morse theory and Topological Data Analysis, we show how the shape of the level sets near a saddle point can be used to infer its index. Furthermore, we derive an explicit upper bound on the density of point samples needed to infer the index, in terms of the curvature of the level sets.
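
For contrast with the Hessian-based route mentioned in the abstract, the sketch below shows that standard computation: estimate the Hessian at the critical point by central finite differences and count its negative eigenvalues (the Morse index). This is exactly the second-derivative information the level-set pipeline above is designed to work without; the step size and example function here are illustrative assumptions.

import numpy as np

def hessian_index(f, x, h=1e-4):
    """Morse index via the standard Hessian route: central finite differences,
    then count negative eigenvalues. Requires (approximate) second derivatives,
    which is what the level-set pipeline avoids."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    H = np.zeros((n, n))
    I = np.eye(n)
    for i in range(n):
        for j in range(n):
            H[i, j] = (f(x + h*I[i] + h*I[j]) - f(x + h*I[i] - h*I[j])
                       - f(x - h*I[i] + h*I[j]) + f(x - h*I[i] - h*I[j])) / (4 * h * h)
    H = (H + H.T) / 2                                  # symmetrise against rounding error
    return int(np.sum(np.linalg.eigvalsh(H) < 0))

# Example: f(x, y) = x^2 - y^2 has a non-degenerate saddle of index 1 at the origin.
print(hessian_index(lambda p: p[0]**2 - p[1]**2, [0.0, 0.0]))   # 1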
