Fri, 30 Oct 2020

12:00 - 13:00

Neural differential equations in machine learning

Patrick Kidger
(Oxford Mathematics)
Abstract

Differential equations and neural networks are two of the most widespread modelling paradigms. I will talk about how to combine the best of both worlds through neural differential equations. These treat differential equations as a learnt component of a differentiable computation graph, and as such integrate tightly with current machine learning practice. Applications are widespread. I will begin with an introduction to the theory of neural ordinary differential equations, which may, for example, be used to model unknown physics. I will then discuss recent work on neural controlled differential equations, which are state-of-the-art models for (arbitrarily irregular) time series. Next I will discuss neural stochastic differential equations: we will see that the mathematics of SDEs is precisely aligned with the machine learning of GANs, and thus NSDEs may be used as generative models. If time allows, I will then discuss other recent work, such as how the training of neural differential equations may be sped up by ~40% by tweaking standard numerical solvers to respect the particular nature of the differential equations. This is joint work with Ricky T. Q. Chen, Xuechen Li, James Foster, and James Morrill.
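To make the central idea concrete, here is a minimal sketch of a neural ODE, in which the vector field of the differential equation is a small neural network and the ODE solve sits inside the computation graph. This is an illustrative assumption on my part, not material from the talk: it uses PyTorch together with the torchdiffeq library, and the names VectorField, dim, and hidden are hypothetical.

    # A minimal neural ODE sketch, assuming PyTorch + torchdiffeq
    # (neither library is prescribed by the abstract).
    import torch
    from torchdiffeq import odeint

    class VectorField(torch.nn.Module):
        # The learnt right-hand side f_theta(t, y) of dy/dt = f_theta(t, y).
        def __init__(self, dim, hidden):
            super().__init__()
            self.net = torch.nn.Sequential(
                torch.nn.Linear(dim, hidden),
                torch.nn.Tanh(),
                torch.nn.Linear(hidden, dim),
            )

        def forward(self, t, y):
            return self.net(y)

    func = VectorField(dim=2, hidden=64)
    y0 = torch.randn(16, 2)            # a batch of initial conditions
    t = torch.linspace(0.0, 1.0, 10)   # times at which to evaluate the solution
    ys = odeint(func, y0, t)           # shape (10, 16, 2)
    ys[-1].pow(2).mean().backward()    # gradients flow back through the solve

The key point is the last line: the ODE solve is differentiable with respect to the vector field's parameters, so the differential equation can be trained by gradient descent like any other network layer.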

Thu, 11 Dec 2014
16:00
L1

The Story of Equations

Andrew Wiles
(Oxford Mathematics)
Abstract

We are pleased to announce that Andrew Wiles will present the inaugural Oxford Mathematics Christmas Public Lecture. Please register by emailing @email
