Seminar series
Date
Fri, 30 Oct 2020
Time
12:00 - 13:00
Speaker
Patrick Kidger
Organisation
Oxford Mathematics

Differential equations and neural networks are two of the most widespread modelling paradigms. I will talk about how to combine the best of both worlds through neural differential equations. These treat differential equations as a learnt component of a differentiable computation graph, and as such integrate tightly with current machine learning practice. Applications are widespread. I will begin with an introduction to the theory of neural ordinary differential equations, which may, for example, be used to model unknown physics. I will then move on to recent work on neural controlled differential equations, which are state-of-the-art models for (arbitrarily irregular) time series. Next, I will discuss neural stochastic differential equations: we will see that the mathematics of SDEs aligns precisely with the machine learning of GANs, and thus neural SDEs may be used as generative models. If time allows, I will then discuss other recent work, such as how the training of neural differential equations may be sped up by ~40% by tweaking standard numerical solvers to respect the particular nature of the differential equations. This is joint work with Ricky T. Q. Chen, Xuechen Li, James Foster, and James Morrill.
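To make the central idea concrete, below is a minimal illustrative sketch of a neural ordinary differential equation in PyTorch. It is not the speaker's code: the names (VectorField, odeint_euler) and the fixed-step Euler solver are assumptions made for brevity. In practice one would use an adaptive solver, for instance from the torchdiffeq library, possibly with adjoint-based backpropagation. The point is simply that the ODE solve is built from ordinary tensor operations, so it sits inside the differentiable computation graph and the vector field's parameters can be trained by backpropagation.

```python
# Minimal sketch of a neural ODE (illustrative only, not the speaker's code).
# The right-hand side dy/dt = f_theta(t, y) is a small neural network, and the
# solve is a fixed-step Euler loop, so gradients flow through it by autodiff.
import torch
import torch.nn as nn


class VectorField(nn.Module):
    """f_theta(t, y): the learnt right-hand side of dy/dt = f_theta(t, y)."""

    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden),  # input: time t concatenated with state y
            nn.Tanh(),
            nn.Linear(hidden, dim),
        )

    def forward(self, t, y):
        ty = torch.cat([t.expand(y.shape[0], 1), y], dim=-1)
        return self.net(ty)


def odeint_euler(f, y0, t0, t1, steps=100):
    """Fixed-step Euler integration of dy/dt = f(t, y) from t0 to t1.

    Every operation is a standard tensor op, so the solve is just another
    node in the differentiable computation graph.
    """
    y, t = y0, torch.as_tensor(t0)
    dt = (t1 - t0) / steps
    for _ in range(steps):
        y = y + dt * f(t, y)
        t = t + dt
    return y


# Usage: treat the solve as a layer mapping y(0) to y(1), and train theta so
# that the terminal state matches some target (a toy objective, for example).
f = VectorField(dim=2)
y0 = torch.randn(8, 2)          # batch of initial conditions
target = torch.randn(8, 2)
opt = torch.optim.Adam(f.parameters(), lr=1e-3)

opt.zero_grad()
y1 = odeint_euler(f, y0, 0.0, 1.0)
loss = ((y1 - target) ** 2).mean()
loss.backward()                 # gradients flow through the solver
opt.step()
```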
