Author
Kidger, P
Foster, J
Li, X
Oberhauser, H
Lyons, T
Pages
5453–5463
Abstract
Stochastic differential equations (SDEs) are a staple of mathematical modelling of temporal dynamics. However, a fundamental limitation has been that such models have typically been relatively inflexible, which recent work introducing Neural SDEs has sought to solve. Here, we show that the current classical approach to fitting SDEs may be viewed as a special case of (Wasserstein) GANs, and in doing so the neural and classical regimes may be brought together. The input noise is Brownian motion, the output samples are time-evolving paths produced by a numerical solver, and by parameterising a discriminator as a Neural Controlled Differential Equation (CDE), we obtain Neural SDEs as (in modern machine learning parlance) continuous-time generative time series models. Unlike previous work on this problem, this is a direct extension of the classical approach without reference to either prespecified statistics or density functions. Arbitrary drifts and diffusions are admissible, so, as the Wasserstein loss has a unique global minimum, in the infinite-data limit any SDE may be learnt.
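The following is a minimal, self-contained sketch (in PyTorch) of the setup described in the abstract: the generator is an SDE driven by Brownian motion, here integrated with a simple Euler-Maruyama loop, and the discriminator is a discretised controlled differential equation whose final hidden state gives a Wasserstein critic score. The class names, network sizes, step scheme, and stand-in data are illustrative assumptions, not the paper's implementation (which uses dedicated SDE/CDE solvers).

# Hypothetical sketch of the SDE-as-GAN setup: generator = neural SDE solved by
# Euler-Maruyama; discriminator = discretised neural CDE acting as a Wasserstein critic.
import torch
import torch.nn as nn

class GeneratorSDE(nn.Module):
    """dY_t = mu(t, Y_t) dt + sigma(t, Y_t) dW_t, with mu and sigma as small neural nets."""
    def __init__(self, hidden=32, dim=1):
        super().__init__()
        self.drift = nn.Sequential(nn.Linear(dim + 1, hidden), nn.Tanh(),
                                   nn.Linear(hidden, dim))
        self.diffusion = nn.Sequential(nn.Linear(dim + 1, hidden), nn.Tanh(),
                                       nn.Linear(hidden, dim))

    def forward(self, y0, ts):
        # Euler-Maruyama: Y_{k+1} = Y_k + mu dt + sigma dW, with Brownian increments dW.
        y, path = y0, [y0]
        for t0, t1 in zip(ts[:-1], ts[1:]):
            dt = t1 - t0
            ty = torch.cat([t0.expand(y.shape[0], 1), y], dim=1)
            dW = torch.randn_like(y) * dt.sqrt()
            y = y + self.drift(ty) * dt + self.diffusion(ty) * dW
            path.append(y)
        return torch.stack(path, dim=1)            # (batch, time, dim)

class DiscriminatorCDE(nn.Module):
    """dH_t = f(H_t) dX_t, discretised along the observed path X; readout gives a critic score."""
    def __init__(self, dim=1, hidden=32):
        super().__init__()
        self.init = nn.Linear(dim, hidden)
        self.func = nn.Sequential(nn.Linear(hidden, hidden), nn.Tanh(),
                                  nn.Linear(hidden, hidden * dim))
        self.readout = nn.Linear(hidden, 1)
        self.hidden, self.dim = hidden, dim

    def forward(self, x):                          # x: (batch, time, dim)
        h = self.init(x[:, 0])
        for k in range(x.shape[1] - 1):
            dx = x[:, k + 1] - x[:, k]             # increment of the control path
            f = self.func(h).view(-1, self.hidden, self.dim)
            h = h + (f @ dx.unsqueeze(-1)).squeeze(-1)
        return self.readout(h).squeeze(-1)         # Wasserstein critic score

# One illustrative Wasserstein-GAN step on stand-in data (a scaled random walk):
gen, disc = GeneratorSDE(), DiscriminatorCDE()
ts = torch.linspace(0., 1., 64)
real = torch.cumsum(torch.randn(128, 64, 1) * 0.125, dim=1)
fake = gen(torch.zeros(128, 1), ts)
loss_disc = disc(fake.detach()).mean() - disc(real).mean()   # critic loss
loss_gen = -disc(fake).mean()                                 # generator loss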
Symplectic ID
1161821
Publication type
Conference Paper
Publication date
01 Jul 2021