Date
Mon, 16 Oct 2023
15:30
Location
Lecture Theatre 3, Mathematical Institute, Radcliffe Observatory Quarter, Woodstock Road, OX2 6GG
Speaker
Dr Maud Lemercier
Organisation
Mathematical Institute (University of Oxford)

Neural SDEs are continuous-time generative models for sequential data. State-of-the-art performance for irregular time series generation has previously been obtained by training these models adversarially, as GANs. However, as is typical of GAN architectures, training is notoriously unstable, often suffers from mode collapse, and requires specialised techniques such as weight clipping and gradient penalties to mitigate these issues.

In this talk, I will introduce a novel class of scoring rules on path space based on signature kernels and use them as an objective for training Neural SDEs non-adversarially. The strict properness of these kernel scores and the consistency of the corresponding estimators provide existence and uniqueness guarantees for the minimiser. With this formulation, evaluating the generator–discriminator pair amounts to solving a system of linear path-dependent PDEs, which allows for memory-efficient adjoint-based backpropagation. Moreover, because the proposed kernel scores are well defined for paths with values in infinite-dimensional spaces of functions, this framework can be easily extended to generate spatiotemporal data.

This procedure permits conditioning on a rich variety of market conditions and significantly outperforms alternative ways of training Neural SDEs on a variety of tasks, including the simulation of rough volatility models, conditional probabilistic forecasting of real-world forex pairs where the conditioning variable is an observed past trajectory, and the mesh-free generation of limit order book dynamics.
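As a rough illustration of the non-adversarial objective described above, the following sketch scores Euler–Maruyama samples from a toy SDE against an observed path using the empirical kernel score S(P, y) = ½ E[k(X, X′)] − E[k(X, y)], which is strictly proper for characteristic kernels. This is purely illustrative: it substitutes an RBF kernel on discretised paths for the signature kernel used in the talk, and a fixed Ornstein–Uhlenbeck model for a trained neural generator; all function names here are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(x, y, bandwidth=2.0):
    # RBF kernel on flattened discretised paths; a crude stand-in for
    # the signature kernel on path space discussed in the talk.
    d = np.linalg.norm(x - y)
    return np.exp(-d**2 / (2 * bandwidth**2))

def kernel_score(samples, observation, kernel=rbf_kernel):
    # Empirical kernel score S(P, y) = 1/2 E[k(X, X')] - E[k(X, y)].
    # Minimising its expectation over observations recovers the data
    # distribution when the kernel is characteristic.
    n = len(samples)
    interaction = sum(kernel(samples[i], samples[j])
                      for i in range(n) for j in range(n)) / n**2
    fit = sum(kernel(x, observation) for x in samples) / n
    return 0.5 * interaction - fit

def euler_maruyama(drift, diffusion, x0, n_steps=50, dt=0.02):
    # Simulate one path of dX_t = drift(X_t) dt + diffusion(X_t) dW_t.
    path = [x0]
    for _ in range(n_steps):
        x = path[-1]
        path.append(x + drift(x) * dt
                      + diffusion(x) * np.sqrt(dt) * rng.normal())
    return np.array(path)

# "Generator" samples from a toy Ornstein-Uhlenbeck SDE, scored against
# a single observed path drawn from the same model.
ou = dict(drift=lambda x: -x, diffusion=lambda x: 0.5, x0=0.0)
samples = [euler_maruyama(**ou) for _ in range(20)]
observation = euler_maruyama(**ou)
print(kernel_score(samples, observation))
```

In the actual method, this score would be averaged over a batch of observed paths and minimised by gradient descent through the SDE solver; the signature kernel itself is evaluated by solving the path-dependent PDEs mentioned in the abstract rather than in closed form.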

Further Information

Please join us from 15:00 to 15:30 for tea and coffee outside the lecture theatre before the talk.

Last updated on 08 Sep 2023 08:16.