Performance Studies of the Acoustic Module for the IceCube Upgrade
Benning, C; Borowka, J; Günther, C; Gries, O; Zierke, S (16 Aug 2023) http://arxiv.org/abs/2308.08506v1
Extending the IceCube search for neutrino point sources in the Northern sky with additional years of data
Bellenghi, C; Minh, M; Kontrimas, T; Manao, E; Ørsøe, R; Wolf, M (24 Aug 2023) http://arxiv.org/abs/2308.12742v1
Measurement of the Cosmic Neutrino Flux from the Southern Sky using 10 years of IceCube Starting Track Events
Silva, M; Mancina, S; Osborn, J (08 Aug 2023) http://arxiv.org/abs/2308.04582v1
A new simulation framework for IceCube Upgrade calibration using IceCube Upgrade Camera system
Tönnis, C; Choi, S; Rott, C; Seo, M; Lee, J (11 Aug 2023) http://arxiv.org/abs/2308.06247v1
Yang-Mills form factors on self-dual backgrounds
Bogna, G; Mason, L, Journal of High Energy Physics, volume 2023, issue 8, 165 (24 Aug 2023)
Mon, 16 Oct 2023
15:30
Lecture Theatre 3, Mathematical Institute, Radcliffe Observatory Quarter, Woodstock Road, OX2 6GG

Non-adversarial training of Neural SDEs with signature kernel scores

Dr Maud Lemercier
(Mathematical Institute, University of Oxford)
Further Information

Please join us from 15:00-15:30 for tea and coffee outside the lecture theatre before the talk.

Abstract

Neural SDEs are continuous-time generative models for sequential data. State-of-the-art performance for irregular time series generation has previously been obtained by training these models adversarially, as GANs. However, as is typical for GAN architectures, training is notoriously unstable, often suffers from mode collapse, and requires specialised techniques such as weight clipping and gradient penalties to mitigate these issues. In this talk, I will introduce a novel class of scoring rules on path space based on signature kernels and use them as an objective for training Neural SDEs non-adversarially. The strict properness of such kernel scores and the consistency of the corresponding estimators provide existence and uniqueness guarantees for the minimiser. With this formulation, evaluating the generator-discriminator pair amounts to solving a system of linear path-dependent PDEs, which allows for memory-efficient adjoint-based backpropagation. Moreover, because the proposed kernel scores are well defined for paths with values in infinite-dimensional spaces of functions, the framework extends naturally to the generation of spatiotemporal data. The procedure also permits conditioning on a rich variety of market conditions and significantly outperforms alternative ways of training Neural SDEs on a variety of tasks, including the simulation of rough volatility models, conditional probabilistic forecasting of real-world forex pairs where the conditioning variable is an observed past trajectory, and the mesh-free generation of limit order book dynamics.
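
To make the non-adversarial training idea concrete, the sketch below (not the speaker's code) fits a small Neural SDE generator by minimising an empirical kernel scoring rule of the form E[k(X, X')]/2 - E[k(X, Y)]. For brevity the path kernel here is a plain RBF kernel on discretised paths; in the method described in the talk it would be a signature kernel, evaluated by solving a linear path-dependent (Goursat-type) PDE. The network sizes, step counts and toy data are assumptions for illustration only.

```python
# Minimal sketch of non-adversarial Neural SDE training with a kernel scoring rule.
# The path kernel below is a placeholder RBF kernel; the signature kernel of the
# talk would replace it. All hyperparameters and the toy data are assumptions.
import torch
import torch.nn as nn

class NeuralSDE(nn.Module):
    """Generator: dX_t = mu(t, X_t) dt + sigma(t, X_t) dW_t, simulated by Euler-Maruyama."""
    def __init__(self, dim=1, hidden=64):
        super().__init__()
        self.mu = nn.Sequential(nn.Linear(dim + 1, hidden), nn.Tanh(), nn.Linear(hidden, dim))
        self.sigma = nn.Sequential(nn.Linear(dim + 1, hidden), nn.Tanh(), nn.Linear(hidden, dim))
        self.x0 = nn.Parameter(torch.zeros(dim))

    def forward(self, batch, steps=32, t1=1.0):
        dt = t1 / steps
        x = self.x0.expand(batch, -1)
        path = [x]
        for i in range(steps):
            t = torch.full((batch, 1), i * dt)
            inp = torch.cat([t, x], dim=-1)
            dw = torch.randn_like(x) * dt ** 0.5
            x = x + self.mu(inp) * dt + self.sigma(inp) * dw
            path.append(x)
        return torch.stack(path, dim=1)  # (batch, steps + 1, dim)

def path_kernel(x, y, bandwidth=1.0):
    """Placeholder kernel on discretised paths: RBF on the flattened paths."""
    x, y = x.flatten(1), y.flatten(1)
    d2 = torch.cdist(x, y) ** 2
    return torch.exp(-d2 / (2 * bandwidth ** 2))

def kernel_score(fake, real):
    """Empirical kernel scoring rule E k(X, X')/2 - E k(X, Y), strictly proper for a
    characteristic kernel; diagonal i == j terms are dropped from the first expectation."""
    kxx = path_kernel(fake, fake)
    m = fake.shape[0]
    term_xx = (kxx.sum() - kxx.diagonal().sum()) / (m * (m - 1))
    term_xy = path_kernel(fake, real).mean()
    return 0.5 * term_xx - term_xy

# Training loop: no discriminator network, just minimise the expected kernel score.
gen = NeuralSDE()
opt = torch.optim.Adam(gen.parameters(), lr=1e-3)
real_paths = torch.cumsum(torch.randn(256, 33, 1) * 0.1, dim=1)  # toy "data" paths
for step in range(200):
    idx = torch.randint(0, real_paths.shape[0], (64,))
    loss = kernel_score(gen(64), real_paths[idx])
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In the adversarial (GAN) formulation the loss above would be supplied by a learned discriminator trained in an inner loop; here the fixed kernel plays that role, which is what removes the instability and mode-collapse issues mentioned in the abstract.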

Four point functions in CFT’s with slightly broken higher spin symmetry
Silva, J, Journal of High Energy Physics, volume 2021, issue 5, 97 (12 May 2021)
Nonperturbative Mellin amplitudes: existence, properties, applications
Penedones, J; Silva, J; Zhiboedov, A, Journal of High Energy Physics, volume 2020, issue 8, 31 (06 Aug 2020)