Thu, 01 Dec 2022
16:00
Virtual

Particle filters for Data Assimilation

Dan Crisan
(Imperial College London)

Note: we recommend joining the meeting via the Teams client for the best user experience.

Abstract

Modern Data Assimilation (DA) can be traced back to the sixties and owes a lot to earlier developments in linear filtering theory. Since then, DA has evolved independently of filtering theory. To date, it is a massively important area of research due to its many applications in meteorology, ocean prediction, hydrology, oil reservoir exploration, etc. The field has been largely driven by practitioners; however, in recent years an increasing body of theoretical work has been devoted to it. In this talk, I will advocate the interpretation of DA through the language of stochastic filtering. This interpretation allows us to make use of advanced particle filters to produce rigorously validated DA methodologies. I will present a particle filter that incorporates three add-on procedures: nudging, tempering and jittering. The particle filter is tested on a two-layer quasi-geostrophic model with O(10^6) degrees of freedom, of which only a minute fraction are noisily observed.
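For readers unfamiliar with these add-on procedures, the sketch below illustrates tempering and jittering inside a bootstrap particle filter on a toy one-dimensional model. Everything here (the AR(1) dynamics, Gaussian observations, number of tempering stages, jitter scale) is an illustrative assumption, not the quasi-geostrophic setup of the talk; nudging, and the MCMC correction a rigorous jittering step requires, are omitted for brevity.

import numpy as np

rng = np.random.default_rng(0)

def propagate(particles):
    # Illustrative signal dynamics: a stable AR(1) step with model noise.
    return 0.9 * particles + 0.1 * rng.standard_normal(particles.shape)

def log_likelihood(particles, obs, obs_std=0.5):
    # Gaussian log-likelihood of a noisy observation of each particle.
    return -0.5 * ((obs - particles) / obs_std) ** 2

def resample(particles, log_w):
    # Multinomial resampling from self-normalised weights.
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

def tempered_step(particles, obs, n_temper=4, jitter_std=0.05):
    # One assimilation cycle: propagate, then apply the likelihood in
    # n_temper fractional stages (tempering), resampling and jittering
    # after each stage to fight weight degeneracy and restore diversity.
    particles = propagate(particles)
    for _ in range(n_temper):
        log_w = log_likelihood(particles, obs) / n_temper  # fractional power of the likelihood
        particles = resample(particles, log_w)
        # Jittering: small random perturbation; a rigorous scheme would use
        # an MCMC move preserving the current tempered distribution.
        particles = particles + jitter_std * rng.standard_normal(particles.shape)
    return particles

particles = rng.standard_normal(1000)
for obs in [0.3, 0.1, -0.2]:
    particles = tempered_step(particles, obs)
print("posterior mean estimate:", particles.mean())

The tempering stages replace one severe reweighting by several mild ones, which is what keeps the effective sample size manageable when, as in the talk's setting, observations are very informative relative to the prior spread.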

Science Futures is a new area in the Green Futures field at Glastonbury Festival that is dedicated to science. It welcomes exciting interactive stalls covering any relevant topic from any discipline in the natural or social sciences.

All you need to do to apply is register as a “trader” and fill in the online application form.

Applications are light touch, but stalls should fit with the spirit of Science Futures and Glastonbury Festival, with a strong emphasis on visual appeal.

A new mixed finite-element method for H^2 elliptic problems
Farrell, P., Hamdan, A., MacLachlan, S. Computers and Mathematics with Applications, volume 128, 300-319 (15 Dec 2022)

The Oxford University Innovation (OUI) hot desk is open to all researchers, staff and students, currently by phone, email and video call. If anyone has questions about intellectual property, patenting, the creation of new ventures (spinouts, startups or social enterprises), academic consultancy, etc., please contact Paul Gass (@email). No question is too small.

Today:
Illustrating Mathematics - Joshua Bull and Christoph Dorn

Next week:
Managing your supervisor - Eva Antonopoulou

Full details

Analysis and Modeling of Client Order Flow in Limit Order Markets
Cont, R., Cucuringu, M., Glukhov, V., Prenzel, F. Quantitative Finance
Theoretical study of the emergence of periodic solutions for the inhibitory NNLIF neuron model with synaptic delay
Ikeda, K., Roux, P., Salort, D., Smets, D. Mathematical Neuroscience and Applications, volume 2 (26 Oct 2022)
Tue, 01 Nov 2022

12:30 - 13:00
C3

Asymptotic Analysis of Deep Residual Networks

Alain Rossier
Abstract

Residual networks (ResNets) have displayed impressive results in pattern recognition and, recently, have garnered considerable theoretical interest due to a perceived link with neural ordinary differential equations (neural ODEs). This link relies on the convergence of the network weights to a smooth function as the number of layers increases. We investigate the properties of weights trained by stochastic gradient descent and their scaling with network depth through detailed numerical experiments. We observe the existence of scaling regimes markedly different from those assumed in the neural ODE literature. Depending on certain features of the network architecture, such as the smoothness of the activation function, one may obtain an alternative ODE limit, a stochastic differential equation (SDE), or neither of these. Furthermore, we formally prove linear convergence of gradient descent to a global optimum for the training of deep residual networks with constant layer width and smooth activation function. We further prove that if the trained weights, as a function of the layer index, admit a scaling limit as the depth increases, then the limit has finite 2-variation.
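As a concrete illustration of the scaling regimes mentioned above, the toy computation below runs the residual recursion h_{k+1} = h_k + L^(-beta) * f(h_k) at increasing depths L for several exponents beta. The scalar state and the choice f = tanh are illustrative assumptions, not the trained networks of the talk: beta = 1 recovers the Euler scheme for the ODE h' = f(h) (the neural ODE regime), while smaller or larger exponents give divergent or trivial depth limits.

import numpy as np

def deep_residual_output(h0, L, beta, f=np.tanh):
    # Run L residual layers with the update h <- h + L**(-beta) * f(h).
    h = h0
    for _ in range(L):
        h = h + L ** (-beta) * f(h)
    return h

h0 = 0.5
for beta in (0.5, 1.0, 1.5):
    outputs = [deep_residual_output(h0, L, beta) for L in (10, 100, 1000, 10000)]
    print(f"beta={beta}:", [f"{y:.4f}" for y in outputs])

# beta = 1 is the neural-ODE scaling: outputs converge to the time-1 flow of
# h' = tanh(h) as L grows. beta = 0.5 lets the drift accumulate, so outputs
# diverge with depth; beta = 1.5 suppresses the updates, so the network
# degenerates to the identity map.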
