Tue, 19 Feb 2019

14:30 - 15:30

The generalised Oberwolfach problem

Katherine Staden
Abstract

Recently, much progress has been made on the general problem of decomposing a dense (usually complete) graph into a given family of sparse graphs (e.g. Hamilton cycles or trees). I will present a new result of this type: any sufficiently large, dense quasirandom graph in which all vertex degrees are equal and even can be decomposed into any given collection of two-factors (2-regular spanning subgraphs). A special case of this result reproves the Oberwolfach problem for large graphs.
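
For context, the classical Oberwolfach problem referred to here can be stated as follows (a standard formulation, not quoted from the talk): given a 2-regular graph F on 2n+1 vertices, one asks whether the complete graph K_{2n+1} decomposes into n edge-disjoint copies of F.

    % Classical Oberwolfach problem: for a 2-regular graph F on 2n+1 vertices,
    % decide whether the edge set of K_{2n+1} partitions into copies of F.
    \[
      E(K_{2n+1}) = \bigsqcup_{i=1}^{n} E(F_i),
      \qquad F_i \cong F \quad (1 \le i \le n).
    \]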

 

This is joint work with Peter Keevash.

Tue, 05 Mar 2019

14:30 - 15:00
L5

MLQMC Methods for Elliptic PDEs Driven by White Noise

Matteo Croci
(Oxford)
Abstract

When solving partial differential equations driven by additive spatial white noise, the efficient sampling of white noise realizations can be challenging. In this talk we focus on the efficient sampling of white noise using quasi-random points in a finite element method and multilevel quasi-Monte Carlo (MLQMC) setting. This work is an extension of previous research on white noise sampling for MLMC.

We express white noise as a wavelet series expansion, which we divide into two parts. The first part is sampled using quasi-random points and contains a finite number of terms, ordered by decaying importance, to ensure good QMC convergence. The second part is a correction term, which is sampled using standard pseudo-random numbers.
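
A minimal sketch of this two-part splitting is given below in Python with scipy; the function name and the assumption that the i.i.d. expansion coefficients are already ordered by decaying importance are illustrative, and the wavelet construction and supermesh coupling themselves are not reproduced.

    import numpy as np
    from scipy.stats import norm, qmc

    def sample_white_noise_coefficients(n_total, n_qmc, n_samples, seed=0):
        """Sample i.i.d. N(0,1) coefficients of a truncated wavelet expansion.

        The leading n_qmc coefficients (assumed ordered by decaying importance)
        are driven by a scrambled Sobol' sequence; the remaining coefficients
        form the correction term and use ordinary pseudo-random numbers.
        """
        sobol = qmc.Sobol(d=n_qmc, scramble=True, seed=seed)
        u = sobol.random(n_samples)                        # quasi-random uniforms in (0, 1)
        leading = norm.ppf(np.clip(u, 1e-12, 1 - 1e-12))   # map to standard normals
        rng = np.random.default_rng(seed)
        correction = rng.standard_normal((n_samples, n_total - n_qmc))
        return np.hstack([leading, correction])            # shape (n_samples, n_total)

    coeffs = sample_white_noise_coefficients(n_total=256, n_qmc=64, n_samples=8)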

We show how the sampling of both terms can be performed in linear time and memory complexity in the number of mesh cells via a supermesh construction. Furthermore, our technique can be used to enforce the MLQMC coupling even in the case of non-nested mesh hierarchies. We demonstrate the efficacy of our method with numerical experiments.

Tue, 19 Feb 2019

12:45 - 13:30
C3

Model of a cycling coexistence of viral strains and a survival of the specialist

Anel Nurtay
Abstract

As growing human populations cluster in large cities connected by fast transport routes, ever more suitable environments for epidemics are created. Combined with the rapid mutation rates of viral and bacterial strains, this keeps epidemiological studies relevant at all times. Since the beginning of 2019, the World Health Organization has published at least five disease outbreak news reports, covering Ebola virus disease, dengue fever and drug-resistant gonococcal infection, the last of which was registered in the United Kingdom.

To control outbreaks it is necessary to understand the mechanisms by which pathogens appear and evolve. Nearly all disease-causing viruses and bacteria undergo specialization towards a human host from the nearest livestock or the wild fauna of a shared habitat. Every strain (or subtype) of a pathogen has a set of characteristics (e.g. infection rate and burst size) responsible for its success in a new environment, a host cell in the case of a virus, and, with the right amount of skepticism, that set can be framed as the fitness of the pathogen. In our model, we consider a population of a mutating viral strain. A strain specialized towards a new host usually remains in that environment and does not switch until conditions become volatile. Two subtypes of the virus, wild and mutant, share a host. This talk will illustrate findings on a cycling coexistence of the two subtypes of the parasite population, and a rare transcritical bifurcation of limit cycles will be discussed. Moreover, we will find conditions under which one strain can outnumber and eventually eliminate the other, focusing on the infection rate as the measure of strain fitness.
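
Purely as an illustration of this kind of analysis (the equations below are a generic host-cell/two-strain virus model with mutation, not the specific model studied in the talk), such a system can be integrated numerically and inspected for coexistence, exclusion or cycling:

    import numpy as np
    from scipy.integrate import solve_ivp

    def two_strain_model(t, y, lam=10.0, d=0.1, beta1=0.02, beta2=0.015,
                         b1=5.0, b2=8.0, c=1.0, mu=0.01):
        """Generic host-cell / two-strain virus model with mutation.

        H: susceptible host cells, V1: wild-type virus, V2: mutant virus.
        beta* are infection rates, b* burst sizes, mu the mutation fraction.
        """
        H, V1, V2 = y
        dH = lam - d * H - (beta1 * V1 + beta2 * V2) * H
        dV1 = (1 - mu) * b1 * beta1 * H * V1 - c * V1
        dV2 = b2 * beta2 * H * V2 + mu * b1 * beta1 * H * V1 - c * V2
        return [dH, dV1, dV2]

    sol = solve_ivp(two_strain_model, (0.0, 500.0), [100.0, 1.0, 0.0],
                    rtol=1e-8, atol=1e-10)
    print(sol.y[:, -1])  # long-time state: coexistence, exclusion, or cycling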

Fri, 24 May 2019

14:00 - 15:30
L6

Diabatic vortices: a simple model of tropical cyclones and the Martian polar vortex

Prof. Richard Scott
(University of St Andrews)
Abstract

In this talk, we will consider how two very different atmospheric phenomena, the terrestrial tropical cyclone and the Martian polar vortex, can be described within a single simplified dynamical framework based on the forced shallow water equations. Dynamical forcings include angular momentum transport by secondary (transverse) circulations and local heating due to latent heat release. The forcings act in very different ways in the two systems, but in both cases lead to distinct annular distributions of potential vorticity, with a local vorticity maximum at a finite radius surrounding a central minimum. In both systems, the resulting vorticity distributions are subject to shear instability, and the degree of eddy growth versus annular persistence can be examined explicitly under different forcing scenarios.
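
For reference, a standard form of the forced rotating shallow water equations, with a momentum forcing F and a mass source/sink S (the specific forcing terms used in the talk are presumably tailored to each system and are not reproduced here):

    % Forced rotating shallow water equations; q is the potential vorticity,
    % F a momentum forcing and S a mass source/sink (e.g. due to heating).
    \[
      \frac{\partial \mathbf{u}}{\partial t}
        + (\mathbf{u}\cdot\nabla)\mathbf{u}
        + f\,\hat{\mathbf{z}}\times\mathbf{u}
        = -g\,\nabla h + \mathbf{F},
      \qquad
      \frac{\partial h}{\partial t} + \nabla\cdot(h\mathbf{u}) = S,
      \qquad
      q = \frac{\zeta + f}{h}.
    \]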

Fri, 10 May 2019

14:00 - 15:30
L6

Scattering of inertia-gravity waves in geostrophic turbulence

Prof. Jacques Vanneste
(University of Edinburgh)
Abstract

Inertia-gravity waves (IGWs) are ubiquitous in the ocean and the atmosphere. Once generated (by tides, topography, convection and other processes), they propagate and scatter in the large-scale, geostrophically balanced background flow. I will discuss models of this scattering which represent the background flow as a random field with known statistics. Without assuming a spatial scale separation between waves and flow, the scattering is described by a kinetic equation involving a scattering cross-section determined by the energy spectrum of the flow. In the limit of small-scale waves, this equation reduces to a diffusion equation in wavenumber space. This predicts, in particular, IGW energy spectra scaling as k^{-2}, consistent with observations in the atmosphere and ocean, lending some support to recent claims that (sub)mesoscale spectra can be attributed to almost linear IGWs. The theoretical predictions are checked against numerical simulations of the three-dimensional Boussinesq equations.
(Joint work with Miles Savva and Hossein Kafiabad.)
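
Schematically, the small-scale limit mentioned above takes the form of a diffusion equation for the wave energy density e(k, t) in wavenumber space (shown here only to indicate the structure; the precise diffusivity derived in the work is determined by the energy spectrum of the background flow):

    % Schematic wavenumber-space diffusion of the IGW energy density;
    % stationary solutions of this type are consistent with power-law
    % spectra such as the quoted k^{-2} scaling.
    \[
      \frac{\partial e(\mathbf{k},t)}{\partial t}
      = \frac{\partial}{\partial k_i}
        \left( D_{ij}(\mathbf{k})\,\frac{\partial e(\mathbf{k},t)}{\partial k_j} \right).
    \]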

It's Valentine's Day this Thursday (14th February in case you've forgotten) and Love AND Maths are in the air. For the first time, at 10am Oxford Mathematics will be LIVE STREAMING a 1st Year undergraduate lecture. In addition we will film (not live) a real tutorial based on that lecture.

The details:
LIVE Oxford Mathematics Student Lecture - James Sparks: 1st Year Undergraduate lecture on 'Dynamics', the mathematics of how things change with time
14th February, 10am-11am UK time

Fri, 08 Mar 2019

12:00 - 13:00
L4

Programmatically Structured Representations for Robust Autonomy in Robots

Subramanian Ramamoorthy
(University of Edinburgh and FiveAI)
Abstract


A defining feature of robotics today is the use of learning and autonomy in the inner loop of systems that are actually being deployed in the real world, e.g., in autonomous driving or medical robotics. While it is clear that useful autonomous systems must learn to cope with a dynamic environment, requiring architectures that address the richness of the worlds in which such robots must operate, it is equally clear that ensuring the safety of such systems is the single biggest obstacle to scaling up these solutions. I will discuss an approach to system design that aims to address this problem by incorporating programmatic structure into the network architectures used for policy learning. I will present results from two projects in this direction.

Firstly, I will present the perceptor gradients algorithm – a novel approach to learning symbolic representations based on the idea of decomposing an agent’s policy into i) a perceptor network extracting symbols from raw observation data and ii) a task encoding program which maps the input symbols to output actions. We show that the proposed algorithm is able to learn representations that can be directly fed into a Linear-Quadratic Regulator (LQR) or a general purpose A* planner. Our experimental results confirm that the perceptor gradients algorithm is able to efficiently learn transferable symbolic representations as well as generate new observations according to a semantically meaningful specification.
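
A toy sketch of this decomposition is shown below; all names and the linear perceptor are hypothetical, and the hand-written proportional controller simply stands in for a task encoding program.

    import numpy as np

    def perceptor(observation, weights):
        """Hypothetical perceptor: extract a scalar 'position' symbol
        from a raw observation vector via a learned linear map."""
        return float(weights @ observation)

    def control_program(position, target=0.0, gain=1.5):
        """Task-encoding program: a hand-written proportional controller
        mapping the extracted symbol to an action."""
        return gain * (target - position)

    def policy(observation, weights):
        """Policy = program composed with perceptor; only the perceptor's
        weights would be learned (e.g. by policy-gradient methods)."""
        symbol = perceptor(observation, weights)
        return control_program(symbol)

    obs = np.array([0.3, -0.1, 0.7])   # raw observation (toy data)
    w = np.array([1.0, 0.0, 0.0])      # perceptor parameters
    print(policy(obs, w))              # action produced by the composed policy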

Next, I will describe work on learning from demonstration where the task representation is that of a hybrid control system, with an emphasis on extracting models that are explicitly verifiable and easily interpreted by robot operators. Through an architecture that spans from the sensorimotor level, where a sequence of controllers is fitted using sequential importance sampling under a generative switching proportional controller task model, to higher-level modules that can induce a program for a visuomotor reaching task involving loops and conditionals from a single demonstration, we show how a robot can learn tasks such as tower building in a manner that is interpretable and, eventually, verifiable.
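
As a rough illustration of the switching-controller structure referred to here (again a toy sketch, not the generative model from the paper), a demonstration trajectory can be produced by a sequence of proportional controllers with different setpoints:

    import numpy as np

    def simulate_switching_controller(setpoints, gains, steps_per_segment=50,
                                      x0=0.0, dt=0.05, noise=0.01, seed=0):
        """Simulate a 1-D system driven by a sequence of proportional
        controllers, each active for a fixed number of steps (toy hybrid model)."""
        rng = np.random.default_rng(seed)
        x, traj = x0, []
        for target, k in zip(setpoints, gains):
            for _ in range(steps_per_segment):
                u = k * (target - x)                   # proportional control law
                x = x + dt * u + noise * rng.normal()  # simple noisy integrator
                traj.append(x)
        return np.array(traj)

    traj = simulate_switching_controller(setpoints=[1.0, -0.5, 0.8],
                                         gains=[2.0, 2.0, 1.0])
    print(traj[-1])  # state after the final segment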

 

References:

1. S. V. Penkov and S. Ramamoorthy, Learning programmatically structured representations with perceptor gradients, in Proc. International Conference on Learning Representations (ICLR), 2019. http://rad.inf.ed.ac.uk/data/publications/2019/penkov2019learning.pdf

2. M. Burke, S. V. Penkov and S. Ramamoorthy, From explanation to synthesis: Compositional program induction for learning from demonstration, 2019. https://arxiv.org/abs/1902.10657
 

Fri, 01 Mar 2019

12:00 - 13:00
L4

Modular, Infinite, and Other Deep Generative Models of Data

Charles Sutton
(University of Edinburgh)
Abstract

Deep generative models provide powerful tools for fitting difficult distributions, such as those that arise in modelling natural images. But many of these methods, including variational autoencoders (VAEs) and generative adversarial networks (GANs), can be notoriously difficult to fit.

One well-known problem is mode collapse, in which the model learns to characterize only a few modes of the true distribution. To address this, we introduce VEEGAN, which features a reconstructor network that reverses the action of the generator by mapping from data to noise. Our training objective retains the original asymptotic consistency guarantee of GANs, and can be interpreted as a novel autoencoder loss over the noise.
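
The core idea can be sketched as follows (a simplified PyTorch sketch with hypothetical architectures; the discriminator and the full VEEGAN objective are omitted): the reconstructor is trained so that mapping generated data back to noise approximately inverts the generator, giving an autoencoder-style loss in noise space.

    import torch
    import torch.nn as nn

    latent_dim, data_dim = 8, 32
    generator = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                              nn.Linear(64, data_dim))          # G: z -> x
    reconstructor = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(),
                                  nn.Linear(64, latent_dim))    # F: x -> z

    z = torch.randn(128, latent_dim)          # noise samples
    x_fake = generator(z)                     # generated data
    z_rec = reconstructor(x_fake)             # map data back to noise
    recon_loss = ((z_rec - z) ** 2).mean()    # autoencoder loss in noise space
    recon_loss.backward()                     # gradients flow to both networks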

Second, maximum mean discrepancy networks (MMD-nets) avoid some of the pathologies of GANs, but have not been able to match their performance. We present a new method for training MMD-nets, based on mapping the data into a lower-dimensional space in which MMD training can be more effective. We call these networks Ratio-based MMD Nets, and show that, somewhat mysteriously, they have dramatically better performance than standard MMD-nets.
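
For reference, the squared MMD statistic with a Gaussian kernel that underlies MMD-nets can be estimated from two samples as follows (a self-contained sketch, not the authors' code; the bandwidth choice is arbitrary):

    import numpy as np

    def mmd_rbf(x, y, bandwidth=1.0):
        """Biased estimate of the squared MMD between samples x and y
        under a Gaussian (RBF) kernel with the given bandwidth."""
        def kernel(a, b):
            d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
            return np.exp(-d2 / (2.0 * bandwidth ** 2))
        return kernel(x, x).mean() + kernel(y, y).mean() - 2.0 * kernel(x, y).mean()

    x = np.random.default_rng(0).normal(size=(100, 2))            # "data" samples
    y = np.random.default_rng(1).normal(loc=0.5, size=(100, 2))   # "model" samples
    print(mmd_rbf(x, y))  # larger values indicate a bigger mismatch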

A final problem is deciding how many latent components are necessary for a deep generative model to fit a certain data set. We present a nonparametric Bayesian approach to this problem, based on defining a (potentially) infinitely wide deep generative model. Fitting this model is possible by combining variational inference with a Monte Carlo method from statistical physics called Russian roulette sampling. Perhaps surprisingly, we find that this modification helps with the mode collapse problem as well.
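
The Russian roulette idea can be illustrated on a toy infinite sum: truncate at a random point and reweight the surviving terms so that the estimator remains unbiased (a generic sketch, not the specific estimator used for the infinite generative model):

    import numpy as np

    def russian_roulette_sum(term, q=0.6, seed=None):
        """Unbiased single-sample estimate of sum_{k=0}^inf term(k).

        After each term, continue with probability q and divide term k by
        its survival probability q**k, so the expectation equals the full
        infinite sum.
        """
        rng = np.random.default_rng(seed)
        total, weight, k = 0.0, 1.0, 0
        while True:
            total += term(k) / weight
            if rng.random() > q:          # stop with probability 1 - q
                return total
            weight *= q                   # survival probability so far
            k += 1

    # Example: geometric series sum_k 0.5**k = 2, estimated without truncation bias.
    estimates = [russian_roulette_sum(lambda k: 0.5 ** k, seed=s) for s in range(2000)]
    print(np.mean(estimates))  # approximately 2.0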

 
