Mon, 14 Mar 2022

15:30 - 16:30
L3

TBC

GONCALO DOS REIS
(University of Edinburgh)
Abstract

TBC

Tue, 07 Dec 2021

14:00 - 15:00
Virtual

FFTA: Directed Network Laplacians and Random Graph Models

Xue Gong
(University of Edinburgh)
Abstract

We consider spectral methods that uncover hidden structures in directed networks. We establish and exploit connections between node reordering via (a) minimizing an objective function and (b) maximizing the likelihood of a random graph model. We focus on two existing spectral approaches that build and analyse Laplacian-style matrices via the minimization of frustration and trophic incoherence. These algorithms aim to reveal directed periodic and linear hierarchies, respectively. We show that reordering nodes using the two algorithms, or mapping them onto a specified lattice, is associated with new classes of directed random graph models. Using this random graph setting, we are able to compare the two algorithms on a given network and quantify which structure is more likely to be present. We illustrate the approach on synthetic and real networks, and discuss practical implementation issues. This talk is based on joint work with Desmond Higham and Konstantinos Zygalakis.

Article link: https://royalsocietypublishing.org/doi/10.1098/rsos.211144
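For concreteness, the trophic-incoherence side of the story starts from trophic levels obtained by solving a Laplacian-style linear system. A minimal sketch (function and variable names are mine, not the paper's):

```python
import numpy as np

# Trophic levels in the sense used in the trophic-incoherence literature:
# solve the Laplacian-style system L h = v, where L = diag(u) - A - A^T,
# u = in-degree + out-degree and v = in-degree - out-degree. A directed
# linear hierarchy then shows up as evenly spaced levels.
def trophic_levels(A):
    in_deg, out_deg = A.sum(axis=0), A.sum(axis=1)
    u, v = in_deg + out_deg, in_deg - out_deg
    L = np.diag(u) - A - A.T           # singular: levels defined up to a shift
    h = np.linalg.pinv(L) @ v          # minimum-norm (mean-zero) solution
    return h - h.min()                 # pin the lowest level at zero

# Directed path 0 -> 1 -> 2 -> 3: a perfect linear hierarchy.
A = np.zeros((4, 4))
for i in range(3):
    A[i, i + 1] = 1.0
print(trophic_levels(A))  # -> approximately [0, 1, 2, 3]
```

Reordering nodes by these levels is the linear-hierarchy counterpart of the frustration-minimising reordering used for periodic structure.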

Mon, 10 May 2021

16:00 - 17:00

Superdiffusive limits for deterministic fast-slow dynamical systems

ILYA CHEVYREV
(University of Edinburgh)
Abstract

In this talk, we will consider multidimensional fast-slow dynamical systems in discrete-time with random initial conditions but otherwise completely deterministic dynamics. The question we will investigate is whether the slow variable converges in law to a stochastic process under a suitable scaling limit. We will be particularly interested in the case when the limiting dynamic is superdiffusive, i.e. it coincides in law with the solution of a Marcus SDE driven by a discontinuous stable Lévy process. Under certain assumptions, we will show that generically convergence does not hold in any Skorokhod topology but does hold in a generalisation of the Skorokhod strong M1 topology which we define using so-called path functions. Our methods are based on a combination of ergodic theory and ideas arising from (but not using) rough paths. We will finally show that our assumptions are satisfied for a class of intermittent maps of Pomeau-Manneville type. 
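As an illustration of the intermittent maps mentioned at the end, here is a minimal simulation of a Pomeau-Manneville-type map. The parameter choice is illustrative only; the superdiffusive regime of the talk requires specific exponents and observables:

```python
import numpy as np

# Pomeau-Manneville intermittent map on [0,1): a neutral fixed point at 0
# produces long laminar phases near the origin, interspersed with chaotic
# bursts from the expanding branch.
def pomeau_manneville(x, alpha):
    if x < 0.5:
        return x * (1.0 + (2.0 * x) ** alpha)
    return 2.0 * x - 1.0

rng = np.random.default_rng(0)
alpha = 0.7                       # illustrative intermittency exponent
x = rng.uniform(0.1, 0.9)         # random initial condition, as in the talk
orbit = []
for _ in range(10_000):
    x = pomeau_manneville(x, alpha)
    orbit.append(x)
orbit = np.array(orbit)
```

The dynamics here is entirely deterministic once the initial condition is drawn, which is exactly the setting in which one asks whether a rescaled slow variable converges in law.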

 

Tue, 10 Nov 2020

15:30 - 16:30
Virtual

On the joint moments of characteristic polynomials of random unitary matrices

Theo Assiotis
(University of Edinburgh)
Further Information

This seminar will be held via zoom. Meeting link will be sent to members of our mailing list (https://lists.maths.ox.ac.uk/mailman/listinfo/random-matrix-theory-anno…) in our weekly announcement on Monday.

Abstract

I will talk about the joint moments of characteristic polynomials of random unitary matrices and their derivatives. In joint work with Jon Keating and Jon Warren we establish the asymptotics of these quantities for general real values of the exponents as the size N of the matrix goes to infinity. This proves a conjecture of Hughes from 2001. In subsequent joint work with Benjamin Bedert, Mustafa Alper Gunes and Arun Soor we focus on the leading-order coefficient in the asymptotics: we connect it to Painlevé equations for general values of the exponents and obtain explicit expressions corresponding to the so-called classical solutions of these equations.
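The simplest derivative-free case of these moments can be checked by direct Monte Carlo simulation. The sketch below (names mine) samples Haar unitaries via QR of a complex Ginibre matrix with the standard phase correction, and checks the s = 1 instance of the Keating-Snaith formula E|det(I - U)|^{2s} = prod_{j=1}^{N} Γ(j)Γ(j+2s)/Γ(j+s)^2, which telescopes to N + 1; the derivative terms central to the talk are not included:

```python
import numpy as np

# Sample an N x N Haar-distributed unitary matrix: QR-factorise a complex
# Ginibre matrix, then fix the phases of the diagonal of R so the
# resulting Q is exactly Haar distributed.
def haar_unitary(n, rng):
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))        # column-wise phase correction

rng = np.random.default_rng(42)
N, n_samples = 5, 20_000
vals = [abs(np.linalg.det(np.eye(N) - haar_unitary(N, rng))) ** 2
        for _ in range(n_samples)]
print(np.mean(vals))  # -> close to N + 1 = 6
```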

Mon, 09 Dec 2019

15:45 - 16:45
L3

Itô-Wentzell-Lions formula for measure-dependent random fields under full and conditional measure flows

GONCALO DOS REIS
(University of Edinburgh)
Abstract


We present several Itô-Wentzell formulae on Wiener spaces for real-valued random fields of Itô type that depend on measures. We distinguish the full- and marginal-measure flow cases. Derivatives with respect to the measure components are understood in the sense of Lions.
This talk is based on joint work with V. Platonov (U. of Edinburgh), see https://arxiv.org/abs/1910.01892.
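For orientation (my addition, not the paper's statement): in the simplest full-measure-flow case, with dX_t = b_t dt + σ_t dW_t, μ_t = Law(X_t), and ∂_μ the Lions derivative, the standard Itô formula along the measure flow for a deterministic function F of the measure alone reads

```latex
dF(\mu_t) \;=\; \mathbb{E}\!\left[\partial_\mu F(\mu_t)(X_t)\cdot b_t\right]dt
\;+\; \frac{1}{2}\,\mathbb{E}\!\left[\operatorname{tr}\!\left(\partial_v\,\partial_\mu F(\mu_t)(X_t)\,\sigma_t\sigma_t^{\top}\right)\right]dt .
```

The formulae of the talk extend this to random fields that are themselves Itô processes and to conditional (marginal) measure flows.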
 

Fri, 31 May 2019

12:00 - 13:00
L4

A Nonlinear Spectral Method for Network Core-Periphery Detection

Desmond Higham
(University of Edinburgh)
Abstract

Dimension reduction is an overarching theme in data science: we enjoy finding informative patterns, features or substructures in large, complex data sets. Within the field of network science, an important problem of this nature is to identify core-periphery structure. Given a network, our task is to assign each node to either the core or periphery. Core nodes should be strongly connected across the whole network whereas peripheral nodes should be strongly connected only to core nodes. More generally, we may wish to assign a non-negative value to each node, with a larger value indicating greater "coreness." This type of problem is related to, but distinct from, community detection (finding clusters) and centrality assignment (finding key players), and it arises naturally in the study of networks in social science and finance. We derive and analyse a new iterative algorithm for detecting network core-periphery structure.

Using techniques in nonlinear Perron-Frobenius theory we prove global convergence to the unique solution of a relaxed version of a natural discrete optimization problem. On sparse networks, the cost of each iteration scales linearly with the number of nodes, making the algorithm feasible for large-scale problems. We give an alternative interpretation of the algorithm from the perspective of maximum likelihood reordering of a new logistic core--periphery random graph model. This viewpoint also gives a new basis for quantitatively judging a core--periphery detection algorithm. We illustrate the algorithm on a range of synthetic and real networks, and show that it offers advantages over the current state-of-the-art.

This is joint work with Francesco Tudisco (Strathclyde).
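The flavour of such an iteration can be sketched in a few lines. This is NOT the authors' exact operator, just a generic nonlinear power method with the same structure: the map below aggregates neighbour scores through a p-mean and is order-preserving and homogeneous of degree one, so nonlinear Perron-Frobenius theory guarantees a unique positive fixed point; each step costs one sparse matrix-vector product.

```python
import numpy as np

# Generic nonlinear power method for coreness scores:
#   x <- normalise( (S x^p)^(1/p) ),  S = symmetrised adjacency.
def core_periphery_scores(A, p=3, iters=100):
    S = A + A.T                         # symmetrise the adjacency matrix
    x = np.ones(S.shape[0])             # positive starting vector
    for _ in range(iters):
        x = (S @ x ** p) ** (1.0 / p)   # p-mean aggregation of neighbours
        x /= np.linalg.norm(x)
    return x

# Synthetic test network: nodes 0-2 form a core linked to every node,
# nodes 3-9 are peripheral and link only to the core.
n = 10
A = np.zeros((n, n))
for i in range(3):
    for j in range(n):
        if i != j:
            A[i, j] = A[j, i] = 1.0
x = core_periphery_scores(A)
print(x[:3].min() > x[3:].max())  # -> True: core nodes score highest
```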

Mon, 03 Jun 2019

14:15 - 15:15
L3

Mean Field Langevin Dynamics and Its Applications to Neural Networks

DAVID SISKA
(University of Edinburgh)
Abstract

 

Neural networks are undoubtedly successful in practical applications. However, a complete mathematical theory of why and when machine learning algorithms based on neural networks work has been elusive. Although various representation theorems ensure the existence of the "perfect" parameters of the network, it has not been proved that these perfect parameters can be (efficiently) approximated by conventional algorithms, such as stochastic gradient descent. This problem is well known, since the arising optimisation problem is non-convex. In this talk we show how the optimisation problem becomes convex in the mean-field limit for one-hidden-layer networks and certain deep neural networks. Moreover, we present optimality criteria for the distribution of the network parameters and show that the nonlinear Langevin dynamics converges to this optimal distribution. This is joint work with Kaitong Hu, Zhenjie Ren and Lukasz Szpruch.
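The Langevin picture can be illustrated with a toy experiment: noisy gradient descent on a one-hidden-layer network, i.e. a discretised Langevin dynamics on the neuron parameters, which in the mean-field view are samples from a parameter distribution. All sizes, step choices and names below are illustrative, not from the paper:

```python
import numpy as np

# Toy 1-d regression: fit y = sin(3x) with a width-m tanh network trained
# by full-batch gradient descent plus Gaussian noise (discretised Langevin).
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])

m = 50                                  # hidden width ("number of particles")
W = rng.standard_normal((m, 1))         # input weights
a = rng.standard_normal(m) / m          # output weights, mean-field 1/m scaling

def mse(X, y, W, a):
    return np.mean((np.tanh(X @ W.T) @ a - y) ** 2)

lr, sigma = 0.05, 1e-3                  # step size and Langevin noise level
loss0 = mse(X, y, W, a)
for _ in range(2000):
    H = np.tanh(X @ W.T)                # (n, m) hidden activations
    r = H @ a - y                       # residuals
    grad_a = 2 * H.T @ r / len(y)
    grad_W = 2 * ((1 - H**2) * r[:, None] * a).T @ X / len(y)
    a -= lr * grad_a + sigma * np.sqrt(lr) * rng.standard_normal(m)
    W -= lr * grad_W + sigma * np.sqrt(lr) * rng.standard_normal((m, 1))
print(mse(X, y, W, a) < loss0)  # -> True: the noisy dynamics reduces the loss
```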

 

Fri, 10 May 2019

14:00 - 15:30
L6

Scattering of inertia-gravity waves in geostrophic turbulence

Prof. Jacques Vanneste
(University of Edinburgh)
Abstract

Inertia-gravity waves (IGWs) are ubiquitous in the ocean and the atmosphere. Once generated (by tides, topography, convection and other processes), they propagate and scatter in the large-scale, geostrophically-balanced background flow. I will discuss models of this scattering which represent the background flow as a random field with known statistics. Without assumption of spatial scale separation between waves and flow, the scattering is described by a kinetic equation involving a scattering cross section determined by the energy spectrum of the flow. In the limit of small-scale waves, this equation reduces to a diffusion equation in wavenumber space. This predicts, in particular, IGW energy spectra scaling as k^{-2}, consistent with observations in the atmosphere and ocean, lending some support to recent claims that (sub)mesoscale spectra can be attributed to almost linear IGWs.  The theoretical predictions are checked against numerical simulations of the three-dimensional Boussinesq equations.
(Joint work with Miles Savva and Hossein Kafiabad.)

Fri, 01 Mar 2019

12:00 - 13:00
L4

Modular, Infinite, and Other Deep Generative Models of Data

Charles Sutton
(University of Edinburgh)
Abstract

Deep generative models provide powerful tools for fitting difficult distributions, such as those arising in modelling natural images. But many of these methods, including variational autoencoders (VAEs) and generative adversarial networks (GANs), can be notoriously difficult to fit.

One well-known problem is mode collapse, which means that models can learn to characterize only a few modes of the true distribution. To address this, we introduce VEEGAN, which features a reconstructor network, reversing the action of the generator by mapping from data to noise. Our training objective retains the original asymptotic consistency guarantee of GANs, and can be interpreted as a novel autoencoder loss over the noise.

Second, maximum mean discrepancy networks (MMD-nets) avoid some of the pathologies of GANs, but have not been able to match their performance. We present a new method of training MMD-nets, based on mapping the data into a lower-dimensional space in which MMD training can be more effective. We call these networks Ratio-based MMD Nets, and show that, somewhat mysteriously, they have dramatically better performance than MMD-nets.
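The quantity MMD-nets minimise can be written down directly. Below is the standard unbiased estimator of the squared maximum mean discrepancy with a Gaussian kernel; the function name and bandwidth choice are illustrative, not the paper's:

```python
import numpy as np

# Unbiased estimator of MMD^2 between samples X ~ P and Y ~ Q with a
# Gaussian kernel. Diagonal terms are removed from the within-sample
# averages, which is what makes the estimator unbiased.
def mmd2_unbiased(X, Y, bandwidth=1.0):
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * bandwidth ** 2))
    Kxx, Kyy, Kxy = k(X, X), k(Y, Y), k(X, Y)
    n, m = len(X), len(Y)
    term_x = (Kxx.sum() - np.trace(Kxx)) / (n * (n - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (m * (m - 1))
    return term_x + term_y - 2 * Kxy.mean()

rng = np.random.default_rng(7)
P = rng.standard_normal((500, 2))
Q_same = rng.standard_normal((500, 2))
Q_shift = rng.standard_normal((500, 2)) + 2.0
print(mmd2_unbiased(P, Q_same))   # -> near 0: same distribution
print(mmd2_unbiased(P, Q_shift))  # -> clearly positive: shifted distribution
```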

A final problem is deciding how many latent components are necessary for a deep generative model to fit a certain data set. We present a nonparametric Bayesian approach to this problem, based on defining a (potentially) infinitely wide deep generative model. Fitting this model is possible by combining variational inference with a Monte Carlo method from statistical physics called Russian roulette sampling. Perhaps surprisingly, we find that this modification helps with the mode collapse problem as well.
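The Russian roulette idea can be isolated in a few lines: randomly truncate an infinite sum and reweight each surviving term by its survival probability, which keeps the estimator unbiased. A generic sketch, not the paper's estimator:

```python
import numpy as np

# Unbiased estimate of S = sum_{k>=1} a_k from a randomly truncated
# partial sum: continue past term k with probability q, and divide term k
# by its survival probability q^(k-1).
def roulette_estimate(a, q, rng):
    total, weight = 0.0, 1.0
    k = 1
    while True:
        total += a(k) / weight
        if rng.random() > q:          # roulette fires: stop after term k
            return total
        weight *= q                   # survival probability of term k+1
        k += 1

rng = np.random.default_rng(3)
# Geometric series: sum_{k>=1} 0.5^k = 1.
est = np.mean([roulette_estimate(lambda k: 0.5 ** k, 0.7, rng)
               for _ in range(50_000)])
print(est)  # -> close to 1.0
```

Term k appears in the sum exactly when the first k-1 roulette draws survive, which happens with probability q^(k-1), so dividing by that weight makes the expectation equal to the full series.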

 

Thu, 07 Jun 2018

16:00 - 17:30
L4

Large Deviations for McKean Vlasov Equations and Importance Sampling

Goncalo dos Reis
(University of Edinburgh)
Abstract


We discuss two Freidlin-Wentzell large deviation principles for McKean-Vlasov equations (MV-SDEs) in certain path-space topologies. The equations have a drift of polynomial growth, and an existence/uniqueness result is provided. We then apply Monte Carlo methods to evaluate expectations of functionals of solutions to MV-SDEs with drifts of super-linear growth. We assume that the MV-SDE is approximated in the standard manner by means of an interacting particle system and propose two importance sampling (IS) techniques to reduce the variance of the resulting Monte Carlo estimator. In the "complete measure change" approach, the IS measure change is applied simultaneously in the coefficients and in the expectation to be evaluated. In the "decoupling" approach, we first estimate the law of the solution in a set of simulations without measure change, and then perform a second set of simulations under the importance sampling measure using the approximate solution law computed in the first step.
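The "standard manner" of approximation is an interacting-particle Euler scheme, where the law of the solution is replaced by the empirical measure of N particles. A minimal sketch for a toy linear MV-SDE (the linear drift is a stand-in for the super-linear drifts of the talk; all constants are illustrative):

```python
import numpy as np

# Toy McKean-Vlasov SDE:  dX_t = -(X_t + E[X_t]) dt + sigma dW_t.
# The mean m_t = E[X_t] solves m' = -2m, so it decays to 0. In the
# particle scheme, E[X_t] is replaced by the empirical mean of N particles.
rng = np.random.default_rng(11)
N, T, dt, sigma = 2000, 3.0, 0.01, 0.3
X = np.ones(N)                          # all particles start at 1
for _ in range(int(T / dt)):
    m = X.mean()                        # empirical proxy for E[X_t]
    X += -(X + m) * dt + sigma * np.sqrt(dt) * rng.standard_normal(N)
print(abs(X.mean()))  # -> small: the empirical mean has relaxed towards 0
```

Both IS techniques of the talk operate on top of a scheme of exactly this shape: the "complete measure change" tilts the particle dynamics and the payoff together, while "decoupling" freezes an approximate law from a first run before changing measure.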
