Fri, 12 Mar 2021

12:00 - 13:00

The Metric is All You Need (for Disentangling)

David Pfau
(DeepMind)
Abstract

Learning a representation from data that disentangles different factors of variation is hypothesized to be a critical ingredient for unsupervised learning. Defining disentangling is challenging - a "symmetry-based" definition was provided by Higgins et al. (2018), but no prescription was given for how to learn such a representation. We present a novel nonparametric algorithm, the Geometric Manifold Component Estimator (GEOMANCER), which partially answers the question of how to implement symmetry-based disentangling. We show that fully unsupervised factorization of a data manifold is possible if the true metric of the manifold is known and each factor manifold has nontrivial holonomy – for example, rotation in 3D. Our algorithm works by estimating the subspaces that are invariant under random walk diffusion, giving an approximation to the de Rham decomposition from differential geometry. We demonstrate the efficacy of GEOMANCER on several complex synthetic manifolds. Our work reduces the question of whether unsupervised disentangling is possible to the question of whether unsupervised metric learning is possible, providing a unifying insight into the geometric nature of representation learning.
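
As an illustrative aside (not taken from the talk or the authors' code): nonparametric methods of this kind start from a random-walk diffusion operator built over the data. The sketch below constructs such an operator for a point cloud with NumPy; the function name diffusion_operator, the kNN construction, and the bandwidth heuristic are choices of this sketch, and the invariant-subspace estimation and de Rham decomposition steps of GEOMANCER itself are not reproduced here.

```python
# Illustrative sketch only: a row-stochastic random-walk (diffusion) operator
# on a point cloud, the basic object that diffusion-based manifold methods
# start from. This is NOT the GEOMANCER algorithm itself.
import numpy as np

def diffusion_operator(X, k=10, eps=None):
    """Row-stochastic random-walk matrix for a point cloud X of shape (n, d)."""
    n = X.shape[0]
    sq = np.sum(X ** 2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T   # pairwise squared distances
    np.fill_diagonal(D2, 0.0)
    # keep only each point's k nearest neighbours, then symmetrise the graph
    idx = np.argsort(D2, axis=1)[:, 1:k + 1]
    mask = np.zeros_like(D2, dtype=bool)
    mask[np.repeat(np.arange(n), k), idx.ravel()] = True
    mask |= mask.T
    if eps is None:
        eps = np.median(D2[mask])                     # simple bandwidth heuristic
    W = np.where(mask, np.exp(-D2 / eps), 0.0)        # Gaussian kernel on the graph
    return W / W.sum(axis=1, keepdims=True)           # P[i, j]: step probability i -> j

# Example: diffusion operator of a noisy circle and its leading spectrum
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 500)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.01 * rng.standard_normal((500, 2))
P = diffusion_operator(X)
evals = np.sort(np.linalg.eigvals(P).real)[::-1]
print(evals[:4])   # eigenvalues close to 1 correspond to slow diffusion directions
```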

 

Fri, 05 Mar 2021

12:00 - 13:00

Linear convergence of an alternating polar decomposition method for low rank orthogonal tensor approximations

Ke Ye
(Chinese Academy of Sciences)
Abstract

Low rank orthogonal tensor approximation (LROTA) is an important problem in tensor computations and their applications. A classical and widely used algorithm is the alternating polar decomposition method (APD). In this talk, I will first give a very brief introduction to tensors and their decompositions. After that, an improved version of the classical APD, named iAPD, will be proposed, and the following four fundamental properties of iAPD will be discussed: (i) the algorithm converges globally and the whole sequence converges to a KKT point without any assumption; (ii) it exhibits an overall sublinear convergence with an explicit rate which is sharper than the usual O(1/k) for first-order methods in optimization; (iii) more importantly, it converges R-linearly for a generic tensor without any assumption; (iv) for almost all LROTA problems, iAPD reduces to APD after finitely many iterations if it converges to a local minimizer. If time permits, I will also present some numerical experiments.
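
As an illustrative aside (not taken from the talk): the classical alternating polar decomposition idea for a third-order tensor can be sketched in a few lines, using the standard observation that, with the other factors and the scalars fixed, the optimal column-orthonormal factor is the orthogonal polar factor of a contracted matrix. The names apd and polar_orthogonal_factor below are this sketch's own, the exact formulation used in the talk may differ, and the iAPD refinements are not reproduced.

```python
# Minimal sketch of classical APD for a rank-r orthogonal approximation
# T ~ sum_i sigma_i * u_i (x) v_i (x) w_i, with U, V, W column-orthonormal.
# The iAPD variant analysed in the talk adds safeguards not shown here.
import numpy as np

def polar_orthogonal_factor(A):
    """Orthogonal factor Q of the polar decomposition A = Q P (via SVD)."""
    U_, _, Vt = np.linalg.svd(A, full_matrices=False)
    return U_ @ Vt

def apd(T, r, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    # random column-orthonormal starting factors
    U, V, W = (np.linalg.qr(rng.standard_normal((dim, r)))[0] for dim in T.shape)
    for _ in range(iters):
        sigma = np.einsum('abc,ai,bi,ci->i', T, U, V, W)
        # each factor update: polar factor of the sigma-weighted contraction
        U = polar_orthogonal_factor(np.einsum('abc,bi,ci,i->ai', T, V, W, sigma))
        sigma = np.einsum('abc,ai,bi,ci->i', T, U, V, W)
        V = polar_orthogonal_factor(np.einsum('abc,ai,ci,i->bi', T, U, W, sigma))
        sigma = np.einsum('abc,ai,bi,ci->i', T, U, V, W)
        W = polar_orthogonal_factor(np.einsum('abc,ai,bi,i->ci', T, U, V, sigma))
    sigma = np.einsum('abc,ai,bi,ci->i', T, U, V, W)
    return sigma, U, V, W

# Example: approximate an exactly orthogonally decomposable tensor
rng = np.random.default_rng(1)
r, n = 2, 5
Q1, Q2, Q3 = (np.linalg.qr(rng.standard_normal((n, r)))[0] for _ in range(3))
s = np.array([3.0, 1.0])
T = np.einsum('i,ai,bi,ci->abc', s, Q1, Q2, Q3)
sigma, U, V, W = apd(T, r)
approx = np.einsum('i,ai,bi,ci->abc', sigma, U, V, W)
print(np.linalg.norm(T - approx) / np.linalg.norm(T))  # small if a global minimiser is found
```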

Fri, 26 Feb 2021

12:00 - 13:00

The magnitude of point-cloud data (cancelled)

Nina Otter
(UCLA)
Abstract

Magnitude is an isometric invariant of metric spaces that was introduced by Tom Leinster in 2010, and is currently the object of intense research, since it has been shown to encode many invariants of a metric space such as volume, dimension, and capacity.
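
As a concrete illustration (not part of the talk): for a finite metric space, magnitude has an elementary linear-algebra description. With similarity matrix Z_ij = exp(-d(x_i, x_j)), a weighting w solves Zw = 1 and the magnitude is the sum of its entries. A minimal NumPy sketch, with names of this sketch's own choosing:

```python
# Minimal sketch: magnitude of a finite metric space / point cloud.
# Z_ij = exp(-d(x_i, x_j)); when Z is invertible, the magnitude is the sum
# of the entries of the weighting w solving Z w = 1. Scaling the metric by
# t > 0 gives the magnitude function t -> |tX|.
import numpy as np

def magnitude(points, t=1.0):
    diff = points[:, None, :] - points[None, :, :]
    D = np.sqrt((diff ** 2).sum(axis=-1))          # pairwise Euclidean distances
    Z = np.exp(-t * D)                             # similarity matrix of tX
    w = np.linalg.solve(Z, np.ones(len(points)))   # weighting: Z w = 1
    return w.sum()                                 # magnitude = sum of the weighting

# Sanity check: two points at distance d have magnitude 1 + tanh(d/2)
pts = np.array([[0.0], [1.0]])
print(magnitude(pts), 1.0 + np.tanh(0.5))
```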

Magnitude homology is a homology theory for metric spaces that was introduced by Hepworth-Willerton and Leinster-Shulman, and categorifies magnitude in much the same way that the singular homology of a topological space categorifies its Euler characteristic.

In this talk I will first introduce magnitude and magnitude homology. I will then give an overview of existing results and current research in this area, explain how magnitude homology is related to persistent homology, and finally discuss new stability results for magnitude and how it can be used to study point cloud data.

This talk is based on  joint work in progress with Miguel O’Malley and Sara Kalisnik, as well as the preprint https://arxiv.org/abs/1807.01540.

Mon, 22 Feb 2021

16:00 - 17:00

Non-equilibrium fluctuations in interacting particle systems and conservative stochastic PDE

Benjamin Fehrman
(Oxford University)
Abstract

 

Interacting particle systems have found diverse applications in mathematics and several related fields, including statistical physics, population dynamics, and machine learning. We will focus, in particular, on the zero range process and the symmetric simple exclusion process. The large-scale behavior of these systems is essentially deterministic, and is described in terms of a hydrodynamic limit. However, the particle process does exhibit large fluctuations away from its mean. Such deviations, though rare, can have significant consequences, such as a concentration of energy or the appearance of a vacuum, which make them important to understand and simulate.

In this talk, which is based on joint work with Benjamin Gess, I will introduce a continuum model for simulating rare events in the zero range and symmetric simple exclusion processes. The model is based on an approximating sequence of stochastic partial differential equations with nonlinear, conservative noise. The solutions capture, to first order, the central limit fluctuations of the particle system, and they correctly simulate rare events in terms of a large deviations principle.
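
As a toy illustration (not the continuum model of the talk): the symmetric simple exclusion process mentioned above can be simulated directly on a periodic one-dimensional lattice, with each particle attempting nearest-neighbour jumps at rate 1 and jumps onto occupied sites suppressed. All names below are this sketch's own.

```python
# Toy sketch: symmetric simple exclusion process (SSEP) on a periodic 1D lattice.
# Each particle attempts to jump to a uniformly chosen neighbour at rate 1;
# the jump is suppressed if the target site is already occupied.
import numpy as np

def ssep(n_sites=200, density=0.5, t_max=50.0, seed=0):
    rng = np.random.default_rng(seed)
    eta = (rng.random(n_sites) < density).astype(int)   # occupation variables
    n_particles = eta.sum()
    t = 0.0
    while t < t_max and n_particles > 0:
        # with n_particles rate-1 clocks, the next attempted jump arrives
        # after an exponential waiting time of mean 1 / n_particles
        t += rng.exponential(1.0 / n_particles)
        site = rng.choice(np.flatnonzero(eta))           # pick a particle
        target = (site + rng.choice([-1, 1])) % n_sites  # nearest neighbour
        if eta[target] == 0:                             # exclusion rule
            eta[site], eta[target] = 0, 1
    return eta

eta = ssep()
print("final density:", eta.mean())   # particle number (hence density) is conserved
```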

Wed, 28 Apr 2021

10:00 - 11:30
Virtual

Introduction to SPDEs from Probability and PDE - Lecture 4 of 4

Dr. Avi Mayorcas
(Former University of Oxford D. Phil. Student)
Further Information

Structure: 4 x 1.5hr Lectures 

Lecture 4: Further Topics and Directions (time permitting)

  • Regularity of solutions
  • Ergodicity
  • Pathwise approach to SPDE

 

Abstract

The course will aim to provide an introduction to stochastic PDEs from the classical perspective, that being a mixture of stochastic analysis and PDE analysis. We will focus in particular on the variational approach to semi-linear parabolic problems, à la Lions. There will also be comments on other models and approaches.

  Suggested Pre-requisites: Suitable for OxPDE students, but also of interest to functional analysts, geometers, probabilists, numerical analysts and anyone who has a suitable level of prerequisite knowledge.

Tue, 27 Apr 2021

10:00 - 11:30
Virtual

Introduction to SPDEs from Probability and PDE - Lecture 3 of 4

Dr. Avi Mayorcas
(Former University of Oxford D. Phil. Student)
Further Information

Structure: 4 x 1.5hr Lectures 

Lecture 3: Variational Approach to Parabolic SPDE

  • Itô's formula in Hilbert spaces
  • Variational approach to monotone, coercive SPDE
  • Concrete examples
Abstract

The course will aim to provide an introduction to stochastic PDEs from the classical perspective, that being a mixture of stochastic analysis and PDE analysis. We will focus in particular on the variational approach to semi-linear parabolic problems, à la Lions. There will also be comments on other models and approaches.

  Suggested Pre-requisites: The course is broadly aimed at graduate students with some knowledge of PDE theory and/or stochastic  analysis. Familiarity with measure theory and functional analysis will be useful.

Wed, 21 Apr 2021

10:00 - 11:30
Virtual

Introduction to SPDEs from Probability and PDE - Lecture 2 of 4

Dr. Avi Mayorcas
(Former University of Oxford D. Phil. Student)
Further Information

Structure: 4 x 1.5hr Lectures 

Lecture 2: Variational Approach to Deterministic PDE

  • Variational approach to linear parabolic equations
  • Variational approaches to non-linear parabolic equations
Abstract

The course will aim to provide an introduction to stochastic PDEs from the classical perspective, that being a mixture of stochastic analysis and PDE analysis. We will focus in particular on the variational approach to semi-linear parabolic problems, à la Lions. There will also be comments on other models and approaches.

  Suggested Pre-requisites: The course is broadly aimed at graduate students with some knowledge of PDE theory and/or stochastic  analysis. Familiarity with measure theory and functional analysis will be useful.
