Thu, 25 Nov 2021
14:00
Virtual

Adaptive multilevel delayed acceptance

Tim Dodwell
(University of Exeter)
Abstract

Uncertainty Quantification through Markov Chain Monte Carlo (MCMC) can be prohibitively expensive for target probability densities with expensive likelihood functions, for instance when each evaluation involves solving a Partial Differential Equation (PDE), as is the case in a wide range of engineering applications. Multilevel Delayed Acceptance (MLDA) with an Adaptive Error Model (AEM) is a novel approach that alleviates this problem by exploiting a hierarchy of models of increasing complexity and cost, and correcting the inexpensive models on the fly. The method has been integrated within the open-source probabilistic programming package PyMC3 and is available in the latest development version.

In this talk I will discuss the problems with Multilevel Markov Chain Monte Carlo (Dodwell et al. 2015). In doing so, we will prove detailed balance for Adaptive Multilevel Delayed Acceptance and show that multilevel variance reduction can be achieved without bias, which is not possible in the original MLMCMC framework.

I will then describe our implementation in the latest version of PyMC3 and demonstrate how, for classical inverse-problem benchmarks, the AMLDA sampler offers substantial computational savings (more than a 100-fold speed-up).
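The two-stage acceptance rule at the heart of delayed acceptance can be sketched in a few lines of plain Python. This is a minimal illustration of the idea, not the PyMC3 implementation: the fine and coarse log-densities below are invented stand-ins (a standard normal target and a deliberately mis-scaled cheap surrogate).

```python
import math
import random

def log_fine(x):
    return -0.5 * x * x          # "expensive" model: standard normal log-density

def log_coarse(x):
    return -0.5 * x * x / 1.5    # cheap, biased surrogate with inflated variance

def delayed_acceptance(n_samples, step=1.0, seed=0):
    """Two-level delayed acceptance Metropolis sampler (toy sketch)."""
    rng = random.Random(seed)
    x = 0.0
    chain = []
    for _ in range(n_samples):
        y = x + rng.gauss(0.0, step)
        # Stage 1: screen the proposal using only the cheap coarse model.
        log_a1 = log_coarse(y) - log_coarse(x)
        if math.log(rng.random()) < log_a1:
            # Stage 2: correct with the fine model. The second-stage ratio
            # divides out the coarse ratio, so the chain targets the fine
            # density exactly and detailed balance is preserved.
            log_a2 = (log_fine(y) - log_fine(x)) - log_a1
            if math.log(rng.random()) < log_a2:
                x = y
        chain.append(x)
    return chain
```

Proposals rejected at stage one never touch the fine model; when each fine evaluation is a PDE solve, this screening is where the computational savings come from.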

Finally, I will talk heuristically about new and future research, in which we seek to develop parallel strategies for this inherently sequential sampler, and point to interesting application areas in which the method is proving particularly effective.


--

This talk will be in person.

Thu, 21 Oct 2021
14:00
Virtual

Randomized Methods for Sublinear Time Low-Rank Matrix Approximation

Cameron Musco
(University of Massachusetts)
Abstract

I will discuss recent advances in sampling methods for positive semidefinite (PSD) matrix approximation. In particular, I will show how new techniques based on recursive leverage score sampling yield a surprising algorithmic result: we give a method for computing a near-optimal rank-k approximation to any n x n PSD matrix in O(n * k^2) time. When k is not too large, our algorithm runs in sublinear time -- i.e., it does not need to read all entries of the matrix. This result illustrates the ability of randomized methods to exploit the structure of PSD matrices and go well beyond what is possible with traditional algorithmic techniques. I will discuss a number of current research directions and open questions, focused on applications of randomized methods to sublinear time algorithms for structured matrix problems.
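To make the structure of such column-sampling approximations concrete, here is a toy Nyström-style sketch in plain Python. It is not the recursive leverage-score algorithm from the talk: the landmark columns are chosen by hand, and the example matrix is exactly rank 2, so the reconstruction C W^{-1} C^T recovers K exactly.

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

# K = A A^T is a 4x4 PSD matrix of rank 2.
A = [[1, 0], [0, 1], [1, 1], [2, -1]]
K = matmul(A, transpose(A))

landmarks = [0, 1]                      # sampled column indices (chosen by hand here)
C = [[K[i][j] for j in landmarks] for i in range(len(K))]   # sampled columns
W = [[K[i][j] for j in landmarks] for i in landmarks]       # landmark block

# Invert the 2x2 landmark block W directly.
det = W[0][0] * W[1][1] - W[0][1] * W[1][0]
W_inv = [[W[1][1] / det, -W[0][1] / det],
         [-W[1][0] / det, W[0][0] / det]]

# Nystrom reconstruction: K_hat = C W^{-1} C^T.
K_hat = matmul(matmul(C, W_inv), transpose(C))
error = max(abs(K[i][j] - K_hat[i][j])
            for i in range(len(K)) for j in range(len(K)))
```

Note that the approximation only ever reads the sampled columns of K -- n*k of the n^2 entries -- which is the sense in which such schemes can run in sublinear time.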

--

A link for this talk will be sent to our mailing list a day or two in advance.  If you are not on the list and wish to be sent a link, please contact @email.

Thu, 14 Oct 2021
14:00
Virtual

What is the role of a neuron?

David Bau
(MIT)
Abstract


One of the great challenges of neural networks is to understand how they work.  For example: does a neuron encode a meaningful signal on its own?  Or is a neuron simply an undistinguished and arbitrary component of a feature vector space?  The tension between the neuron doctrine and the population coding hypothesis is one of the classical debates in neuroscience. It is a difficult debate to settle without an ability to monitor every individual neuron in the brain.


Within artificial neural networks we can examine every neuron. Beginning with the simple proposal that an individual neuron might represent one internal concept, we conduct studies relating deep network neurons to human-understandable concepts in a concrete, quantitative way: Which neurons? Which concepts? Are neurons more meaningful than an arbitrary feature basis? Do neurons play a causal role? We examine both simplified settings and state-of-the-art networks in which neurons learn how to represent meaningful objects within the data without explicit supervision.
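In the spirit of the quantitative neuron-to-concept comparisons described above, the following fabricated toy example scores individual units by intersection-over-union against a binary concept mask. All numbers are invented for illustration; this is not the authors' pipeline, only a sketch of what "which neurons? which concepts?" can mean quantitatively.

```python
def iou(mask_a, mask_b):
    """Intersection-over-union of two binary masks."""
    inter = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    union = sum(1 for a, b in zip(mask_a, mask_b) if a or b)
    return inter / union if union else 0.0

def binarize(activations, threshold):
    return [1 if a > threshold else 0 for a in activations]

# Fabricated data: activations of three "neurons" over eight inputs, and a
# binary concept label per input (e.g. "contains a dog").
neuron_activations = [
    [0.9, 0.8, 0.1, 0.7, 0.2, 0.1, 0.9, 0.0],   # tracks the concept well
    [0.5, 0.4, 0.6, 0.5, 0.5, 0.4, 0.6, 0.5],   # mostly uninformative
    [0.1, 0.0, 0.9, 0.2, 0.8, 0.9, 0.1, 0.7],   # anti-correlated
]
concept = [1, 1, 0, 1, 0, 0, 1, 0]

scores = [iou(binarize(acts, 0.5), concept) for acts in neuron_activations]
best = max(range(len(scores)), key=lambda i: scores[i])
```

A high IoU for one unit suggests it individually encodes the concept; uniformly low scores across units would point instead toward a distributed (population) code.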


Following this inquiry in computer vision leads us to insights about the computational structure of practical deep networks that enable several new applications, including semantic manipulation of objects in an image; understanding of the sparse logic of a classifier; and quick, selective editing of generalizable rules within a fully trained generative network.  It also presents an unanswered mathematical question: why is such disentanglement so pervasive?


In the talk, we challenge the notion that the internal calculations of a neural network must be hopelessly opaque. Instead, we propose to tear back the curtain and chart a path through the detailed structure of a deep network by which we can begin to understand its logic.


Wed, 22 Sep 2021

09:00 - 10:00
Virtual

Stochastic Flows and Rough Differential Equations on Foliated Spaces

Yuzuru Inahama
(Kyushu University)
Further Information
Abstract

Stochastic differential equations (SDEs) on compact foliated spaces were introduced a few years ago. As a corollary, a leafwise Brownian motion on a compact foliated space was obtained as a solution to an SDE. In this work we construct stochastic flows associated with the SDEs by using rough path theory, which is something like a 'deterministic version' of Itô's SDE theory.

This is joint work with Kiyotaka Suzaki.

Tue, 29 Jun 2021
14:00
Virtual

Asymptotics for the wave equation on black hole spacetimes

Stefanos Aretakis
(Toronto)
Abstract

We will present the precise late-time asymptotics for scalar fields on both extremal and subextremal black holes, including the full Reissner-Nordström family and the subextremal Kerr family. Asymptotics for higher angular modes will be presented for all cases. Applications in observational signatures will also be discussed. This work is joint with Y. Angelopoulos (Caltech) and D. Gajic (Cambridge).

Mon, 28 Jun 2021
11:30
Virtual

Feynman integrals from the viewpoint of Picard-Lefschetz theory

Marko Berghoff
(Oxford)
Abstract

I will present work in progress with Erik Panzer, Matteo Parisi and Ömer Gürdoğan on the analytic structure of Feynman(esque) integrals: We consider integrals of meromorphic differential forms over relative cycles in a compact complex manifold, with the underlying geometry encoded in a certain (parameter-dependent) subspace arrangement (e.g. Feynman integrals in their parametric representation). I will explain how the analytic structure of such integrals can be studied via methods from differential topology; this is the seminal work of Pham et al. (using tools and methods developed by Leray, Thom, Picard-Lefschetz, etc.). Although their work covers a very general setup, the case we need for Feynman integrals has never been worked out in full detail. I will comment on the gaps that have to be filled to make the theory work, then discuss how much information about the analytic structure of integrals can be derived from a careful study of the corresponding subspace arrangement.

Mon, 21 Jun 2021

16:00 - 17:00
Virtual

Correlations of almost primes

Natalie Evans
(KCL)
Abstract

The Hardy-Littlewood generalised twin prime conjecture states an asymptotic formula for the number of primes $p\le X$ such that $p+h$ is prime for any non-zero even integer $h$. While this conjecture remains wide open, Matomäki, Radziwiłł and Tao proved that it holds on average over $h$, improving on a previous result of Mikawa. In this talk we will discuss an almost prime analogue of the Hardy-Littlewood conjecture for which we can go beyond what is known for primes. We will describe some recent work in which we prove an asymptotic formula for the number of almost primes $n=p_1p_2 \le X$ such that $n+h$ has exactly two prime factors which holds for a very short average over $h$.
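The quantity being counted can be checked numerically by brute force for small $X$. The sketch below counts prime factors with multiplicity; that convention, and the helper names, are assumptions of this illustration, not the talk's precise definitions.

```python
def num_prime_factors(n):
    """Omega(n): number of prime factors of n, counted with multiplicity."""
    count, d = 0, 2
    while d * d <= n:
        while n % d == 0:
            n //= d
            count += 1
        d += 1
    if n > 1:
        count += 1
    return count

def count_correlated_almost_primes(X, h):
    """Count n <= X with exactly two prime factors such that n + h also
    has exactly two prime factors (brute force, small X only)."""
    return sum(1 for n in range(4, X + 1)
               if num_prime_factors(n) == 2 and num_prime_factors(n + h) == 2)
```

For example, $n = 4$ and $n + 2 = 6$ both have exactly two prime factors, so the pair is counted for $h = 2$.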

Fri, 18 Jun 2021

14:00 - 15:00
Virtual

Jacobson's Commutativity Problem

Mike Daas
(Leiden University)
Abstract

It is a well-known fact that Boolean rings, those rings in which $x^2 = x$ for all $x$, are necessarily commutative. There is a short and completely elementary proof of this. One may wonder what the situation is for rings in which $x^n = x$ for all $x$, where $n > 2$ is some positive integer. Jacobson and Herstein proved a very general theorem regarding these rings, and the proof follows a widely applicable strategy that can often be used to reduce questions about general rings to more manageable ones. We discuss this strategy, but will also focus on a different approach: can we also find "elementary" proofs of some special cases of the theorem? We treat a number of these explicit computations, among which are a few new results.
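For the Boolean case $n = 2$, the elementary argument runs as follows (a standard computation, spelled out here for illustration):

```latex
% In a ring with x^2 = x for all x:
\begin{align*}
(x+x)^2 = x+x &\implies 4x = 2x \implies 2x = 0,
  \quad\text{so } x = -x \text{ for all } x,\\
(x+y)^2 = x+y &\implies x^2 + xy + yx + y^2 = x + y \implies xy + yx = 0,\\
&\implies xy = -yx = yx \quad\text{for all } x, y.
\end{align*}
```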

Wed, 08 Sep 2021

09:00 - 10:00
Virtual

Co-clustering Analysis of Multidimensional Big Data

Hong Yan
(City University of Hong Kong)
Further Information
Abstract

Although a multidimensional data array can be very large, it may contain coherence patterns much smaller in size. For example, we may need to detect a subset of genes that co-express under a subset of conditions. In this presentation, we discuss our recently developed co-clustering algorithms for the extraction and analysis of coherent patterns in big datasets. In our method, a co-cluster, corresponding to a coherent pattern, is represented as a low-rank tensor and it can be detected from the intersection of hyperplanes in a high dimensional data space. Our method has been used successfully for DNA and protein data analysis, disease diagnosis, drug therapeutic effect assessment, and feature selection in human facial expression classification. Our method can also be useful for many other real-world data mining, image processing and pattern recognition applications.
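As a toy illustration of the kind of coherent pattern meant here (not the authors' hyperplane/low-rank-tensor algorithm), the following brute-force sketch finds the largest set of rows of a small matrix that share a single constant value across some pair of columns; the data is fabricated.

```python
from itertools import combinations

def find_constant_cocluster(M, width=2, min_rows=2):
    """Brute-force search for the largest set of rows sharing one constant
    value across some `width` columns (a toy coherent pattern)."""
    best = None
    for cols in combinations(range(len(M[0])), width):
        groups = {}
        for r, row in enumerate(M):
            vals = {row[c] for c in cols}
            if len(vals) == 1:                 # row is constant on these columns
                groups.setdefault(vals.pop(), []).append(r)
        for value, rows in groups.items():
            if len(rows) >= min_rows and (best is None or len(rows) > len(best[0])):
                best = (rows, list(cols), value)
    return best

# Fabricated data with a planted constant block: rows 1 and 3, columns 0 and 2.
M = [[1, 2, 3, 4],
     [5, 0, 5, 9],
     [7, 8, 2, 6],
     [5, 1, 5, 3]]
```

Real co-clustering must of course scale far beyond such enumeration, which is where low-rank tensor representations come in; this sketch only shows what a detected co-cluster looks like.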

Thu, 17 Jun 2021
11:30
Virtual

Compressible types in NIP theories

Itay Kaplan
(The Hebrew University of Jerusalem)
Abstract

I will discuss compressible types and relate them to uniform definability of types over finite sets (UDTFS), to uniformity of honest definitions and to the construction of compressible models in the context of (local) NIP. All notions will be defined during the talk.
Joint with Martin Bays and Pierre Simon.
