Thu, 04 Jun 2020

16:45 - 17:30
Virtual

Cuntz semigroups

Hannes Thiel
(University of Münster)
Further Information

Part of the UK virtual operator algebras seminar: https://sites.google.com/view/uk-operator-algebras-seminar/home

Abstract

The Cuntz semigroup is a geometric refinement of K-theory that plays an important role in the structure theory of C*-algebras. It is defined analogously to the Murray-von Neumann semigroup by using equivalence classes of positive elements instead of projections.
Starting with the definition of the Cuntz semigroup of a C*-algebra, we will look at some of its classical applications. I will then talk about the recent breakthroughs in the structure theory of Cuntz semigroups and some of the consequences.

Thu, 04 Jun 2020

16:00 - 16:45
Virtual

Expanders and generalisations

Ana Khukhro
(University of Cambridge)
Further Information

Part of the UK virtual operator algebras seminar: https://sites.google.com/view/uk-operator-algebras-seminar/home 

Abstract

After recalling some motivation for studying highly connected graphs in the context of operator algebras and large-scale geometry, we will introduce the notion of "asymptotic expansion" recently defined by Li, Nowak, Špakula and Zhang. We will explore some applications of this definition, hopefully culminating in joint work with Li, Vigolo and Zhang.

Thu, 04 Jun 2020
14:00
Virtual

A Mathematical Perspective of Machine Learning

Weinan E
(Princeton University)
Abstract

The heart of modern machine learning (ML) is the approximation of high-dimensional functions. Traditional approaches, such as approximation by piecewise polynomials, wavelets, or other linear combinations of fixed basis functions, suffer from the curse of dimensionality (CoD). We will present a mathematical perspective of ML, focusing on the issue of CoD. We will discuss three major issues: approximation theory and error analysis of modern ML models, dynamics and qualitative behavior of gradient descent algorithms, and ML from a continuous viewpoint. We will see that at the continuous level, ML can be formulated as a series of reasonably nice variational and PDE-like problems. Modern ML models/algorithms, such as the random feature model and two-layer and residual neural network models, can all be viewed as special discretizations of such continuous problems. We will also present a framework that is suited for analyzing ML models and algorithms in high dimension, and present results that are free of CoD. Finally, we will discuss the fundamental reasons for the success of modern ML, as well as the subtleties and mysteries that still remain to be understood.
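As a concrete illustration of one model mentioned in the abstract, here is a minimal sketch of the random feature model: nonlinear features with randomly sampled (and then fixed) inner weights, with only the outer linear coefficients trained by least squares. All dimensions, the ReLU activation, and the target function below are illustrative choices, not taken from the talk.

```python
# Minimal sketch of a random feature model (illustrative parameters only).
import numpy as np

rng = np.random.default_rng(0)
d, n, m = 2, 200, 300  # input dimension, training samples, random features

# Hypothetical target function to approximate
def f(x):
    return np.sin(x[:, 0]) + np.cos(x[:, 1])

X = rng.uniform(-1.0, 1.0, size=(n, d))
y = f(X)

# Random features phi_j(x) = ReLU(w_j . x + b_j); w_j, b_j are sampled once
# at random and never trained.
W = rng.normal(size=(d, m))
b = rng.normal(size=m)
Phi = np.maximum(X @ W + b, 0.0)

# Only the outer coefficients a_j are fitted: a linear least-squares problem.
a, *_ = np.linalg.lstsq(Phi, y, rcond=None)
mse = np.mean((Phi @ a - y) ** 2)
```

Because the inner weights are frozen, training reduces to a convex problem in the coefficients `a`, which is one sense in which the model is a simple discretization of a continuous (integral representation) problem.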

Mon, 22 Jun 2020

16:00 - 17:00

Controlled and constrained martingale problems

Thomas Kurtz
(University of Wisconsin)
Abstract

Most of the basic results on martingale problems extend to the setting in which the generator depends on a control.  The “control” could represent a random environment, or the generator could specify a classical stochastic control problem.  The equivalence between the martingale problem and forward equation (obtained by taking expectations of the martingales) provides the tools for extending linear programming methods introduced by Manne in the context of controlled finite Markov chains to general Markov stochastic control problems.  The controlled martingale problem can also be applied to the study of constrained Markov processes (e.g., reflecting diffusions), the boundary process being treated as a control.  The talk includes joint work with Richard Stockbridge and with Cristina Costantini. 

Wed, 17 Jun 2020
10:00
Virtual

TBA

Jonathan Fruchter
(University of Oxford)
Wed, 10 Jun 2020
10:00
Virtual

TBA

Mehdi Yazdi
(University of Oxford)
Wed, 20 May 2020
16:00
Virtual

TBA

Alice Kerr
(University of Oxford)