Past Colloquia

12 February 2016
16:00
Isabelle Gallagher
Abstract

The question of deriving Fluid Mechanics equations from deterministic
systems of interacting particles obeying Newton's laws, in the limit
when the number of particles goes to infinity, is a longstanding open
problem suggested by Hilbert in his 6th problem. In this talk we shall
present a few attempts in this program, by explaining how to derive some
linear models such as the heat, acoustic, and Stokes-Fourier equations.
This corresponds to joint work with Thierry Bodineau and Laure
Saint-Raymond.
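
Schematically, the simplest of these results states that the position density of a tagged hard sphere, in the low-density (Boltzmann-Grad) limit and under a diffusive rescaling of space and time, converges to a solution of the heat equation
$$\partial_t \rho = \kappa \, \Delta_x \rho ,$$
with a diffusivity $\kappa$ determined by the linearized collision operator; the precise scaling regimes, which are the heart of the matter, are suppressed in this schematic statement.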

29 January 2016
16:00
Mark Newman
Abstract

Most networks and graphs encountered in empirical studies, including internet and web graphs, social networks, and biological and ecological networks, are very sparse.  Standard spectral and linear algebra methods can fail badly when applied to such networks and a fundamentally different approach is needed.  Message passing methods, such as belief propagation, offer a promising solution for these problems.  In this talk I will introduce some simple models of sparse networks and illustrate how message passing can form the basis for a wide range of calculations of their structure.  I will also show how message passing can be applied to real-world data to calculate fundamental properties such as percolation thresholds, graph spectra, and community structure, and how the fixed-point structure of the message passing equations has a deep connection with structural phase transitions in networks.
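
As a concrete, simplified illustration of the flavour of such calculations, the sketch below implements the standard message-passing equations for bond percolation on a locally tree-like graph; the graph model, parameter values, and function names are assumptions made for the example, not code from the talk.

```python
import random
from collections import defaultdict

def percolation_message_passing(edges, p, n_iter=200, tol=1e-10):
    """Estimate, for each node, the probability of belonging to the giant
    cluster under bond percolation with occupation probability p, using
    message passing on a locally tree-like sparse graph."""
    nbrs = defaultdict(list)
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)

    # u[(i, j)] = probability that node i is NOT connected to the giant
    # cluster through its edge to j
    u = {(i, j): 0.5 for i in nbrs for j in nbrs[i]}

    for _ in range(n_iter):
        max_change = 0.0
        new_u = {}
        for (i, j) in u:
            prod = 1.0
            for k in nbrs[j]:
                if k != i:
                    prod *= u[(j, k)]
            # edge unoccupied (1 - p), or occupied but j not connected onward
            val = 1.0 - p + p * prod
            max_change = max(max_change, abs(val - u[(i, j)]))
            new_u[(i, j)] = val
        u = new_u
        if max_change < tol:
            break

    # probability that node i belongs to the giant cluster
    S = {}
    for i in nbrs:
        prod = 1.0
        for j in nbrs[i]:
            prod *= u[(i, j)]
        S[i] = 1.0 - prod
    return S

# usage on a small Erdos-Renyi-style random graph (illustration only)
random.seed(0)
n, mean_degree = 1000, 3.0
edges = [(i, j) for i in range(n) for j in range(i + 1, n)
         if random.random() < mean_degree / (n - 1)]
S = percolation_message_passing(edges, p=0.6)
print("estimated giant-cluster fraction:", sum(S.values()) / n)
```

Above the percolation threshold the iteration settles on a non-trivial fixed point and the estimated giant-cluster fraction is positive; below it, the messages converge to one, which is the fixed-point picture of the structural phase transition mentioned above.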

20 November 2015
16:00
Felix Otto
Abstract
Heterogeneous media, like a sediment, are often naturally described in statistical terms. How can one extract their effective behaviour on large scales, like the permeability in Darcy's law, from these statistical specifications? A practitioner's numerical approach is to sample the medium according to the specifications and to determine the permeability in the Cartesian directions by imposing simple boundary conditions. What error is made, in terms of the size of this "representative volume element"? Our interest in what is called "stochastic homogenization" grew out of this error analysis.
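
As a toy illustration of the representative volume element idea (not taken from the talk), consider the one-dimensional equation $-(a(x)u')' = 0$ with a random, piecewise-constant coefficient: there the effective coefficient is simply the harmonic mean of $a$, and the sketch below measures how fast the RVE estimate approaches it as the sample grows. The coefficient distribution and function names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_coefficients(n_cells):
    """Draw an i.i.d. coefficient field a(x), piecewise constant on unit cells.
    (i.i.d. is the simplest stationary ensemble; a real sediment would be
    correlated, but the RVE idea is the same.)"""
    return rng.uniform(1.0, 10.0, size=n_cells)

def rve_estimate(a):
    """Effective coefficient of -(a u')' = 0 on the sample with affine
    (Dirichlet) boundary conditions: in one dimension this is exactly the
    harmonic average of the sampled coefficients."""
    return 1.0 / np.mean(1.0 / a)

# exact homogenized coefficient for Uniform(1, 10):
# a_hom = 1 / E[1/a] = (10 - 1) / log(10 / 1)
a_hom = 9.0 / np.log(10.0)

for L in [10, 100, 1000, 10000]:
    estimates = [rve_estimate(sample_coefficients(L)) for _ in range(200)]
    err = np.sqrt(np.mean((np.array(estimates) - a_hom) ** 2))
    print(f"RVE size L = {L:6d}   rms error = {err:.4f}")
```

For independent cells the root-mean-square error decays like $L^{-1/2}$; quantifying this kind of decay in higher dimensions, where no closed formula for the effective coefficient is available, is the subject of the error analysis described above.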

 

In the course of developing such an error analysis, connections with the classical regularity theory for elliptic operators have emerged. It turns out that the randomness of the coefficient field (which can be seen as a Riemannian metric), in conjunction with statistical homogeneity, generates large-scale regularity of harmonic functions (with respect to the corresponding Laplace-Beltrami operator). This is embodied by a hierarchy of Liouville properties: almost surely, the space of harmonic functions of a given but arbitrary growth rate has the same dimension as in the flat (i.e. Euclidean) case.

Classical examples show that, from a deterministic point of view, the Liouville property fails already at small growth rates: there are (smooth) coefficient fields, corresponding to the geometry of a cone at infinity, that allow for sublinearly growing but non-constant harmonic functions.
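
For orientation, a schematic form of the lowest-order statement in this hierarchy (with all assumptions on the random ensemble suppressed) is that, almost surely, every $a$-harmonic function of sublinear growth is constant:
$$-\nabla\cdot(a\nabla u)=0 \ \text{in } \mathbb{R}^d \quad \text{and} \quad \lim_{R\to\infty} \frac{1}{R}\Big(\frac{1}{|B_R|}\int_{B_R}\Big|u-\frac{1}{|B_R|}\int_{B_R}u\Big|^2\Big)^{1/2} = 0 \quad \Longrightarrow \quad u \ \text{is constant.}$$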
 
 
 
19 June 2015
16:00
Gunnar Carlsson
Abstract

There has been a great deal of attention paid to "Big Data" over the last few years. However, as often as not, the problem with the analysis of data is not so much its size as its complexity. Even very small data sets can exhibit substantial complexity. There is therefore a need for methods of representing complex data sets, beyond the usual linear or even polynomial models. The mathematical notion of shape, encoded in a metric, provides a very useful way to represent complex data sets. Topology, on the other hand, is the mathematical subdiscipline which concerns itself with studying shape, in all dimensions. In recent years, methods from topology have been adapted to the study of data sets, i.e. finite metric spaces. In this talk, we will discuss what has been done in this direction and what the future might hold, with numerous examples.
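
As a small, self-contained taste of these methods (not code from the talk), the sketch below computes zero-dimensional persistent homology of a finite metric space, which simply tracks how clusters merge as the scale grows; the point set and function names are illustrative assumptions.

```python
import math
from itertools import combinations

def zeroth_persistence(points):
    """Compute 0-dimensional persistence pairs (birth, death) of the
    Vietoris-Rips filtration of a finite metric space: every component is
    born at scale 0 and dies when merged, so this is essentially Kruskal's
    algorithm on the distance graph."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in combinations(range(n), 2)
    )

    pairs = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            pairs.append((0.0, d))   # a component born at 0 dies at scale d
    pairs.append((0.0, math.inf))    # one component never dies
    return pairs

# usage: two well-separated clusters give one long-lived finite class
pts = [(0, 0), (0.1, 0.2), (0.2, 0.1), (5, 5), (5.1, 5.2)]
for birth, death in zeroth_persistence(pts):
    print(f"H0 class: born {birth}, dies {death:.2f}")
```

Higher-dimensional persistence, which detects loops and voids rather than clusters, follows the same filtration idea but requires boundary-matrix reduction instead of a union-find.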

22 May 2015
16:30
Andre Henriques
Abstract

I will review Bott's classical periodicity result about topological K-theory (with period 2 in the case of complex K-theory, and period 8 in the case of real K-theory), and provide an easy sketch of proof, based on the algebraic periodicity of Clifford algebras. I will then introduce the "higher real K-theory" of Hopkins and Miller, also known as TMF. I'll discuss its periodicity (with period 576), and present a conjecture about a corresponding algebraic periodicity of "higher Clifford algebras". Finally, applications to physics will be discussed.
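
For reference (not part of the abstract), the algebraic periodicity of Clifford algebras alluded to can be stated as
$$\mathrm{Cl}_{n+2}(\mathbb{C}) \cong \mathrm{Cl}_{n}(\mathbb{C}) \otimes M_2(\mathbb{C}), \qquad \mathrm{Cl}_{n+8}(\mathbb{R}) \cong \mathrm{Cl}_{n}(\mathbb{R}) \otimes M_{16}(\mathbb{R}),$$
so that, up to Morita equivalence, the complex Clifford algebras repeat with period 2 and the real ones with period 8, mirroring the periods of complex and real K-theory.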

1 May 2015
16:30
Martin Hairer
Abstract

Some physical and mathematical theories have the unfortunate feature that if one takes them at face value, many quantities of interest appear to be infinite! Various techniques, usually going under the common name of “renormalisation”, have been developed over the years to address this, allowing mathematicians and physicists to tame these infinities. We will dip our toes into some of the mathematical aspects of these techniques and we will see how they have recently been used to make precise analytical statements about the solutions of some equations whose meaning was not even clear until recently.
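
One concrete instance of the procedure (chosen here as an illustration, not necessarily the example treated in the talk) is the KPZ equation: one regularises the noise $\xi$ at scale $\varepsilon$, solves
$$\partial_t h_\varepsilon = \partial_x^2 h_\varepsilon + (\partial_x h_\varepsilon)^2 - C_\varepsilon + \xi_\varepsilon ,$$
and chooses constants $C_\varepsilon \to \infty$ so that $h_\varepsilon$ converges to a non-trivial limit as $\varepsilon \to 0$; the subtracted infinity is precisely the renormalisation.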

13 March 2015
16:30
Jorge Nocedal
Abstract

Optimization methods for large-scale machine learning must confront a number of challenges that are unique to this discipline. In addition to being scalable, parallelizable and capable of handling nonlinearity (even non-convexity), they must also be good learning algorithms. These challenges have spurred a great amount of research that I will review, paying particular attention to variance reduction methods. I will propose a new algorithm of this kind and illustrate its performance on text and image classification problems.
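
To fix ideas, here is a minimal sketch of one standard variance-reduction scheme, SVRG; it only illustrates the general idea of anchoring stochastic gradients to an occasional full gradient, and it is not the new algorithm proposed in the talk. The test problem, step size, and function names are assumptions made for the example.

```python
import numpy as np

def svrg(grad_i, w0, n, lr=0.05, outer_iters=20, inner_iters=None):
    """Stochastic variance-reduced gradient (SVRG) for minimizing
    (1/n) * sum_i f_i(w).  grad_i(w, i) returns the gradient of f_i at w."""
    rng = np.random.default_rng(0)
    inner_iters = inner_iters or 2 * n
    w_snapshot = w0.copy()
    for _ in range(outer_iters):
        # full gradient at the snapshot (the "anchor" that reduces variance)
        full_grad = np.mean([grad_i(w_snapshot, i) for i in range(n)], axis=0)
        w = w_snapshot.copy()
        for _ in range(inner_iters):
            i = rng.integers(n)
            # the control variate grad_i(snapshot) - full_grad has mean zero,
            # so the step direction is unbiased but has smaller variance
            g = grad_i(w, i) - grad_i(w_snapshot, i) + full_grad
            w -= lr * g
        w_snapshot = w
    return w_snapshot

# usage: least-squares regression as a stand-in for a (convex) learning problem
rng = np.random.default_rng(1)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

grad_i = lambda w, i: (X[i] @ w - y[i]) * X[i]   # gradient of 0.5*(x_i.w - y_i)^2
w_hat = svrg(grad_i, np.zeros(d), n)
print("parameter error:", np.linalg.norm(w_hat - w_true))
```

The key point is that the correction term has zero mean, so the step direction stays unbiased while its variance shrinks as the iterates approach the snapshot, which is what allows a constant step size.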

6 March 2015
16:30
Abstract

Based upon our joint work with M. Marcolli, I will introduce some algebraic geometric models in cosmology related to the "boundaries" of space-time: the Big Bang, the Mixmaster Universe, and Roger Penrose's crossovers between aeons. We suggest modelling the kinematics of the Big Bang using the algebraic geometric (or analytic) blow-up of a point $x$. This creates a boundary which consists of the projective space of tangent directions to $x$ and possibly of the light cone of $x$. We argue that time on the boundary undergoes the Wick rotation and becomes purely imaginary. The Mixmaster (Bianchi IX) model of the early history of the universe is neatly explained in this picture by postulating that the reverse Wick rotation follows a hyperbolic geodesic connecting the imaginary time axis to the real one. Roger Penrose's idea of seeing the Big Bang as a sign of crossover from "the end of the previous aeon" of the expanding and cooling Universe to the "beginning of the next aeon" is interpreted as an identification of a natural boundary of Minkowski space at infinity with the Big Bang boundary.
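
For orientation (not part of the abstract), the blow-up of a point underlying this model can be written down explicitly in the affine plane:
$$\mathrm{Bl}_0(\mathbb{A}^2) = \{\, ((x,y),[u:v]) \in \mathbb{A}^2 \times \mathbb{P}^1 \;:\; x v = y u \,\},$$
where the exceptional fibre over the origin is a copy of $\mathbb{P}^1$ parametrizing the tangent directions at the blown-up point, i.e. the projective space of tangent directions mentioned above.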

21 November 2014
16:30
Abstract

Quantum Mechanics presents a radically different perspective on physical reality compared with the world of classical physics. In particular, results such as the Bell and Kochen-Specker theorems highlight the essentially non-local and contextual nature of quantum mechanics. The rapidly developing field of quantum information seeks to exploit these non-classical features of quantum physics to transcend classical bounds on information processing tasks.

In this talk, we shall explore the rich mathematical structures underlying these results. The study of non-locality and contextuality can be expressed in a unified and generalised form in the language of sheaves or bundles, in terms of obstructions to global sections. These obstructions can, in many cases, be witnessed by cohomology invariants. There are also strong connections with logic. For example, Bell inequalities, one of the major tools of quantum information and foundations, arise systematically from logical consistency conditions.

These general mathematical characterisations of non-locality and contextuality also allow precise connections to be made with a number of seemingly unrelated topics, in classical computation, logic, and natural language semantics. By varying the semiring in which distributions are valued, the same structures and results can be recognised in databases and constraint satisfaction as in probability models arising from quantum mechanics. A rich field of contextual semantics, applicable to many of the situations where the pervasive phenomenon of contextuality arises, promises to emerge.
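
As a small illustration of how a Bell inequality is evaluated on an empirical model (an example constructed here, not taken from the talk), the sketch below computes the CHSH expression for the Popescu-Rohrlich box, a standard example of a maximally non-local no-signalling model.

```python
from itertools import product

# An empirical model: for each pair of measurement settings (a, b) in {0,1}^2,
# a probability distribution over joint outcomes (x, y) in {0,1}^2.
# This one is the Popescu-Rohrlich (PR) box: outcomes satisfy x XOR y = a AND b.
pr_box = {
    (a, b): {(x, y): (0.5 if (x ^ y) == (a & b) else 0.0)
             for x, y in product((0, 1), repeat=2)}
    for a, b in product((0, 1), repeat=2)
}

def correlator(model, a, b):
    """Expectation <A_a B_b> with outcomes 0/1 mapped to +1/-1."""
    return sum(p * (1 if x == y else -1)
               for (x, y), p in model[(a, b)].items())

def chsh(model):
    """CHSH expression: bounded by 2 for local models, 2*sqrt(2) for quantum
    models, and 4 for the PR box."""
    return (correlator(model, 0, 0) + correlator(model, 0, 1)
            + correlator(model, 1, 0) - correlator(model, 1, 1))

print("CHSH value of the PR box:", chsh(pr_box))   # prints 4.0
```

Any local (non-contextual) model satisfies $|\mathrm{CHSH}| \le 2$, quantum models reach at most $2\sqrt{2}$, and the PR box attains the algebraic maximum of 4; valuing the distributions in a different semiring, as described above, yields the possibilistic and relational variants of the same analysis.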
