11:00
An introduction to the weighted fundamental lemma II
Abstract
We shall explain what the weighted fundamental lemma is and how it is related to the truncated Hitchin fibration.
Ordinary homology is a geometrically defined invariant of spaces: the 0-th homology group counts the number of components; the n-th homology group counts n-cycles, which correspond to an intuitive notion of 'n-dimensional holes' in a space. K-theory, or more specifically the 0-th K-theory group, is defined in terms of vector bundles, and so also has an immediate relationship to geometry. By contrast, the n-th K-theory group is typically defined homotopy-theoretically using the black box of Bott periodicity.
I will describe a more geometric perspective on K-theory, using Z/2-graded vector bundles and bundles of modules for Clifford algebras. Along the way I will explain Clifford algebras, 2-categories, and Morita equivalence, explicitly check the purely algebraic 8-fold periodicity of the Clifford algebras, and discuss how and why this periodicity implies Bott periodicity.
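For orientation, the algebraic statement behind this 8-fold periodicity is the standard isomorphism of real Clifford algebras
$$Cl_{n+8} \cong Cl_{n} \otimes M_{16}(\mathbb{R}),$$
and since the matrix algebra $M_{16}(\mathbb{R})$ is Morita equivalent to $\mathbb{R}$, the algebras $Cl_{n+8}$ and $Cl_{n}$ are Morita equivalent; it is this period-8 pattern that mirrors the 8-fold periodicity of real K-theory.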
The talk will not presume any prior knowledge of K-theory, Clifford algebras, Bott periodicity, or the like.
Based on joint work with Arthur Bartels and Andre Henriques.
Starting from a definition of the cohomology of a group, we will define the bounded cohomology of a group. We will then show how quasi-homomorphisms lead to cocycles in the second bounded cohomology group, and use this to look at the second bounded cohomology of some of our favourite groups. If time permits we will end with some applications.
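For reference, the construction in question is presumably the standard one: a quasi-homomorphism is a map $\phi\colon G \to \mathbb{R}$ whose defect
$$D(\phi) = \sup_{g,h \in G} \bigl|\phi(g) + \phi(h) - \phi(gh)\bigr|$$
is finite; the coboundary $\omega(g,h) = \phi(g) + \phi(h) - \phi(gh)$ is then a 2-cocycle bounded by $D(\phi)$ and so defines a class in $H^2_b(G;\mathbb{R})$, which vanishes exactly when $\phi$ lies at bounded distance from a genuine homomorphism.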
Consider a configuration of points in $d$-dimensional Euclidean space
together with a set of constraints
which fix the direction or the distance between some pairs of points.
Basic questions are whether the constraints imply that the configuration
is unique or locally unique up to congruence, and whether it is bounded. I
will describe some solutions
and partial solutions to these questions.
Consider the space $M$ of parabolas $y = ax^2 + bx + c$, with $(a, b, c)$ as coordinates on $M$. Two parabolas generically intersect at two (possibly complex) points, and we can define a conformal structure on $M$ by declaring two points to be null separated iff the corresponding parabolas are tangent. A simple calculation of the discriminant shows that this conformal structure is flat.
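The calculation is presumably the following: two parabolas $y = a_1x^2 + b_1x + c_1$ and $y = a_2x^2 + b_2x + c_2$ intersect where $(a_1-a_2)x^2 + (b_1-b_2)x + (c_1-c_2) = 0$, so they are tangent precisely when the discriminant of this quadratic vanishes,
$$(b_1-b_2)^2 - 4(a_1-a_2)(c_1-c_2) = 0.$$
The null cone at each point of $M$ is thus cut out by the constant-coefficient quadratic form $b^2 - 4ac$ in the displacement $(a_1-a_2,\, b_1-b_2,\, c_1-c_2)$, a form of signature $(2,1)$, so the conformal structure is translation invariant and hence flat.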
In this talk (based on joint works with Godlinski and Sokolov) I shall show how replacing parabolas by rational plane curves of higher degree allows one to construct curved conformal structures in any odd dimension. In dimension seven one can use this "twistor" construction to find $G_2$ structures in a conformal class.
We shall explain what the weighted fundamental lemma is and how it is related to the truncated Hitchin fibration.
In this talk we describe some recent work on shock
reflection problems for the potential flow equation. We will
start with a discussion of shock reflection phenomena. Then we
will describe the results on existence, structure and
regularity of global solutions to regular shock reflection. The
approach is to reduce the shock reflection problem to a free
boundary problem for a nonlinear elliptic equation, with
ellipticity degenerate near a part of the boundary (the sonic
arc). We will discuss techniques to handle such free boundary
problems and degenerate elliptic equations. This talk is based
on joint works with Gui-Qiang Chen, and with Myoungjean Bae.
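For reference (one standard formulation, not spelled out in the abstract): the potential flow equation is the system
$$\partial_t \rho + \operatorname{div}(\rho \nabla \Phi) = 0, \qquad \partial_t \Phi + \tfrac{1}{2}|\nabla \Phi|^2 + h(\rho) = \text{const},$$
for the velocity potential $\Phi$ and density $\rho$, where $h$ is the enthalpy (for a polytropic gas one may take $h(\rho) = \frac{\rho^{\gamma-1}-1}{\gamma-1}$). Eliminating $\rho$ gives a single second-order quasilinear equation for $\Phi$ which, in the self-similar variables natural for reflection problems, is elliptic in the subsonic region and degenerates at the sonic arc; this is the degenerate ellipticity referred to above.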
Whether as the sudoku puzzles of popular culture or as
restricted coloring problems on graphs or hypergraphs, completing partial
Latin squares and cubes presents a framework for a variety of intriguing
problems. In this talk we will present several recent results on
completing partial Latin squares and cubes.
In addition to existence, the excess-demand approach allows us to establish uniqueness and provide efficient computational algorithms for various complete- and incomplete-market stochastic financial equilibria.
Particular attention will be paid to the case in which the agents exhibit constant absolute risk aversion. An overview of recent results (including those jointly obtained with M. Anthropelos and with Y. Zhao) will be given.
This talk will discuss the 2007 paper of Ricci, Tseytlin & Wolf.
'Compressive sampling' is a topic of current interest. It relies on data being sparse in some domain, which allows what is apparently 'sub-Nyquist' sampling so that the quantities of data which must be handled become more closely related to the information rate. This principle would appear to have (at least) three applications for radar and electronic warfare: \\
The most modest application is to reduce the amount of data which we must handle: radar and electronic warfare receivers generate vast amounts of data (up to 1 Gbit/s or even 10 Gbit/s). It is desirable to be able to store this data for future analysis, and it is also becoming increasingly important to be able to share it between different sensors, which, prima facie, requires vast communication bandwidths; it would therefore be valuable to find ways to handle this data more efficiently. \\
The second advantage is that if suitable data domains can be identified, it may also be possible to pre-process the data before the analogue-to-digital converters in the receivers, to reduce the demands on these critical components. \\
The most ambitious use of compressive sensing would be to find ways of modifying the radar waveforms, and the electronic warfare receiver sampling strategies, to change the domain in which the information is represented so as to reduce the data rates at the receiver 'front ends', i.e. to make the data at the front end better match the information we really want to acquire.\\
The aim of the presentation will be to describe the issues with which we are faced, and to discuss how compressive sampling might be able to help. A particular issue which will be raised is how we might find domains in which the data is sparse.
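As an illustrative sketch only (none of the numbers or the solver below come from the talk), the following Python fragment shows the basic compressive-sampling idea: a signal that is sparse in the frequency domain is observed at a small random subset of time instants, far fewer than the Nyquist count, and is then recovered by a greedy sparse solver (orthogonal matching pursuit) applied to the corresponding partial Fourier dictionary.

import numpy as np

rng = np.random.default_rng(0)

# Signal model: length-N signal that is K-sparse in the DFT domain.
N, K, M = 512, 5, 64                      # ambient length, sparsity, number of samples (M << N)
support = rng.choice(N, size=K, replace=False)
coeffs = np.zeros(N, dtype=complex)
coeffs[support] = rng.standard_normal(K) + 1j * rng.standard_normal(K)

# Columns of F are the (unitarily normalised) Fourier atoms; the signal is F @ coeffs.
F = np.fft.ifft(np.eye(N), axis=0) * np.sqrt(N)
signal = F @ coeffs

# 'Sub-Nyquist' acquisition: keep only M random time samples.
sample_idx = np.sort(rng.choice(N, size=M, replace=False))
y = signal[sample_idx]
A = F[sample_idx, :]                      # sensing matrix: sampled rows of the dictionary

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick k atoms most correlated with the residual."""
    residual, chosen = y.copy(), []
    for _ in range(k):
        chosen.append(int(np.argmax(np.abs(A.conj().T @ residual))))
        x_sub, *_ = np.linalg.lstsq(A[:, chosen], y, rcond=None)
        residual = y - A[:, chosen] @ x_sub
    x = np.zeros(A.shape[1], dtype=complex)
    x[chosen] = x_sub
    return x

x_hat = omp(A, y, K)
print("true support:     ", np.sort(support))
print("recovered support:", np.sort(np.flatnonzero(np.abs(x_hat) > 1e-8)))
print("max coefficient error:", np.max(np.abs(x_hat - coeffs)))

The point of the sketch is only that 64 time samples suffice to recover a 512-point signal exactly because it is 5-sparse in frequency; finding domains in which real radar and electronic warfare data are comparably sparse is exactly the open issue raised above.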
Standard quantum logic, as initiated by Birkhoff and von Neumann, suffers from severe problems which relate quite directly to interpretational issues in the foundations of quantum theory. In this talk, I will present some aspects of the so-called topos approach to quantum theory, as initiated by Isham and Butterfield, which aims at a mathematical reformulation of quantum theory and provides a new, well-behaved form of quantum logic that is based upon the internal logic of a certain (pre)sheaf topos.
Bloch Floquet waves are considered in structured media. Such waves are dispersive and the dispersion diagrams contain stop bands. For an example of a harmonic lattice, we discuss dynamic band gap Green’s functions characterised by exponential localisation. This is followed by simple models of exponentially localised defect modes. Asymptotic models involving uniform asymptotic approximations of physical fields in structured media are compared with homogenisation approximations.
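A textbook instance of such a stop band, included here only for orientation: for an infinite diatomic chain with alternating masses $m_1 < m_2$ in a unit cell of length $a$, coupled by springs of stiffness $c$, the Bloch Floquet dispersion relation is
$$\omega^2 = c\left(\frac{1}{m_1} + \frac{1}{m_2}\right) \pm c\sqrt{\left(\frac{1}{m_1} + \frac{1}{m_2}\right)^2 - \frac{4\sin^2(ka/2)}{m_1 m_2}},$$
and no real Bloch wave exists for frequencies in the gap $\sqrt{2c/m_2} < \omega < \sqrt{2c/m_1}$; forcing at a frequency inside this stop band produces fields that decay exponentially with distance, which is the exponential localisation of the band gap Green's functions and defect modes described above.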
I will discuss what is known about the cohomology of several moduli spaces coming from algebraic and differential geometry. These are: moduli spaces of non-singular curves (= Riemann surfaces) $M_g$, moduli spaces of nodal curves $\overline{M}_g$, moduli spaces of holomorphic line bundles on curves $Hol_g^k \to M_g$, and the universal Picard varieties $Pic^k_g \to M_g$. I will construct characteristic classes on these spaces, talk about their homological stability, and try to explain why the constructed classes are the only stable ones. If there is time I will also talk about the Picard groups of these moduli spaces.
Much of this work is due to other people, but some is joint with J. Ebert.