Mathematrix: Beating the Winter Blues
Abstract
We will be joined by Professor Kobi Kremnizer, who is a trained mental health first-aider, to discuss ways to protect your mental health this season.
In view of Takesaki-Takai duality, we can go back and forth between C*-dynamical systems of an abelian group and ones of its Pontryagin dual by taking crossed products. In this talk, I present a similar duality between actions on C*-algebras of two constructions of locally compact quantum groups: one is the bicrossed product due to Vaes-Vainerman, and the other is the double crossed product due to Baaj-Vaes. I will explain the situation by illustrating the example coming from groups. If time permits, I will also discuss its consequences in the case of quantum doubles.
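For orientation, the classical Takesaki-Takai duality referred to above can be written as follows (standard background, with $G$ a locally compact abelian group acting on a C*-algebra $A$ by $\alpha$; not part of the abstract itself):

$$(A \rtimes_{\alpha} G) \rtimes_{\widehat{\alpha}} \widehat{G} \;\cong\; A \otimes \mathcal{K}(L^{2}(G)),$$

so taking the crossed product by the dual action recovers $A$ up to tensoring with the compacts.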
A metric space is said to be (globally) median if any three points have a unique “median”, that is, a point which lies between each pair of points from the triple. Such spaces arise naturally in many different contexts. The property of being locally median can be viewed as a kind of non-positive curvature condition. We show that a complete uniformly locally median space is globally median if and only if it is simply connected. This is an analogue of the well-known Cartan-Hadamard Theorem for non-positively curved manifolds, or more generally CAT(0) spaces. However, it leaves open a number of interesting questions.
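In symbols (a standard formulation, included only for reference): a point $m$ is a median of $x, y, z$ in a metric space $(X,d)$ if

$$d(x,y) = d(x,m) + d(m,y), \qquad d(x,z) = d(x,m) + d(m,z), \qquad d(y,z) = d(y,m) + d(m,z),$$

and the space is median when every triple of points admits exactly one such $m$.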
We discuss the local Langlands correspondence in connection with the Bernstein center and the stable Bernstein center. We also give an example of an element of the stable Bernstein center, namely a stable essentially compact invariant distribution.
We consider bond percolation on the hypercube in the supercritical regime. We derive vertex-expansion properties of the giant component. As a consequence, we obtain upper bounds on the diameter of the giant component and on the mixing time of the lazy random walk on the giant component. This talk is based on joint work with Joshua Erde and Michael Krivelevich.
I will present a model for an optimal portfolio allocation and consumption problem for a portfolio composed of a risk-free bond and two illiquid assets. Two forms of illiquidity are presented, both modelled using Lévy processes. The goal of the investor is to maximise a certain utility function, and the optimal utility is found as the solution of a nonlinear PIDE of Hamilton-Jacobi-Bellman type.
The Dedekind zeta function generalises the Riemann zeta
function to number fields other than the rationals. The analytic class number
formula says that the leading term of the Dedekind zeta function is a
product of invariants of the number field. I will say some things
about the class number formula, about L-functions, and about Stark's
conjecture which generalises the class number formula.
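For reference, in the usual normalisation the formula in question reads (standard background, not part of the abstract):

$$\lim_{s \to 1} (s-1)\,\zeta_K(s) \;=\; \frac{2^{r_1} (2\pi)^{r_2}\, h_K\, \mathrm{Reg}_K}{w_K \sqrt{|d_K|}},$$

where $r_1$ and $r_2$ are the numbers of real and complex places of $K$, $h_K$ is the class number, $\mathrm{Reg}_K$ the regulator, $w_K$ the number of roots of unity in $K$, and $d_K$ the discriminant.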
The signature is a non-commutative exponential that appeared in the foundational work of K.-T. Chen in the 1950s. It is also a fundamental object in the theory of rough paths (Lyons, 1998). More recently, it has been proposed, and used, as part of a practical methodology to give a way of summarising multimodal, possibly irregularly sampled, time-ordered data in a way that is insensitive to its parameterisation. A key property underpinning this approach is the ability of linear functionals of the signature to approximate arbitrarily well any continuous, compactly supported function on (unparameterised) path space. We present some new results on the properties of a selection of topologies on the space of unparameterised paths. We discuss various applications in this context.
This is based on joint work with William Turner.
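As a small numerical aside (an illustration of the object itself rather than of the results above), the depth-2 signature of a piecewise-linear path can be computed directly from its increments via Chen's identity; the sketch below is plain numpy with made-up example data.

import numpy as np

def signature_level_2(path):
    """Depth-2 signature of a piecewise-linear path given as an (n, d) array
    of sample points. Returns (S1, S2): the level-1 term (total increment)
    and the level-2 term (d x d matrix of iterated integrals), built up
    segment by segment using Chen's identity."""
    path = np.asarray(path, dtype=float)
    d = path.shape[1]
    S1, S2 = np.zeros(d), np.zeros((d, d))
    for delta in np.diff(path, axis=0):   # increment of each linear segment
        # The signature of a straight segment is exp(delta) in the tensor
        # algebra, whose level-2 part is outer(delta, delta) / 2; Chen's
        # identity combines it with the running signature.
        S2 += np.outer(S1, delta) + 0.5 * np.outer(delta, delta)
        S1 += delta
    return S1, S2

# Example: an L-shaped path in the plane.
S1, S2 = signature_level_2([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])
print(S1)                  # total increment, here [1. 1.]
print(0.5 * (S2 - S2.T))   # antisymmetric part: the Levy area term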
The chromatic polynomial \chi(Q) can be defined for any graph, such that for integer Q it counts the number of proper Q-colourings. It has many remarkable properties, and I describe several that are derived easily by using fusion categories, familiar from topological quantum field theory. In particular, I define the chromatic algebra, a planar algebra whose evaluation gives the chromatic polynomial. Linear identities of the chromatic polynomial at certain values of Q then follow from the Jones-Wenzl projector of the associated category. An unusual non-linear one, called Tutte's golden identity, relates \chi(\phi+2) for planar triangulations to the square of \chi(\phi+1), where \phi is the golden mean. Tutte's original proof is purely combinatorial. I will give here an elementary proof by manipulations of a topological invariant related to the Jones polynomial. Time permitting, I will also mention analogous identities for graphs on more general surfaces. Based on work with Slava Krushkal.
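For reference, the golden identity mentioned above is usually stated as follows (for a planar triangulation $T$ with $n$ vertices; quoted from the standard literature rather than from the talk itself):

$$\chi_T(\phi+2) \;=\; (\phi+2)\,\phi^{3n-10}\,\bigl(\chi_T(\phi+1)\bigr)^{2}, \qquad \phi = \tfrac{1+\sqrt{5}}{2}.$$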
The moduli space M of stable bundles on a Riemann surface possesses a natural family of holomorphic trivector fields. The talk will introduce these objects with examples and then use them to gain information about the Hochschild cohomology of M.
3d N=4 SCFTs contain a 1d topological sector of twisted linear
combinations of half-BPS local operators inserted along a line. I will
explain how to construct analogous 1d topological sectors on the
three-sphere and in particular show how these sectors are preserved under
the squashing of the sphere. Furthermore, I will show how to introduce FI
parameters and real masses in the 3d N=4 theory and demonstrate how such
deformations can be translated into universal deformations of the
corresponding 1d theory. Finally, I will discuss a series of applications
and future prospects.
K-homology is the dual theory to K-theory for C*-algebras. I will show how under appropriate quasi-diagonality and countability assumptions K-homology (more generally, KK-theory) can be realized by completely positive and contractive, and approximately multiplicative, maps to matrix algebras modulo an appropriate equivalence relation. I’ll briefly explain some connections to manifold topology and existence / uniqueness theorems in C*-algebra classification theory (due to Dadarlat and Eilers).
Some of this is based on joint work with Guoliang Yu, and some is work in progress.
While scattering problems are posed on unbounded domains, volumetric discretizations typically require truncating the domain at a finite distance, closing the system with some sort of boundary condition. These conditions typically suffer from some deficiency, such as perturbing the boundary value problem to be solved or changing the character of the operator so that the discrete system is difficult to solve with iterative methods.
We introduce a new technique for the Helmholtz problem, based on using a Green's formula representation of the solution at the artificial boundary. Finite element discretization of the resulting system gives optimal convergence estimates. The resulting algebraic system can be solved effectively with a matrix-free GMRES implementation, preconditioned with the local part of the operator. Extensions to the Morse-Ingard problem, a coupled system of pressure/temperature equations arising in modeling trace gas sensors, will also be given.
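To make the solver pattern concrete, here is a minimal, self-contained sketch of matrix-free GMRES preconditioned by the "local part" of an operator, using SciPy with stand-in operators (the actual finite element matrices and the nonlocal boundary term from the talk are not reproduced):

import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

n = 200
rng = np.random.default_rng(0)
A_local = np.diag(rng.uniform(1.0, 2.0, n))              # stand-in for the local part
A_full = A_local + 0.01 * rng.standard_normal((n, n))    # stand-in for the full operator
b = rng.standard_normal(n)

# Matrix-free action of the full operator (in practice this would be a finite
# element residual evaluation plus the boundary integral term).
A_op = LinearOperator((n, n), matvec=lambda x: A_full @ x)

# Preconditioner: (approximately) invert only the local part.
M_op = LinearOperator((n, n), matvec=lambda r: np.linalg.solve(A_local, r))

x, info = gmres(A_op, b, M=M_op)
print("converged" if info == 0 else f"gmres info = {info}")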
Oxford Mathematics Christmas Public Lecture
In this talk we'll look at a variety of delicious delights through a lens of fluid dynamics and mathematical modelling. From perfect roast potatoes to sweet sauces, mathematics gets everywhere!
Helen Wilson is Head of the Department of Mathematics at UCL. She is best known for her work on the chocolate fountain (which will feature in this lecture) but does do serious mathematical modelling as well.
Please email @email to register. The lecture will be followed by mince pies and drinks for all.
This lecture will be available on our Oxford Mathematics YouTube Channel at 5pm on 20th December.
The Oxford Mathematics Public Lectures are generously supported by XTX Markets.
Minimal submanifolds are the critical points of the volume functional. If the second variation of the volume is nonnegative, we say that such a minimal submanifold is stable.
After reviewing some basics of minimal submanifolds in a generic Riemannian manifold, I will give some motivations behind the Lawson--Simons conjecture, which claims that there are no stable minimal submanifolds in 1/4-pinched spheres. Finally, I will discuss my recent work with Giada Franz on the nonexistence of stable minimal submanifolds in conformal pinched spheres.
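For reference, the stability condition can be written schematically as follows (the standard definition, included only as background): a minimal submanifold $\Sigma \subset (M, g)$ is stable if

$$\frac{d^{2}}{dt^{2}}\Big|_{t=0} \mathrm{Vol}(\Sigma_t) \;\ge\; 0$$

for every compactly supported variation $(\Sigma_t)$ with $\Sigma_0 = \Sigma$.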
2:00 Julian Sieber
On the (Non-)stationary density of fractional SDEs
I will present a novel approach for studying the density of SDEs driven by additive fractional Brownian motion. It allows us to establish smoothness and Gaussian-type upper and lower bounds for both the non-stationary and the stationary density. While the stationary density has not been studied in any previous works, the non-stationary one was the subject of multiple articles by Baudoin, Hairer, Nualart, Ouyang, Pillai, Tindel, among others. The common theme of all of these works is to obtain the results through bounds on the Malliavin derivative. The main disadvantage of this approach lies in the non-optimal regularity conditions on the SDE's coefficients. In the case of additive noise, the equation is known to be well-posed if the drift is merely sublinear and measurable (resp. Hölder continuous). Relying entirely on classical methods of stochastic analysis (avoiding any Malliavin calculus), we prove the aforementioned Gaussian-type bounds under optimal regularity conditions.
The talk is based on a joint work with Xue-Mei Li and Fabien Panloup.
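For orientation (background only, with the precise scales and constants deliberately left unspecified), the equations in question are of the form

$$dX_t \;=\; b(X_t)\,dt + dB^{H}_t,$$

where $B^{H}$ is a fractional Brownian motion with Hurst parameter $H$, and a Gaussian-type two-sided bound on the density $p_t$ means, schematically,

$$c_1\,\sigma_t^{-d}\, e^{-C_1 |x - m_t|^{2}/\sigma_t^{2}} \;\le\; p_t(x) \;\le\; c_2\,\sigma_t^{-d}\, e^{-C_2 |x - m_t|^{2}/\sigma_t^{2}}$$

for some centre $m_t$ and scale $\sigma_t$.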
2:45 Thomas Tendron
A central limit theorem for a spatial logistic branching process in the slow coalescence regime
We study the scaling limits of a spatial population dynamics model which describes the sizes of colonies located on the integer lattice, and allows for branching, coalescence in the form of local pairwise competition, and migration. When started near the local equilibrium, the rates of branching and coalescence in the particle system are both linear in the local population size; we say that the coalescence is slow. We identify a rescaling of the equilibrium fluctuations process under which it converges to an infinite-dimensional Ornstein-Uhlenbeck process with alpha-stable driving noise if the offspring distribution lies in the domain of attraction of an alpha-stable law with alpha between one and two.
3:30 Break
4:00-5:30 Careers Discussion
Immersive Finance, Founder, and Oxford Mathematics, Visiting Lecturer in Mathematical Finance
Oxford Mathematics, Professor of Numerical Optimisation
Smith Institute, Chief Technical Officer
Tesco, Data Science Manager
A pair of elliptic curves is said to be $N$-congruent if their mod $N$ Galois representations are isomorphic. We will discuss a construction of the moduli spaces of $N$-congruent elliptic curves, due to Kani--Schanz, and describe how this can be exploited to compute explicit equations. Finally we will outline a proof that there exist infinitely many pairs of elliptic curves with isomorphic mod $12$ Galois representations, building on previous work of Chen and Fisher (in the case where the underlying isomorphism of torsion subgroups respects the Weil pairing).
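In symbols (restating the definition above): elliptic curves $E$ and $E'$ over $\mathbb{Q}$ are $N$-congruent if

$$E[N] \;\cong\; E'[N] \quad \text{as } \mathrm{Gal}(\overline{\mathbb{Q}}/\mathbb{Q})\text{-modules},$$

equivalently if the associated mod $N$ representations $\overline{\rho}_{E,N}$ and $\overline{\rho}_{E',N}$ are isomorphic.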
The strong cosmic censorship conjecture is a fundamental open problem in classical general relativity, first put forth by Roger Penrose in the early 70s. This is essentially the question of whether general relativity is a deterministic theory. Perhaps the most exciting arena where the validity of the conjecture is challenged is the interior of rotating black holes, and there has been a lot of work in the past 50 years in identifying mechanisms ensuring that at least some formulation of the conjecture be true. It turns out that when a nonzero cosmological constant Λ is added to the Einstein equations, these underlying mechanisms change in an unexpected way, and the validity of the conjecture depends on a detailed understanding of subtle aspects of black hole scattering theory, surprisingly involving, in the case of negative Λ, some number theory. Does strong cosmic censorship survive the challenge of non-zero Λ? This talk will try to address this question!
You can find out more about Professor De Loera here: https://www.math.ucdavis.edu/~deloera/
In this talk I explain the fertile relationship between the foundations of inference and learning and combinatorial geometry.
My presentation contains several powerful examples where famous theorems in discrete geometry answered natural questions from machine learning and statistical inference:
In this tasting tour I will include the problem of deciding the existence of a maximum likelihood estimator in multiclass logistic regression, the variability of behavior of k-means algorithms with distinct random initializations and the shapes of the resulting clusters, and the estimation of the number of samples in chance-constrained optimization models. These obviously only scratch the surface of what one could do with extra free time. Along the way we will see fascinating connections to the coupon collector problem, topological data analysis, measures of separability of data, and to the computation of Tukey centerpoints of data clouds (a high-dimensional generalization of the median). All new theorems are joint work with subsets of the following wonderful folks: T. Hogan, D. Oliveros, E. Jaramillo-Rodriguez, and A. Torres-Hernandez.
Two relevant papers (published/to appear) are:
https://arxiv.org/abs/1907.09698
https://arxiv.org/abs/2205.05743
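As a toy, self-contained illustration of the k-means phenomenon mentioned above (not taken from the papers, and using made-up data), different random initializations of the same clustering problem can land in different local optima:

import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=500, centers=6, cluster_std=1.5, random_state=7)

inertias = []
for seed in range(20):
    km = KMeans(n_clusters=6, init="random", n_init=1, random_state=seed).fit(X)
    inertias.append(km.inertia_)   # within-cluster sum of squares

# Distinct initialisations can reach distinct local optima (distinct inertias)
# and hence distinct cluster shapes.
print(sorted(set(np.round(inertias, 2))))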
Fluids sculpt many of the shapes we see in the world around us. We present a new mathematical model describing the shape evolution of a body that dissolves or melts under gravitationally stable buoyancy-driven convection, driven by thermal or solutal transfer at the solid-fluid interface. For high Schmidt number, the system is reduced to a single integro-differential equation for the shape evolution. Focusing on the particular case of a cone, we derive complete predictions for the underlying self-similar shapes, intrinsic scales and descent rates. We will present the results of new laboratory experiments, which show an excellent match to the theory. By analysing all initial power-law shapes, we uncover a surprising result that the tips of melting or dissolving bodies can either sharpen or blunt with time subject to a critical condition.
Let $X$ denote an open subset of $\mathbb{C}^d$, and $\mathcal{O}$ its sheaf of holomorphic functions. In the 1970s, Ishimura studied the morphisms of sheaves $P\colon\mathcal{O}\to\mathcal{O}$ of $\mathbb{C}$-vector spaces which are continuous, that is, the maps $P(U)\colon\mathcal{O}(U)\to\mathcal{O}(U)$ on sections are continuous for every open $U\subseteq X$. In this talk, we explain his result, and explore its analogues in the non-Archimedean world.
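For orientation (background only; the precise statement and growth conditions of Ishimura's theorem are not reproduced here), the operators appearing in this context are differential operators of possibly infinite order, acting schematically as

$$P(f) \;=\; \sum_{\alpha \in \mathbb{N}^{d}} a_{\alpha}(z)\, \partial^{\alpha} f, \qquad a_{\alpha} \in \mathcal{O}(U),$$

with coefficients subject to convergence estimates ensuring that the sum again defines a holomorphic function.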
Consider an environment with two vehicles/platforms moving at a relative velocity (v). The objective is to predict the Closest Point of Approach (CPA) between the two platforms, as defined by the parameters: CPA time (t0), CPA bearing (θ0), and CPA distance (r0)[†]. The challenge is to identify mathematical operations - either using geometric methods, or by use of tracking algorithms such as Kalman Filters (EKF, UKF), or a combination of both - to estimate the CPA parameters. The statistical errors in the estimation of the CPA parameters also need to be quantified with each observation at time ti. The signals to be employed are acoustic in nature and the receiver platform has one sensor. The parameters that can be extracted from the acoustic signals are the current relative bearing (θ) and the current Doppler or range rate (S).
[†] Defined currently using a polar coordinate system.
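As a purely geometric baseline (not one of the requested estimation methods, and using hypothetical variable names), the CPA parameters follow in closed form once the relative position and velocity are known; a minimal 2D sketch:

import numpy as np

def cpa_from_relative_state(r, v):
    """Closest point of approach from relative position r and relative
    velocity v (2D, own platform at the origin). Returns (t0, r0, theta0):
    CPA time, CPA distance, and CPA bearing in radians measured from the
    y-axis (north), positive towards the x-axis (east)."""
    r, v = np.asarray(r, float), np.asarray(v, float)
    if np.allclose(v, 0.0):
        t0 = 0.0                              # no relative motion: CPA is now
    else:
        t0 = -np.dot(r, v) / np.dot(v, v)     # minimiser of |r + t v|^2
    p = r + t0 * v                            # relative position at CPA
    return t0, np.linalg.norm(p), np.arctan2(p[0], p[1])

# Example: contact 1000 m north and 500 m west, moving east at 5 m/s relative to us.
print(cpa_from_relative_state([-500.0, 1000.0], [5.0, 0.0]))   # (100.0, 1000.0, 0.0)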
Note: we recommend joining the meeting using the Teams client for the best user experience.
Modern Data Assimilation (DA) can be traced back to the sixties and owes a lot to earlier developments in linear filtering theory. Since then, DA has evolved independently of filtering theory. To date, it is a massively important area of research due to its many applications in meteorology, ocean prediction, hydrology, oil reservoir exploration, etc. The field has been largely driven by practitioners; however, in recent years an increasing body of theoretical work has been devoted to it. In this talk, I will advocate the interpretation of DA through the language of stochastic filtering. This interpretation allows us to make use of advanced particle filters to produce rigorously validated DA methodologies. I will present a particle filter that incorporates three additional add-on procedures: nudging, tempering and jittering. The particle filter is tested on a two-layer quasi-geostrophic model with O(10^6) degrees of freedom, out of which only a minute fraction are noisily observed.
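As a minimal illustration of the filtering viewpoint (a generic bootstrap particle filter on a toy one-dimensional model; this is not the nudged/tempered/jittered filter or the quasi-geostrophic setup described above):

import numpy as np

rng = np.random.default_rng(1)

# Toy state-space model: X_{k+1} = 0.9 X_k + noise, observed as Y_k = X_k + noise.
T, N = 50, 1000                # time steps, number of particles
sig_x, sig_y = 0.5, 0.8

# Simulate a "truth" and noisy observations of it.
x_true = np.zeros(T)
for k in range(1, T):
    x_true[k] = 0.9 * x_true[k - 1] + sig_x * rng.standard_normal()
y = x_true + sig_y * rng.standard_normal(T)

# Bootstrap particle filter with multinomial resampling and a small jitter.
particles = rng.standard_normal(N)
estimates = np.zeros(T)
for k in range(T):
    particles = 0.9 * particles + sig_x * rng.standard_normal(N)   # propagate
    logw = -0.5 * ((y[k] - particles) / sig_y) ** 2                # likelihood weights
    w = np.exp(logw - logw.max())
    w /= w.sum()
    estimates[k] = np.dot(w, particles)                            # filtering mean
    idx = rng.choice(N, size=N, p=w)                               # resample ...
    particles = particles[idx] + 0.05 * rng.standard_normal(N)     # ... and jitter

print(float(np.mean(np.abs(estimates - x_true))))   # rough filtering error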