Wed, 04 Dec 2024
16:00
L6

Tambara-Yamagami Fusion Categories

Adrià Marín-Salvador
(University of Oxford)
Abstract

In this talk, I will introduce fusion categories as categorical versions of finite rings. We will discuss some examples which may already be familiar, such as the category of representations of a finite group and the category of vector spaces graded over a finite group. Then we will define Tambara-Yamagami categories, a family of fusion categories with exactly one non-invertible simple object. I will present the classification results of Tambara and Yamagami for these categories and give some small examples. Time permitting, I will discuss current work in progress on how to generalize Tambara-Yamagami fusion categories to locally compact groups.

This talk will not assume familiarity with category theory beyond the definitions of a category and a functor.
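For orientation (a standard summary, not part of the abstract): a Tambara-Yamagami category TY(A, χ, τ) is determined by a finite abelian group A, a nondegenerate symmetric bicharacter χ : A × A → k^×, and a square root τ of 1/|A|. Its simple objects are the elements of A together with a single extra object m, and the fusion rules are

\[ a \otimes b = ab, \qquad a \otimes m = m \otimes a = m, \qquad m \otimes m = \bigoplus_{a \in A} a, \]

so m is the unique non-invertible simple object.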

Tue, 11 Mar 2025
14:00
L6

Gelfand-Kirillov dimension and mod p cohomology for quaternion algebras

Andrea Dotto
(King's College London)
Abstract

The Gelfand-Kirillov dimension is a classical invariant that measures the size of smooth representations of p-adic groups. It acquired particular relevance in the mod p Langlands program because of the work of Breuil-Herzig-Hu-Morra-Schraen, who computed it for the mod p cohomology of GL_2 over totally real fields, and used it to prove several structural properties of the cohomology. In this talk, we will present a simplified proof of this result, which has the added benefit of working unchanged for nonsplit inner forms of GL_2. This is joint work with Bao V. Le Hung.
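For background (the classical definition, stated here for finitely generated algebras): if A is a finitely generated algebra over a field k and V ⊆ A is a finite-dimensional generating subspace containing 1, then

\[ \operatorname{GKdim}(A) = \limsup_{n \to \infty} \frac{\log \dim_k V^n}{\log n}. \]

Roughly speaking, the invariant in the talk is the module-theoretic analogue of this, applied to duals of smooth mod p representations viewed as modules over an Iwasawa algebra.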

Mon, 16 Jun 2025
15:30
L3

Kinetic Optimal Transport

Prof Jan Maas
(IST Austria)
Abstract

We present a kinetic version of the optimal transport problem for probability measures on phase space. The central object is a second-order discrepancy between probability measures, analogous to the 2-Wasserstein distance, but based on the minimisation of the squared acceleration. We discuss the equivalence of static and dynamical formulations and characterise absolutely continuous curves of measures in terms of reparametrised solutions to the Vlasov continuity equation. This is based on joint work with Giovanni Brigati (ISTA) and Filippo Quattrocchi (ISTA).
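Schematically (a sketch based on the abstract's description; precise definitions are in the work cited): for probability measures μ₀, μ₁ on phase space, the dynamical formulation minimises the integrated squared acceleration,

\[ T(\mu_0, \mu_1)^2 = \inf_{(\mu_t, a_t)} \int_0^1 \!\! \int |a_t(x,v)|^2 \, d\mu_t(x,v) \, dt, \]

where the infimum runs over curves (μ_t) joining μ₀ to μ₁ whose acceleration field a_t drives the Vlasov continuity equation

\[ \partial_t \mu_t + v \cdot \nabla_x \mu_t + \nabla_v \cdot (a_t \mu_t) = 0. \]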

Mon, 02 Jun 2025
15:30
L3

Variance renormalisation of singular SPDEs

Dr Máté Gerencsér
(TU Wien)
Abstract

Scaling arguments give a natural guess at the regularity condition on the noise in a stochastic PDE for a local solution theory to be possible, using the machinery of regularity structures or paracontrolled distributions. This guess of "subcriticality" is often, but not always, correct. In cases where it is not, the blowup of the variance of certain nonlinear functionals of the noise necessitates a different, multiplicative renormalisation. This led to a general prediction and the first results in the case of the KPZ equation in [Hairer '24]. We discuss recent developments towards confirming this prediction. Based on joint works with Fabio Toninelli and Yueh-Sheng Hsu.
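For concreteness, the standard example named in the abstract is the (1+1)-dimensional KPZ equation,

\[ \partial_t h = \partial_x^2 h + (\partial_x h)^2 + \xi, \]

where ξ is the driving noise; subcriticality predicts when such an equation admits a local solution theory after renormalisation, and the talk concerns noise regimes where that prediction fails and the variance blowup forces the multiplicative renormalisation described above.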

Mon, 19 May 2025
15:30
L3

Quantitative Convergence of Deep Neural Networks to Gaussian Processes

Prof Dario Trevisan
(University of Pisa)
Abstract

In this seminar, we explore the quantitative convergence of wide deep neural networks with Gaussian weights to Gaussian processes, establishing novel rates for their Gaussian approximation. We show that the Wasserstein distance between the network output and its Gaussian counterpart scales inversely with network width, with bounds that apply to any finite input set under specific non-degeneracy conditions on the covariances. Additionally, we extend our analysis to the Bayesian framework by studying exact posteriors for neural networks endowed with Gaussian priors and regular likelihood functions, and we also present recent advances on the quantitative approximation of networks trained via gradient descent in the NTK regime. Based on joint works with A. Basteri, and with A. Agazzi and E. Mosig.
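As a toy numerical sketch of the phenomenon (an illustration only, not the speaker's construction or bounds; the input, widths, and sample sizes are arbitrary choices), one can sample one-hidden-layer ReLU networks with i.i.d. Gaussian weights at a fixed input and track the 1-Wasserstein distance to a variance-matched Gaussian as the width grows:

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
x = rng.standard_normal(5)   # fixed input in R^5 (arbitrary)
num_samples = 4000           # independently sampled networks per width

def network_outputs(width):
    # Outputs of num_samples random one-hidden-layer ReLU networks at x,
    # with the usual 1/sqrt(width) scaling of the output layer.
    W1 = rng.standard_normal((num_samples, width, x.size))  # input -> hidden
    w2 = rng.standard_normal((num_samples, width))          # hidden -> output
    hidden = np.maximum(W1 @ x, 0.0)                        # ReLU
    return (w2 * hidden).sum(axis=1) / np.sqrt(width)

for width in (4, 16, 64, 256):
    out = network_outputs(width)
    gaussian = rng.standard_normal(num_samples) * out.std()  # variance-matched
    print(f"width={width:4d}  W1 distance ~ {wasserstein_distance(out, gaussian):.4f}")

At small widths the distance reflects genuine non-Gaussianity of the output; as the width grows it decays towards the Monte Carlo noise floor of the finite samples.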

Fri, 06 Dec 2024
16:00
L1

Fridays@4 – A start-up company? 10 things I wish I had known

Professor Peter Grindrod
(Mathematical Institute (University of Oxford))
Abstract

Are you thinking of launching your own start-up or considering joining an early-stage company? Navigating the entrepreneurial landscape can be both exciting and challenging. Join Pete for an interactive exploration of the unwritten rules and hidden insights that can make or break a start-up journey.

Drawing from personal experience, Pete's talk will offer practical wisdom for aspiring founders and team members, revealing the challenges and opportunities of building a new business from the ground up.

Whether you're an aspiring entrepreneur, a potential start-up team member, or simply curious about innovative businesses, you'll gain valuable perspectives on the realities of creating something from scratch.

This isn't a traditional lecture – it will be a lively conversation that invites participants to learn, share, and reflect on the world of start-ups. Come prepared to challenge your assumptions and discover practical insights that aren't found in standard business guides.
 
