Fri, 25 Nov 2022

16:00 - 17:00
L1

Maths Meets Stats

Matthew Buckland (Statistics) and Ofir Gorodetsky (North Wing)
Abstract

Matthew Buckland 
Branching Interval Partition Diffusions

We construct an interval-partition-valued diffusion from a collection of excursions sampled from the excursion measure of a real-valued diffusion, and we use a spectrally positive Lévy process to order both these excursions and their start times. At any point in time, the interval partition generated is the concatenation of intervals where each excursion alive at that point contributes an interval of size given by its value. Previous work by Forman, Pal, Rizzolo and Winkel considers self-similar interval partition diffusions – and the key aim of this work is to generalise these results by dropping the self-similarity condition. The interval partition can be interpreted as an ordered collection of individuals (intervals) alive that have varying characteristics and generate new intervals during their finite lifetimes, and hence can be viewed as a class of Crump-Mode-Jagers-type processes.

Ofir Gorodetsky
Smooth and rough numbers


We all know and love prime numbers, but what about smooth and rough numbers?
We'll define y-smooth numbers -- numbers whose prime factors are all less than y -- and explain their application in cryptography, specifically to the factorization of integers.
We'll shed light on their density, which is modelled by a peculiar differential equation that also arises naturally in probability theory.
We'll also explain the dual notion, that of y-rough numbers: numbers whose prime factors are all greater than y, which in some sense generalize the primes.
We'll explain their importance in sieve theory. Like smooth numbers, their density has interesting properties, which will be surveyed.
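Both definitions are easy to make concrete in code. A minimal sketch (following the abstract's strict inequalities; note that some authors use "at most y" for smooth instead):

```python
def prime_factors(n):
    """Prime factors of n with multiplicity, by trial division."""
    factors, p = [], 2
    while p * p <= n:
        while n % p == 0:
            factors.append(p)
            n //= p
        p += 1
    if n > 1:
        factors.append(n)
    return factors

def is_smooth(n, y):
    """n is y-smooth if every prime factor is less than y."""
    return all(p < y for p in prime_factors(n))

def is_rough(n, y):
    """n is y-rough if every prime factor is greater than y."""
    return all(p > y for p in prime_factors(n))

# 12 = 2^2 * 3 is 5-smooth; 35 = 5 * 7 is 4-rough.
print(is_smooth(12, 5), is_rough(35, 4))
```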

Fri, 11 Nov 2022

16:00 - 17:00
L1

Managing your supervisor

Eva Antonopoulou
Abstract

Your supervisor is the person with whom you will interact most on a scientific level during your studies here, so it is vital that you establish a good working relationship. But how should you do this? In this session we discuss tips and tricks for getting the most out of your supervisions to maximize your success as a researcher. Note that this session will have no faculty in the audience, to allow people to speak openly about their experiences.

Fri, 04 Nov 2022

16:00 - 17:00
L1

Illustrating Mathematics

Joshua Bull and Christoph Dorn
Abstract

What should we be thinking about when we're making a diagram for a paper? How do we help it to express the right things? Or make it engaging? What kind of colour palette is appropriate? What software should we use? And how do we make this process as painless as possible? Join Joshua Bull and Christoph Dorn for a lively Fridays@4 session on illustrating mathematics, as they share tips, tricks, and their own personal experiences in bringing mathematics to life via illustrations.

Mon, 24 Apr 2023

14:00 - 15:00
Lecture Room 6

Fundamental limits of generative AI

Helmut Bölcskei
(ETH Zurich)
Abstract

Generative AI has seen tremendous successes recently, most notably the chatbot ChatGPT and the DALL·E 2 software creating realistic images and artwork from text descriptions. Underlying these and other generative AI systems are usually neural networks trained to produce text, images, audio, or video from text inputs. The aim of this talk is to develop an understanding of the fundamental capabilities of generative neural networks. Specifically, in mathematical terms, we consider the realization of high-dimensional random vectors from one-dimensional random variables through deep neural networks. The resulting random vectors follow prescribed conditional probability distributions, where the conditioning represents the text input of the generative system and its output can be text, images, audio, or video.

It is shown that every d-dimensional probability distribution can be generated through deep ReLU networks out of a 1-dimensional uniform input distribution. What is more, this is possible without incurring a cost (in terms of approximation error, measured in Wasserstein distance) relative to generating the d-dimensional target distribution from d independent random variables. This is enabled by a space-filling approach which realizes a Wasserstein-optimal transport map and highlights the importance of network depth in driving the Wasserstein distance between the target distribution and its neural network approximation to zero. Finally, we show that the number of bits needed to encode the corresponding generative networks equals the fundamental limit for encoding probability distributions (by any method) as dictated by the quantization theory of Graf and Luschgy. This result also characterizes the minimum amount of information that needs to be extracted from training data so as to generate a desired output at a prescribed accuracy, and establishes that generative ReLU networks can attain this minimum.

This is joint work with D. Perekrestenko and L. Eberhard.
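The one-dimensional case of this statement is classical inverse-transform sampling: an increasing piecewise-linear approximation of an inverse CDF is exactly representable by a ReLU network, and the space-filling construction in the talk extends the idea to d dimensions. A numerical sketch of the 1D step only (the exponential target is an arbitrary stand-in, not an example from the talk):

```python
import numpy as np

# Push a 1-dimensional uniform input through an inverse CDF to realize
# a prescribed target distribution -- here Exp(1), whose inverse CDF is
# F^{-1}(u) = -log(1 - u). A ReLU network can approximate this
# increasing map arbitrarily well by a piecewise-linear function.
rng = np.random.default_rng(42)
u = rng.uniform(size=100_000)   # 1-dimensional uniform input
x = -np.log(1.0 - u)            # samples distributed as Exp(1)

# The empirical mean approximates E[X] = 1 for the Exp(1) target.
print(x.mean())
```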

Fri, 28 Oct 2022

16:00 - 17:00
L1

North Meets South

Ilia Smilga and Charles Parker
Abstract

Ilia Smilga
Margulis spacetimes and crooked planes

We are interested in the following problem: which groups can act properly on R^n by affine transformations, or in other terms, can occur as a symmetry group of a "regular affine tiling"? If we additionally require that they preserve a Euclidean metric (i.e. act by affine isometries), then these groups are well-known: they all contain a finite-index abelian subgroup. If we remove this requirement, a surprising result due to Margulis is that the free group can act properly on R^3. I shall explain how to construct such an action.

Charles Parker
Unexpected Behavior in Finite Elements for Linear Elasticity

One of the first problems that finite elements were designed to approximate is the small deformations of a linear elastic body; i.e. the 2D/3D version of Hooke's law for springs from elementary physics. However, for nearly incompressible materials, such as rubber, certain finite elements seemingly lose their approximation power. After briefly reviewing the equations of linear elasticity and the basics of finite element methods, we will spend most of the time looking at a few examples that highlight this unexpected behavior. We conclude with a theoretical result that (mostly) explains these findings.
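For orientation, the boundary-value problem in question is the linear elasticity system (standard notation, not taken from the talk), with Lamé parameters μ and λ:

```latex
-\nabla\cdot\sigma(u) = f, \qquad
\sigma(u) = 2\mu\,\varepsilon(u) + \lambda\,(\nabla\cdot u)\,I, \qquad
\varepsilon(u) = \tfrac{1}{2}\left(\nabla u + \nabla u^{\mathsf T}\right),
\qquad
\lambda = \frac{E\nu}{(1+\nu)(1-2\nu)}.
```

Nearly incompressible materials have Poisson ratio ν close to 1/2, so λ blows up; this is the regime in which standard low-order elements can "lock" and seemingly lose their approximation power.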

Fri, 21 Oct 2022

16:00 - 17:00
L1

Maintaining your mental fitness as a graduate student or postdoc

Rebecca Reed and Ian Griffiths
Abstract

Academic research can be challenging and can bring with it difficulties in maintaining good mental health. This session will be led by Rebecca Reed, Mental Health First Aid (MHFA) instructor, meditation and yoga teacher, personal development coach, and owner of the wellbeing company Siendo. Rebecca will talk about how we can maintain good mental fitness, recognizing good practices so that we avoid mental-health difficulties before they begin; in this spirit, we have deliberately scheduled this session at the beginning of the academic year. We will also talk about maintaining good mental health specifically within the academic community.

Mon, 06 Mar 2023

14:00 - 15:00
L6

A Matrix-Mimetic Tensor Algebra for Optimal Representations of Multiway Data

Elizabeth Newman
(Emory University)
Abstract

The data revolution has changed the landscape of computational mathematics and has increased the demand for new numerical linear algebra tools to handle the vast amount of data. One crucial task is data compression to capture the inherent structure of data efficiently. Tensor-based approaches have gained significant traction in this setting by exploiting multilinear relationships in multiway data. In this talk, we will describe a matrix-mimetic tensor algebra that offers provably optimal compressed representations of high-dimensional data. We will compare this tensor-algebraic approach to other popular tensor decomposition techniques and show that our approach offers both theoretical and numerical advantages.
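"Matrix-mimetic" refers to tensor algebras in which third-order tensors multiply like matrices. A common instance of such an algebra, used here purely as an illustration (it is an assumption, not necessarily the exact algebra of the talk), is the t-product, computable via an FFT along the third mode:

```python
import numpy as np

def t_product(A, B):
    """t-product C = A * B of third-order tensors A (m x p x n) and
    B (p x q x n): transform to the Fourier domain along the third
    mode, multiply the frontal slices as matrices, transform back."""
    A_hat = np.fft.fft(A, axis=2)
    B_hat = np.fft.fft(B, axis=2)
    C_hat = np.einsum('ijk,jlk->ilk', A_hat, B_hat)  # slice-wise matmul
    return np.real(np.fft.ifft(C_hat, axis=2))

# Sanity check: the identity tensor (identity matrix in the first
# frontal slice, zeros elsewhere) acts like the matrix identity.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4, 5))
Id = np.zeros((4, 4, 5))
Id[:, :, 0] = np.eye(4)
assert np.allclose(t_product(A, Id), A)
```

Because the algebra mimics matrix multiplication, matrix notions such as transposes, orthogonality and a truncated SVD carry over, which is what underlies the optimal compressed representations mentioned above.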

Fri, 14 Oct 2022

16:00 - 17:00
L1

Meet and Greet Event

Amy Kent and Ellen Luckins
Abstract

Welcome (back) to Fridays@4! To start the new academic year, in this session we'll introduce what Fridays@4 is for our new students and colleagues. This session will be a chance to meet current students and ECRs from across Maths and Stats, who will share their hints and tips on conducting successful research in Oxford. There will be lots of time for questions, discussions and generally meeting more people across the two departments – everyone is welcome!

Mon, 21 Nov 2022

14:00
L4

Dirac synchronization and Dirac Signal Processing

Ginestra Bianconi
(Queen Mary University of London)
Abstract

Topological signals associated not only to nodes but also to links and to the higher-dimensional simplices of simplicial complexes are attracting increasing interest in signal processing, machine learning and network science. However, little is known about the collective dynamical phenomena involving topological signals. Typically, topological signals of a given dimension are investigated and filtered using the corresponding Hodge Laplacians. In this talk, I will introduce the topological Dirac operator, which can be used to process topological signals of different dimensions simultaneously. I will discuss the main spectral properties of the Dirac operator defined on networks, simplicial complexes and multiplex networks, and their relation to Hodge Laplacians. I will show that topological signals treated with the Hodge Laplacians or with the Dirac operator can undergo collective synchronization phenomena displaying different types of critical behaviour. Finally, I will show how the Dirac operator allows one to couple the dynamics of topological signals of different dimensions, leading to the Dirac signal processing of signals defined on the nodes, links and triangles of simplicial complexes.
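As a concrete toy example (a sketch assembled here for illustration, not code from the talk), the Dirac operator of a graph can be built from its oriented incidence matrix, and squaring it recovers the Hodge Laplacians acting on node and link signals:

```python
import numpy as np

# Oriented incidence (boundary) matrix B1 of a triangle graph:
# rows = nodes 0,1,2; columns = links (0,1), (1,2), (0,2).
B1 = np.array([[-1.,  0., -1.],
               [ 1., -1.,  0.],
               [ 0.,  1.,  1.]])

n, m = B1.shape

# The topological Dirac operator couples node and link signals:
#     D = [[0,    B1],
#          [B1^T,  0]]
D = np.block([[np.zeros((n, n)), B1],
              [B1.T, np.zeros((m, m))]])

# Its square is block diagonal, with the Hodge Laplacians on the
# diagonal: L0 = B1 B1^T acts on node signals, L1 = B1^T B1 on links.
D2 = D @ D
assert np.allclose(D2[:n, :n], B1 @ B1.T)  # L0 block
assert np.allclose(D2[n:, n:], B1.T @ B1)  # L1 block
assert np.allclose(D2[:n, n:], 0)          # off-diagonal blocks vanish
```

Unlike the Laplacians, D itself mixes the node and link components of a combined signal, which is what allows dynamics of different dimensions to be coupled.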

Thu, 06 Oct 2022

12:00 - 13:00
L2

Some Entropy Rate Approaches in Continuum Mechanics

Prof. Hamid Said
(Kuwait University)
Abstract

Irreversible processes are accompanied by an increase in the internal entropy of a continuum, and as such the entropy production function is fundamental in determining the overall state of the system. In this talk, it will be shown that the entropy production function can be utilized for a variational analysis of certain dissipative continua in two different ways. Firstly, a novel unified Lagrangian-Hamiltonian formalism is constructed, giving phase space extra structure, and is applied to the study of fluid flow and brittle fracture. Secondly, a maximum entropy production principle is presented for simple bodies, and its implications for the study of fluid flow are discussed.
