Optimal Lockdown Policy for Covid-19: A Modelling Study
Abstract
TBC
Contact organisers for access to meeting (Carmen Jorge-Diaz, Connor Behan or Sujay Nair)
Conventional computational methods often create a dilemma for fluid-structure interaction problems. Typically, solids are simulated using a Lagrangian approach with a grid that moves with the material, whereas fluids are simulated using an Eulerian approach with a fixed spatial grid, requiring some form of interfacial coupling between the two different perspectives. Here, a fully Eulerian method for simulating structures immersed in a fluid will be presented. By introducing a reference map variable to model finite-deformation constitutive relations in the structures on the same grid as the fluid, the interfacial coupling problem is greatly simplified. The method is particularly well suited for simulating soft, highly deformable materials and many-body contact problems, and several examples will be presented.
This is joint work with Ken Kamrin (MIT).
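As a rough illustration of the reference-map idea, here is a minimal 1D sketch in Python (illustrative assumptions throughout: a fixed sinusoidal velocity field, first-order upwinding, a periodic domain; this is not the speaker's implementation):

import numpy as np

# The reference map xi(x, t) records where the material now at x started,
# and satisfies the advection equation xi_t + u xi_x = 0 on the same fixed
# grid as the fluid. We evolve the periodic displacement disp = x - xi,
# which obeys disp_t + u disp_x = u.
N, steps, dt = 200, 500, 1e-3
x = np.linspace(0.0, 1.0, N, endpoint=False)
dx = x[1] - x[0]
u = 0.1 * np.sin(2 * np.pi * x)   # fixed smooth velocity field (assumed)
disp = np.zeros(N)                # zero displacement: xi = x at t = 0

for _ in range(steps):
    # first-order upwind derivative of disp on the periodic grid
    ddx = np.where(u > 0, disp - np.roll(disp, 1),
                   np.roll(disp, -1) - disp) / dx
    disp += dt * (u - u * ddx)

# deformation gradient F = (d xi / dx)^(-1); a finite-deformation
# constitutive law (e.g. neo-Hookean) would turn F into solid stress
# on the same Eulerian grid as the fluid solve
F = 1.0 / (1.0 - np.gradient(disp, dx))
print("max deformation gradient:", F.max())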
A link for this talk will be sent to our mailing list a day or two in advance. If you are not on the list and wish to be sent a link, please contact @email.
We continue this term with our flagship seminars given by notable scientists on topics that are relevant to Industrial and Applied Mathematics.
Note the new time of 12:00-13:00 on Thursdays.
This will give an opportunity for the entire community to attend and for speakers with childcare responsibilities to present.
Simple mathematical models have had remarkable successes in biology, framing how we understand a host of mechanisms and processes. However, with the advent of new experimental technologies, the last ten years have seen an explosion in the amount and types of quantitative data now being generated. This sets a new challenge for the field: to develop, calibrate and analyse new, biologically realistic models to interpret these data. In this talk I will showcase how quantitative comparisons between models and data can help tease apart subtle details of biological mechanisms, as well as present some steps we have taken to tackle the mathematical challenges in developing models that are both identifiable and can be efficiently calibrated to quantitative data.
This talk will be an introduction to L^2 homology, which is roughly "square-summable" homology. We begin by defining the L^2 homology of a G-CW complex (a CW complex with a cellular G-action), and we will discuss some applications of these invariants to group theory and topology. We will then focus on a criterion of Wise, which proves the vanishing of the 2nd L^2 Betti number in combinatorial CW-complexes with elementary methods. If time permits, we will also introduce Wise's energy criterion.
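For orientation, one common formulation (a sketch; conventions vary, and the reduced theory takes closures of the images of the boundary maps): for a free cocompact $G$-CW complex $X$,

$$ b_k^{(2)}(X; G) \;=\; \dim_{\mathcal{N}(G)} H_k\bigl(\ell^2(G) \otimes_{\mathbb{Z}G} C^{\mathrm{cell}}_*(X)\bigr), $$

where $\mathcal{N}(G)$ is the group von Neumann algebra and $\dim_{\mathcal{N}(G)}$ its dimension function.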
It was implicitly conjectured by Hambly and Lyons in 2010, and made explicit by Chang, Lyons and Ni in 2018, that the length of a tree-reduced path with bounded variation can be recovered from its signature asymptotics. Apart from its intrinsic elegance, understanding such a phenomenon is also important for the study of signature lower bounds and may shed light on more general signature inversion properties. In this talk, we discuss how the idea of path development onto suitably chosen Lie groups can be used to study this problem as well as its rough path analogue.
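Schematically (the precise choice of tensor norm matters and is part of the story): writing the signature levels of a bounded-variation path $x$ as $x^{(n)} = \int_{0<t_1<\cdots<t_n<1} dx_{t_1} \otimes \cdots \otimes dx_{t_n}$, the conjectured asymptotics take the form

$$ \lim_{n \to \infty} \bigl( n!\, \| x^{(n)} \| \bigr)^{1/n} \;=\; \mathrm{Length}(x) $$

for tree-reduced $x$.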
Oxford Mathematics Online Public Lecture in Partnership with Wadham College celebrating Roger Penrose's Nobel Prize
Spacetime Singularities - Roger Penrose, Dennis Lehmkuhl and Melvyn Bragg
Tuesday 16 February 2021
5.00-6.30pm
Dennis Lehmkuhl: From Schwarzschild’s singularity and Hadamard’s catastrophe to Penrose’s trapped surfaces
Roger Penrose: Spacetime singularities - to be or not to be?
Roger Penrose & Melvyn Bragg: In conversation
What are spacetime singularities? Do they exist in nature or are they artefacts of our theoretical reasoning? Most importantly, if we accept the general theory of relativity, our best theory of space, time, and gravity, do we then also have to accept the existence of spacetime singularities?
In this special lecture, Sir Roger Penrose, 2020 Nobel Laureate for Physics, will give an extended version of his Nobel Prize Lecture, describing his path to the first general singularity theorem of general relativity, and to the ideas that sprang from this theorem, notably the basis for the existence of Black Holes. He will be introduced by Dennis Lehmkuhl, whose talk will describe how the concept of a spacetime singularity developed prior to Roger's work, in work by Einstein and others, and how much of a game changer the first singularity theorem really was.
The lectures will be followed by an interview with Roger by Melvyn Bragg.
Roger Penrose is the 2020 Nobel Laureate for Physics and Emeritus Rouse Ball Professor in Oxford; Dennis Lehmkuhl is Lichtenberg Professor of History and Philosophy of Physics at the University of Bonn and one of the Editors of Albert Einstein's Collected Papers; Melvyn Bragg is a broadcaster and author best known for his work as editor and presenter of the South Bank Show and In Our Time.
Watch online (no need to register - and the lecture will stay up on all channels afterwards):
Oxford Mathematics Twitter
Oxford Mathematics Facebook
Oxford Mathematics Livestream
Oxford Mathematics YouTube
The Oxford Mathematics Public Lectures are generously supported by XTX Markets.
[[{"fid":"60543","view_mode":"media_397x223","fields":{"format":"media_397x223","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false},"type":"media","field_deltas":{"1":{"format":"media_397x223","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false}},"attributes":{"class":"media-element file-media-397x223","data-delta":"1"}}]]
Part of the Oxford Discrete Maths and Probability Seminar, held via Zoom. Please see the seminar website for details.
It is a basic fact of convexity that the volume of convex bodies is a polynomial, whose coefficients contain many familiar geometric parameters as special cases. A fundamental result of convex geometry, the Alexandrov-Fenchel inequality, states that these coefficients are log-concave. This proves to have striking connections with other areas of mathematics: for example, the appearance of log-concave sequences in many combinatorial problems may be understood as a consequence of the Alexandrov-Fenchel inequality and its algebraic analogues.
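Concretely, for convex bodies $K_1, \dots, K_n$ in $\mathbb{R}^n$ the inequality reads

$$ \mathsf{V}(K_1, K_2, K_3, \ldots, K_n)^2 \;\ge\; \mathsf{V}(K_1, K_1, K_3, \ldots, K_n)\, \mathsf{V}(K_2, K_2, K_3, \ldots, K_n), $$

where $\mathsf{V}$ denotes the mixed volume; taking repeated copies of two bodies $K$ and $L$ yields the log-concavity of the sequence of coefficients of the volume polynomial $t \mapsto \mathrm{vol}(K + tL)$.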
There is a long-standing problem surrounding the Alexandrov-Fenchel inequality that has remained open since the original works of Minkowski (1903) and Alexandrov (1937): in what cases is equality attained? In convexity, this question corresponds to the solution of certain unusual isoperimetric problems, whose extremal bodies turn out to be numerous and strikingly bizarre. In combinatorics, an answer to this question would provide nontrivial information on the type of log-concave sequences that can arise in combinatorial applications. In recent work with Y. Shenfeld, we succeeded in settling the equality cases completely in the setting of convex polytopes. I will aim to describe this result, and to illustrate its potential combinatorial implications through a question of Stanley on the combinatorics of partially ordered sets.
Will a large economy be stable? In this talk, I will present a model for a network economy where firms' productions are interdependent, and study the conditions under which such input-output networks admit a competitive economic equilibrium, where markets clear and profits are zero. Insights from random matrix theory allow us to understand some of the emergent properties of this equilibrium and to provide a classification for the different types of crises it can be subject to. After this, I will endow the model with dynamics, and present results with strong links to generalised Lotka-Volterra models in theoretical ecology, where inter-species interactions are modelled with random matrices and where the system naturally self-organises into a critical state. In both cases, the stationary points must consist of positive species populations/prices/outputs. Building on these ideas, I will show the key concepts behind an economic agent-based model that can exhibit convergence to equilibrium, limit cycles and chaotic dynamics, as well as a phase of spontaneous crises whose origin can be understood using "semi-linear" dynamics.
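A minimal sketch of generalised Lotka-Volterra dynamics with a random interaction matrix, in the spirit of the models alluded to above (the parameter values and the specific functional form are illustrative assumptions, not the talk's model):

import numpy as np

rng = np.random.default_rng(0)
N, dt, steps = 50, 1e-2, 2000
sigma = 0.3                           # interaction strength (assumed value)
A = sigma * rng.normal(size=(N, N)) / np.sqrt(N)   # random matrix scaling
np.fill_diagonal(A, 0.0)
x = rng.uniform(0.5, 1.5, size=N)     # initial abundances / outputs / prices

for _ in range(steps):
    # generalised Lotka-Volterra:  dx_i/dt = x_i (1 - x_i + sum_j A_ij x_j)
    x += dt * x * (1.0 - x + A @ x)
    x = np.maximum(x, 0.0)            # populations/prices stay non-negative

print("surviving fraction:", np.mean(x > 1e-6))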
Successful navigation of the Covid-19 pandemic is predicated on public cooperation with safety measures and appropriate perception of risk, in which emotion and attention play important roles. Signatures of public emotion and attention are present in social media data, thus natural language analysis of this text enables near-to-real-time monitoring of indicators of public risk perception. We compare key epidemiological indicators of the progression of the pandemic with indicators of the public perception of the pandemic constructed from ∼20 million unique Covid-19-related tweets from 12 countries posted between 10th March and 14th June 2020. We find evidence of psychophysical numbing: Twitter users increasingly fixate on mortality, but in a decreasingly emotional and increasingly analytic tone. Semantic network analysis based on word co-occurrences reveals changes in the emotional framing of Covid-19 casualties that are consistent with this hypothesis. We also find that the average attention afforded to national Covid-19 mortality rates is modelled accurately with the Weber–Fechner and power law functions of sensory perception. Our parameter estimates for these models are consistent with estimates from psychological experiments, and indicate that users in this dataset exhibit differential sensitivity by country to the national Covid-19 death rates. Our work illustrates the potential utility of social media for monitoring public risk perception and guiding public communication during crisis scenarios.
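Schematically, the two perceptual models fitted to attention $A$ as a function of the national death rate $d$ take the forms

$$ A(d) = k \,\log\!\left(\frac{d}{d_0}\right) \quad \text{(Weber-Fechner)}, \qquad A(d) = k\, d^{\beta} \quad \text{(power law)}, $$

where $k$, $d_0$ and $\beta$ are country-level parameters estimated from the tweet data (the paper's exact parameterisation may differ).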
We investigate a graph-theoretic analogue of geodesic geometry. In a graph $G=(V,E)$ we consider a system of paths $P=\{P_{u,v}| u,v\in V\}$ where $P_{u,v}$ connects vertices $u$ and $v$. This system is consistent in that if vertices $y,z$ are in $P_{u,v}$, then the sub-path of $P_{u,v}$ between them coincides with $P_{y,z}$. A map $w:E\to(0,\infty)$ is said to induce $P$ if for every $u,v\in V$ the path $P_{u,v}$ is $w$-geodesic. We say that $G$ is metrizable if every consistent path system is induced by some such $w$. As we show, metrizable graphs are very rare, whereas there exist infinitely many 2-connected metrizable graphs.
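For intuition, a small Python sketch of the "induced by $w$" condition (the helper induces and the example path system are hypothetical, written purely for illustration):

import networkx as nx

# Given a graph with positive edge weights "w" and a path system P,
# check that every prescribed path P[u, v] is w-geodesic, i.e. realises
# the shortest weighted distance between u and v.
def induces(G, P):
    dist = dict(nx.all_pairs_dijkstra_path_length(G, weight="w"))
    for (u, v), path in P.items():
        length = sum(G[a][b]["w"] for a, b in zip(path, path[1:]))
        if length > dist[u][v] + 1e-12:   # P[u, v] longer than a geodesic
            return False
    return True

G = nx.cycle_graph(4)
nx.set_edge_attributes(G, 1.0, "w")
P = {(0, 2): [0, 1, 2]}                   # one prescribed path as an example
print(induces(G, P))                      # True: [0, 1, 2] is w-geodesic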
We consider a chain of oscillators with one particle in contact with a thermostat at temperature T. The thermostat is modeled by Langevin dynamics or a renewal of the velocity with a Gaussian random variable with variance T. The dynamics of the oscillators is perturbed by a random exchange of velocities between nearest neighbor particles.
The (thermal) energy has a macroscopic superdiffusive behavior governed by a fractional heat equation (i.e. with a fractional Laplacian). The microscopic thermostat imposes a particular boundary condition on the fractional Laplacian, corresponding to certain probabilities of transmission/reflection/absorption/creation for the corresponding superdiffusive Lévy process.
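Schematically (the exact exponent and normalisation depend on the model), the macroscopic evolution is of the form

$$ \partial_t e(t, y) \;=\; -\bigl(-\Delta_y\bigr)^{s}\, e(t, y), \qquad 0 < s < 1, $$

supplemented at the thermostat location by the boundary behaviour of the associated $2s$-stable Lévy process: prescribed probabilities of transmission, reflection, absorption and creation.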
This is based on a series of works in collaboration with Tomasz Komorowski, Lenya Ryzhik, and Herbert Spohn.
We will survey an analogy between random integers and random permutations, which goes back to works of Erdős and Kac and of Billingsley.
This analogy inspired results and proofs about permutations, originating in the setting of integers, and vice versa.
Extensions of this analogy will be described, involving the generalized Ewens measure on permutations, based on joint work with D. Elboim.
If time permits, a further analogy, this time between random polynomials over a finite field and random permutations, will be discussed and formalized, with some applications.
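To make the basic analogy concrete (these are the standard statements, not specific to this talk): if $n$ is uniform on $\{1,\dots,N\}$ and $\omega(n)$ counts its distinct prime factors, while $\sigma$ is a uniform permutation in $S_N$ with $C(\sigma)$ cycles, then

$$ \frac{\omega(n) - \log\log N}{\sqrt{\log\log N}} \;\Rightarrow\; \mathcal{N}(0,1), \qquad \frac{C(\sigma) - \log N}{\sqrt{\log N}} \;\Rightarrow\; \mathcal{N}(0,1), $$

so prime factors play the role of cycles, with $\log\log N$ replaced by $\log N$.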
For an ordinary commutative Noetherian ring R we would define the singularity category to be the quotient of the (derived category of) finitely generated modules modulo the (derived category of) finitely generated projective modules ["the bounded derived category modulo compact objects"]. For a ring spectrum like C^*(BG) (coefficients in a field of characteristic p) it is easy to define the module category and the compact objects, but finitely generated objects need a new definition. The talk will describe the definition and show that the singularity category is trivial exactly when G is p-nilpotent. We will go on to describe the singularity category for groups with cyclic Sylow p-subgroup.
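In symbols, for an ordinary Noetherian ring the definition above reads

$$ \mathrm{D}_{\mathrm{sg}}(R) \;=\; \mathrm{D}^{b}(\mathrm{mod}\,R)\,/\,\mathrm{Perf}(R), $$

and the point of the talk is what should replace $\mathrm{mod}\,R$ (the finitely generated objects) when $R$ is a ring spectrum such as $C^*(BG)$.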
The idea of assigning weights to local coordinate functions is used in many areas of mathematics, such as singularity theory, microlocal analysis, sub-Riemannian geometry, or the theory of hypo-elliptic operators, under various terminologies. In this talk, I will describe some differential-geometric aspects of weightings along submanifolds. This includes a coordinate-free definition, and the construction of weighted normal bundles and weighted blow-ups. As an application, I will describe a canonical local model for isotropic embeddings in symplectic manifolds. (Based on joint work with Yiannis Loizides.)
I will discuss a class of generalized global symmetries, which we call “Chern-Weil global symmetries,” that arise ubiquitously in gauge theories. The Noether currents of these Chern-Weil global symmetries are given by wedge products of gauge field strengths and their conservation follows from Bianchi identities, so they are not easy to break. However, exact global symmetries should not be allowed in a consistent theory of quantum gravity. I will explain how these symmetries are typically gauged or broken in string theory. Interestingly, many familiar phenomena in string theory, such as axions, Chern-Simons terms, worldvolume degrees of freedom, and branes ending on or dissolving in other branes, can be interpreted as consequences of the absence of Chern-Weil symmetries in quantum gravity, suggesting that they might be general features of quantum gravity.
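In the simplest abelian example (a sketch of the general pattern): for a $U(1)$ field strength $F$, the wedge powers

$$ J_{2k} \;=\; \underbrace{F \wedge \cdots \wedge F}_{k}, \qquad dJ_{2k} \;=\; k\, dF \wedge F^{\wedge (k-1)} \;=\; 0 $$

are conserved by the Bianchi identity $dF = 0$, so each $J_{2k}$ is a higher-form current of the Chern-Weil type discussed in the talk.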
In this session, Ben Fehrman and Markus Upmeier will give their thoughts on how to deliver a good talk for a conference or a seminar and tips for what to do and what to avoid. There will be a particular emphasis on how to give a good talk online.
Crystal Structure Prediction aims to reveal the properties that stable crystalline arrangements of a molecule have without setting foot in a laboratory, consequently speeding up the discovery of new functional materials. Since it involves producing large datasets that themselves have little structure, an appropriate classification of crystals could add structure to these datasets and further streamline the process. We focus on geometric invariants, in particular introducing the density fingerprint of a crystal. After exploring its computation via Brillouin zones, we go on to show how it is invariant under isometries, stable under perturbations and complete at least for an open and dense space of crystal structures.
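As a rough illustration (a Monte Carlo approximation with a toy motif, not the Brillouin-zone computation described in the talk), one can estimate density functions psi_k(t), the fraction of the unit cell covered by exactly k balls of radius t around the motif points and their periodic copies:

import numpy as np

rng = np.random.default_rng(2)

# toy 2D crystal: two motif points in a unit square cell, plus the
# 3 x 3 block of periodic copies needed for the radii used below
motif = np.array([[0.0, 0.0], [0.5, 0.5]])
shifts = np.array([[i, j] for i in (-1, 0, 1) for j in (-1, 0, 1)])
centres = (motif[None, :, :] + shifts[:, None, :]).reshape(-1, 2)

def psi(k, t, n=20000):
    # fraction of random sample points covered by exactly k balls of radius t
    pts = rng.uniform(0, 1, size=(n, 2))
    d = np.linalg.norm(pts[:, None, :] - centres[None, :, :], axis=2)
    return np.mean((d < t).sum(axis=1) == k)

for t in (0.2, 0.4, 0.6):
    print(t, [round(psi(k, t), 3) for k in range(3)])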
The well-known Schur-Weyl duality provides a link between the representation theories of the general linear group $GL_n$ and the symmetric group $S_r$ by studying tensor space $(\mathbb{C}^n)^{\otimes r}$ as a ${(GL_n,S_r)}$-bimodule. We will discuss a few variations of this idea which replace $GL_n$ with some other interesting algebraic object (e.g. $O_n$ or $S_n$) and $S_r$ with a so-called diagram algebra. If time permits, we will also briefly look at how this can be used to define Deligne's category, which 'interpolates' Rep($S_t$) for any $t \in \mathbb{C}$.
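For reference, the classical statement: as a $(GL_n, S_r)$-bimodule,

$$ (\mathbb{C}^n)^{\otimes r} \;\cong\; \bigoplus_{\substack{\lambda \vdash r \\ \ell(\lambda) \le n}} V_\lambda \otimes S^\lambda, $$

where $V_\lambda$ is the irreducible polynomial $GL_n$-representation and $S^\lambda$ the Specht module labelled by the partition $\lambda$.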
Human life expectancy has been increasing steadily over the last century, but this has resulted in an increasing incidence of age-related chronic diseases. Over 60% of people over the age of 65 will suffer from more than one disease at the same time (multimorbidity), and 25-50% of those over 80 years old develop frailty, defined as an accumulation of deficits and loss of reserve. Multimorbidity and frailty entail complex medical needs and are strongly associated with disability and hospitalization. However, current treatments are suboptimal, with problems of polypharmacy arising because each disease is treated individually. Geroprotectors target fundamental mechanisms of ageing common to multiple age-related diseases and show promise in delaying the onset of multimorbidity and frailty in animal models. However, their clinical testing in patients has been challenging due to the high level of complexity in the mode of action of geroprotectors and in the way multimorbidity and frailty develop.
The talk will give an overview of these problems and make the case for the use of AI approaches to solve some of those complex issues with a view to designing appropriate clinical trials with geroprotectors to prevent age-related multimorbidity and frailty and extend healthspan.
The interplay between fluid flows and fractures is ubiquitous in Nature and technology, from hydraulic fracturing in shale formations to supraglacial lake drainage in Greenland and hydrofracture on Antarctic ice shelves.
In this talk I will discuss the above three examples, focusing on the scaling laws and their agreement with lab experiments and field observations. As the climate warms, meltwater on Antarctic ice shelves could threaten their structural integrity through the propagation of water-driven fractures. We used a combination of machine learning and fracture mechanics to understand the stability of fractures on ice shelves. Our results also indicate that as meltwater inundates the surface of ice shelves in a warm climate, their collapse driven by hydrofracture could significantly influence the flow of the Antarctic Ice Sheets.
Recent advances in computing power and the potential to make more realistic assumptions due to increased flexibility have led to the increased prevalence of simulation models in economics. While models of this class, and particularly agent-based models, are able to replicate a number of empirically-observed stylised facts not easily recovered by more traditional alternatives, such models remain notoriously difficult to estimate due to their lack of tractable likelihood functions. While the estimation literature continues to grow, existing attempts have approached the problem primarily from a frequentist perspective, with the Bayesian estimation literature remaining comparatively less developed. For this reason, we introduce a widely-applicable Bayesian estimation protocol that makes use of deep neural networks to construct an approximation to the likelihood, which we then benchmark against a prominent alternative from the existing literature.
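A toy sketch of the general idea in Python (an assumed AR(1) stand-in for the simulation model, a neural emulator of summary statistics, and a Gaussian surrogate likelihood; the authors' protocol uses the network to approximate the likelihood itself, so this is a simplified variant):

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

def simulate(theta, T=200):
    # toy AR(1) "simulation model" standing in for an agent-based model
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = theta * x[t - 1] + rng.normal()
    return np.array([x.mean(), x.std(), np.corrcoef(x[:-1], x[1:])[0, 1]])

# train a neural surrogate for the mapping theta -> expected summaries
thetas = rng.uniform(-0.9, 0.9, size=2000)
summaries = np.array([simulate(th) for th in thetas])
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000).fit(
    thetas.reshape(-1, 1), summaries)

obs = simulate(0.6)                   # "observed" data summaries

def log_like(theta, scale=0.1):
    # Gaussian surrogate likelihood around the net's predicted summaries
    pred = net.predict(np.array([[theta]]))[0]
    return -0.5 * np.sum(((obs - pred) / scale) ** 2)

# random-walk Metropolis over the surrogate likelihood (uniform prior)
theta, ll, chain = 0.0, log_like(0.0), []
for _ in range(5000):
    prop = theta + 0.05 * rng.normal()
    if abs(prop) < 0.9:
        ll_prop = log_like(prop)
        if np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = prop, ll_prop
    chain.append(theta)
print("posterior mean:", np.mean(chain[1000:]))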