News

Friday, 25 September 2020

Lifting the Newfoundland Travel Ban - a Story of Mathematics, Coronavirus and the Law

When Oxford Mathematician Alain Goriely was approached by his collaborator Ellen Kuhl from Stanford University to work on a travel restriction issue in Newfoundland, he started a Coronavirus journey that ended up in the Supreme Court of Newfoundland and Labrador.

"The island of Newfoundland is part of the Canadian province of Newfoundland and Labrador. Following a travel ban on May 5, 2020, this Atlantic province enjoyed the rather exceptional and enviable position of having the Coronavirus pandemic under control. By July 3, 2020, it had a cumulative number of 261 cases, with 258 recovered, 3 deaths, and no new cases for 36 days. The same day, the Atlantic Bubble opened to allow air travel between the four Atlantic Provinces - Newfoundland and Labrador, Nova Scotia, New Brunswick, and Prince Edward Island - with no quarantine requirements for travellers. With respect to COVID, the inhabitants of the province are in a dangerous position as they have the highest rates of obesity, metabolic disease, and cancer nationally, and an unhealthy lifestyle with the highest rate of cigarette smoking among all provinces. Despite its success in eliminating the virus, the government found itself in a precarious position. Its travel ban, Bill 38, was being challenged by a Halifax resident who was denied entry for her mother’s funeral in the Spring and the lawsuit was further supported by the Canadian Civil Liberties Association. They were seeking a declaration from the provincial Supreme Court in St John’s that the travel ban was unconstitutional, a decision that could apply to the entire country. Determined to keep control of its borders, the Office of the Attorney-General reached out to Ellen. Would her models be applicable to this situation? What would happen during gradual or full reopening under perfect or imperfect quarantine conditions?

Ellen and I had been talking about a hypothetical problem like this one. If the virus is eliminated from a region, can it come back, like a boomerang, when restrictions are eased? Newfoundland seemed to be the perfect case study for us, and with the help of her outstanding Postdoc, Kevin Linka, and Dr Proton Rahman, a clinical epidemiologist and Professor of Medicine at Memorial University of Newfoundland, we jumped at the opportunity to test some of our ideas. Soon, we converged on a network model where each node represents a US state or a Canadian province. On each node, we ran a local Susceptible-Exposed-Infected-Recovered epidemiological model and modelled air traffic by a graph Laplacian-type transport process, as is commonly done for network transport. Parameters were estimated by Bayesian inference with Markov-chain Monte Carlo sampling, using a Student's t-distribution for the likelihood between the reported cumulative case numbers and the simulated cumulative case numbers.
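
As a rough illustration of the kind of forward model described here, the sketch below couples a local SEIR model on each node of a small network through a graph-Laplacian mobility term. The four-node network, the rates and the mobility strength are invented for demonstration and are not the calibrated values from the study; the Bayesian parameter estimation step is omitted.

```python
import numpy as np

# Illustrative sketch of a network SEIR model: each node runs local
# Susceptible-Exposed-Infected-Recovered dynamics, and a graph-Laplacian
# term moves individuals between nodes, standing in for air traffic.
# All parameter values below are assumptions for demonstration only.

def simulate(L, beta=0.4, a=1/3, gamma=1/10, k=0.01, days=120, dt=0.1, I0=None):
    n = L.shape[0]
    I = np.zeros(n) if I0 is None else I0.astype(float)
    E = np.zeros(n)
    R = np.zeros(n)
    S = 1.0 - E - I - R                            # compartments as population fractions
    for _ in range(int(days / dt)):
        newE = beta * S * I                        # local transmission
        dS = -newE                 - k * (L @ S)   # Laplacian transport couples nodes
        dE = newE - a * E          - k * (L @ E)
        dI = a * E - gamma * I     - k * (L @ I)
        dR = gamma * I             - k * (L @ R)
        S, E, I, R = S + dt * dS, E + dt * dE, I + dt * dI, R + dt * dR
    return S, E, I, R

# Toy network: four nodes joined in a path (think of four provinces).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A                     # graph Laplacian
I0 = np.array([1e-3, 0, 0, 0])                     # small outbreak seeded at node 0
print(simulate(L, I0=I0)[2])                       # infected fraction per node after 120 days
```

In the actual study the transport graph came from real passenger flight data and the epidemiological parameters were inferred by Markov-chain Monte Carlo; the point of the sketch is only the structure of the coupled equations.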

Conceptually, the model is quite simple. I have a natural preference for parsimony when it comes to modelling complex phenomena, as the assumptions are completely known and on full display. This is a personal choice, and the outcomes of such models should be seen as estimates rather than hard forecasts. What we found is quite interesting. Using air traffic information from the previous 15 months, we showed that opening Newfoundland to the Atlantic provinces or the rest of Canada would have negligible effects on the evolution of the disease, as prevalence had dropped considerably in Canada. Yet opening the airports to the USA would lead to 2-5 infected passengers entering the island per week, with as many as 1-2 of them asymptomatic travellers. Without an air-tight quarantine system, the disease would reach 0.1% of the Newfoundland population within 1 to 2 months.

In the first week of August, evidence was presented to the court. The Chief Medical Officer of Health, Dr Janice Fitzgerald, opened with the following quote: “In 1775 the American revolutionary Patrick Henry declared, ‘Give me liberty or give me death.’ In this case, if the applicants’ remedy is granted, it will result in both.” The same week Proton testified in court about our model, its assumptions, and our findings. To my surprise, the scientists were heard, and on 17 September the judge rendered his verdict. In his ruling, Justice Burrage declared that “The upshot of the modelling ... is that the travel restriction is an effective measure at reducing the spread of COVID-19 in Newfoundland and Labrador.” He concluded that yes, the ban was legal and justified. Having an impact on the lives of Newfoundlanders, however small, is a strange but rather pleasant feeling."

Read more about the case here.

Figures above:

1. Mobility modelling. Discrete graphs of the Atlantic Provinces of Canada and of North America with 4, 13, and 64 nodes that represent the main travel routes to Newfoundland and Labrador. Dark blue edges represent the connections from the Atlantic Provinces, light blue edges from the other Canadian provinces and territories, and red edges from the United States.

2. Estimated COVID-19 infectious travellers to Newfoundland and Labrador. Number of daily incoming air passengers from the Canadian provinces and territories and the United States that are infectious with COVID-19.

Thursday, 17 September 2020

Re-examining geometry: p-adic numbers and perfectoid spaces

A tower of modular curves

Oxford Mathematician Daniel Gulotta talks about his work on $p$-adic geometry and the Langlands program.

"Geometry is one of the more visceral areas of mathematics. Concepts like distance and curvature are things that we can actually see and feel.

Mathematicians like to question things that seem obvious. For example, what if distance did not work the way that we are used to? Normally, we would say that a fraction like $\frac{1}{65536}$ is very close to zero because the denominator is much larger than the numerator, and conversely $65536 = \frac{65536}{1}$ is very far from zero because the numerator is much larger than the denominator. In $p$-adic geometry, we instead choose a prime number $p$, and we say that a fraction $\frac{a}{b}$ is close to zero if $a$ is divisible by a large power of $p$. So for $p=2$, $65536 = \frac{2^{16}}{1}$ is actually $2$-adically very close to zero and $\frac{1}{65536} = \frac{1}{2^{16}}$ is $2$-adically very far from zero. It can be shown that the only 'well-behaved' notions of distance on the rational numbers are the usual one and the $p$-adic ones.
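
For the computationally minded, here is a small Python sketch (our own illustration, with invented function names) of the $p$-adic absolute value: writing $v_p(x)$ for the net power of $p$ dividing a nonzero rational $x$, one sets $|x|_p = p^{-v_p(x)}$, so numbers divisible by large powers of $p$ become small.

```python
from fractions import Fraction

def vp(x: Fraction, p: int) -> int:
    """p-adic valuation: net power of p dividing the nonzero rational x."""
    v, a, b = 0, x.numerator, x.denominator
    while a % p == 0:
        a //= p
        v += 1
    while b % p == 0:
        b //= p
        v -= 1
    return v

def abs_p(x: Fraction, p: int) -> Fraction:
    """p-adic absolute value |x|_p = p**(-v_p(x))."""
    return Fraction(p) ** (-vp(x, p))

print(abs_p(Fraction(65536), 2))     # 1/65536: 65536 = 2**16 is 2-adically tiny
print(abs_p(Fraction(1, 65536), 2))  # 65536: its reciprocal is 2-adically huge
```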

It is difficult to directly relate our intuition about the real world to this $p$-adic notion of distances. One of the challenges of $p$-adic geometry is to find ways of framing familiar concepts so that they still make sense in the $p$-adic world.

A common technique in geometry is to study a complicated space by chopping it into smaller, simpler pieces and studying how those pieces fit together. For example, a complicated surface could be divided into discs.

A surprising fact about $p$-adic geometry is that the $p$-adic disc is still a very complicated space - it loops back on itself in all sorts of ways. It turns out that the simpler spaces are actually these very large objects called perfectoid spaces. Whereas one can specify a point on the $p$-adic disc with a single $p$-adic coordinate (similar to how a point on the usual disc can be specified by a single complex number), a point on a perfectoid space might be specified by an infinite sequence of coordinates $(x_0,x_1,x_2,\dotsc)$ satisfying $x_i = x_{i+1}^p$ for all $i \ge 0$. So to understand $p$-adic spaces, in addition to cutting them into pieces, we also want to cover those pieces by perfectoid spaces.
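
To get a feel for the compatibility condition $x_i = x_{i+1}^p$, one can play with finite approximations: work with $p$-adic integers to precision $p^N$ (that is, integers modulo $p^N$) and build a truncated tower by choosing the deepest coordinate and repeatedly raising to the $p$-th power. The sketch below is purely illustrative; the choices of $p$, precision and seed value are arbitrary.

```python
# Finite truncation of a compatible sequence (x_0, x_1, ..., x_depth) with
# x_i = x_{i+1}^p, computed with p-adic integers to precision p**N,
# i.e. integers modulo p**N. Illustration only.

p, N, depth = 2, 16, 5
mod = p ** N

x = [0] * (depth + 1)
x[depth] = 3                        # choose the deepest coordinate freely
for i in range(depth - 1, -1, -1):
    x[i] = pow(x[i + 1], p, mod)    # enforce x_i = x_{i+1}**p (mod p**N)

print(x)    # each entry is the p-th power of the next one, modulo 2**16
```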

I am particularly interested in applications of $p$-adic geometry to the Langlands program, which explores connections between number theory, geometry, and representation theory.  One way that geometry enters into the Langlands program is through Shimura varieties. These are geometric spaces whose symmetries have a particularly nice arithmetic description.  (Modular curves, a particular kind of Shimura variety, play a significant role in Andrew Wiles's famous proof of Fermat's last theorem.)

Along with Ana Caraiani and Christian Johansson, I have been studying the $p$-adic geometry of Shimura varieties. In particular, we have proved the vanishing of certain compactly supported cohomology groups. In plain terms, we have shown that these Shimura varieties do not have any high-dimensional loops. This result has allowed us to improve on Scholze's results on the $p$-adic Langlands correspondence. The proof involves using techniques from geometric representation theory, namely Weyl groups and the Bruhat stratification, to divide the Shimura variety into pieces and then cover those pieces with perfectoid spaces."

For more on Daniel's work:

Wednesday, 16 September 2020

Frontiers of secrecy - the story of Eve, Alice and Bob

Oxford Mathematician Artur Ekert describes how his research into using quantum properties for cryptography led to some very strange results.

"The most secure methods of communication rely on pre-distributed, random and secret sequences of bits, known as cryptographic keys. Any two parties who share the key, we call them Alice and Bob (not their real names, of course), can then use it to communicate secretly. The key bits must be truly random, never reused, and securely delivered to Alice and Bob, which is by no means easy for Alice and Bob may be miles apart. Still, it can be done. About thirty years ago my work triggered an active field of research by showing that quantum entanglement and peculiar non-local quantum correlations can be used for secure key distribution. More recently, building upon this work, cryptologists started probing the ultimate limits of security and showed that what looks like an insane scenario is actually possible - devices of unknown or dubious provenance, even those that are manufactured by our enemies, can be safely used to generate secure keys.

In this more dramatic version, known as the device-independent scenario, an omniscient adversary, called Eve, is in charge of the key distribution. She prepares two sealed and impregnable devices, and gives one to Alice and one to Bob. The inner working of the devices is unknown to Alice and Bob, but they can take them to their respective locations and probe them with randomly and independently chosen binary inputs, to which the devices respond with binary outputs. For Alice's and Bob's inputs x and y, their devices generate outputs a and b, respectively. If, in repeated use, the responses of the two devices show a certain pattern, then the output bits can be turned into a secret key.

At first this narrative makes no sense. Surely, if Eve manufactured the two devices, she must also have pre-programmed them, and hence she would know how they respond to all possible inputs, which, of course, renders the resulting keys insecure. Surprisingly enough, if Alice and Bob prepare their inputs freely, so that Eve does not know in advance which inputs will be chosen in each run, and the devices are kept separated and incommunicado, then some patterns of outputs cannot be pre-programmed. For example, requesting that in each run the outputs are identical for all inputs except when Alice prepares input 1 and Bob prepares input 1, in which case the outputs must be different, that is \[ xy = a \oplus b, \] defeats the coding prowess of any, no matter how powerful, Eve. The best Eve can do in this case is to have this request satisfied in 75% of the runs of the devices. Anything more than 75% indicates outputs that are not pre-programmed or pre-determined and hence unknown to any third party, Eve included. No classical devices can deliver such correlated performance, but quantum devices can, pushing the success rate to roughly 86%. Thus, if Eve is prepared to offer Alice and Bob quantum devices that beat the 75% classical limit, she has to concede that at least some of the output bits will be unknown to her. Alice and Bob infer how much Eve knows about the output bits from the success rate and then, if Eve does not know too much, they can use conventional cryptographic tools to turn the partially secret bits into fewer secret ones, which will then form a cryptographic key.
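
The game described here is the well-known CHSH game, and the 75% figure can be verified directly: a pre-programmed pair of devices is just a pair of functions a(x) and b(y), and there are only sixteen of them to check. The sketch below (an illustration only, not part of any protocol implementation) does that enumeration and compares it with the quantum limit of $\cos^2(\pi/8) \approx 86\%$.

```python
import itertools, math

def best_preprogrammed_rate():
    """Best success rate of deterministic devices at satisfying a XOR b = x AND y."""
    best = 0.0
    # A deterministic device for Alice is a pair (a(0), a(1)); likewise for Bob.
    for a0, a1, b0, b1 in itertools.product((0, 1), repeat=4):
        wins = sum(((a0, a1)[x] ^ (b0, b1)[y]) == (x & y)
                   for x in (0, 1) for y in (0, 1))
        best = max(best, wins / 4)
    return best

print(best_preprogrammed_rate())    # 0.75 - the classical limit
print(math.cos(math.pi / 8) ** 2)   # ~0.8536 - the quantum (Tsirelson) limit
```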

The key distribution proceeds as follows. Alice and Bob run their devices, choosing their inputs randomly and independently of each other. In each run, once the outputs are registered, Alice and Bob communicate in public, with Eve listening, and reveal their inputs, but not the outputs. The repeated use of the devices results in two binary output strings, one held by Alice and one by Bob. In a subsequent public communication, again with Eve listening, Alice and Bob agree on a random sample of the recorded runs, for which they reveal the outputs. This allows them to estimate the success rate on the data in the fully disclosed runs. If the estimated success rate is below 75%, the key distribution is abandoned; otherwise Alice and Bob turn the remaining undisclosed outputs into a secret key.
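
Purely as an illustration of those steps, the sketch below runs a toy version of the protocol. Eve's boxes are modelled as devices that satisfy the winning condition with probability $\cos^2(\pi/8)$, roughly what ideal quantum devices achieve; the number of runs, the sample fraction and the abort threshold are arbitrary choices, and the error correction and privacy amplification needed to turn the kept bits into a final key are omitted.

```python
import math
import random

Q_WIN = math.cos(math.pi / 8) ** 2          # toy device model: quantum-level success
RUNS, SAMPLE_FRACTION = 20000, 0.2

# Run the devices with freely chosen, independent random inputs.
runs = []
for _ in range(RUNS):
    x, y = random.getrandbits(1), random.getrandbits(1)
    a = random.getrandbits(1)
    b = a ^ (x & y) if random.random() < Q_WIN else a ^ (x & y) ^ 1
    runs.append((x, y, a, b))

# Publicly disclose the outputs of a random sample of runs and estimate
# the success rate on that sample.
sample = set(random.sample(range(RUNS), int(SAMPLE_FRACTION * RUNS)))
rate = sum((a ^ b) == (x & y) for i, (x, y, a, b) in enumerate(runs) if i in sample)
rate /= len(sample)

if rate <= 0.75:
    print("abort: the devices could have been pre-programmed")
else:
    raw_key = [a for i, (x, y, a, b) in enumerate(runs) if i not in sample]
    print(f"estimated success rate {rate:.3f}; kept {len(raw_key)} raw bits")
```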

Needless to say, proving security under such weak assumptions, with all the nuts and bolts, is considerably more challenging than in the case of trusted devices. Between the two extremes - at a success rate of 75% Eve may know everything about the key, while at 86% Eve knows nothing - lies an important uncharted twilight zone. Suppose the success rate is 80%; then what? We must put an upper bound on Eve's knowledge for any intermediate success rate, as this determines how Alice and Bob turn their raw output strings into a shorter but secret key. This is not easy.

The important technical figure of merit to be evaluated here is called (just in case you want to impress your friends with the mathematical lingo) "the conditional smooth min-entropy", and it should be expressed as a function of the success rate. This quantity determines the length of the final key that can be distilled from a given raw output, but deriving it required considerable mathematical gymnastics. The early results provided an explicit expression for the asymptotic key rate (the ratio between the distillable key length and the length of the raw output in the limit of a large number of runs), but the analysis applied only to cases where the devices behaved in the same way in each run, unaffected by the previous or subsequent runs. This is known as the independent and identically distributed (i.i.d.) assumption. At the time even the most fervent advocates of device-independent cryptography had to admit that the result, as neat as it was, had no direct bearing on the device-independent scenario, for Eve can manufacture the devices as she sees fit, making successive outputs dependent on what happened in all the previous runs. More recently, the most general case has finally been worked out. It turns out that a realistic device-independent scenario can be reduced to the i.i.d. case, showing that Eve cannot do better than making each run of the devices independent of and statistically identical to all the other runs. This is of great comfort to both mathematicians and experimentalists, for it shows that reasonable key rates can be achieved even in noisy implementations.

One should perhaps mention that the device-independent scenario does not have to involve untrusted devices. The protocol is more likely to offer an additional security layer to quantum cryptography with trusted devices, protecting against attacks that exploit unintentional flaws in the design. What is crucial here, however, is the assumption that Alice and Bob prepare their respective inputs randomly and independently of each other. If Alice's and Bob's choices were known in advance, then Eve could easily pre-program the results, and Alice and Bob would foolishly believe that they had generated a secret key. However, as long as Alice's and Bob's choices are their own, unknown and unpredictable, the most mind-boggling cryptographic scheme ever proposed works just fine and may see the light of day sooner than many of us expected."



Monday, 14 September 2020

Local Topology for Anomaly Detection in Data

Oxford Mathematician Vidit Nanda discusses his recent work with colleagues Bernadette Stolz, Jared Tanner and Heather Harrington on detecting singularities in data.

"Fitting geometric models to high-dimensional point clouds plays an essential role in all sorts of tools in contemporary data analysis, from linear regression to deep neural networks. By far the most common and well-studied geometric models are manifolds. For instance, the plane, sphere and torus illustrated below are all two-dimensional manifolds that can be embedded in three-dimensional Euclidean space.


There is a local test which characterizes d-dimensional manifolds: around each point, can you find a small region which resembles standard d-dimensional Euclidean space? If the answer is yes, then you have a d-manifold on your hands. Thus, if we were to zoom in with a very high-powered microscope at any point on the sphere or torus, we would approximately see a plane. 
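
To see the local test pass in a concrete case, take the unit sphere x² + y² + z² = 1 and a point near its north pole: a small neighbourhood there is the graph of the smooth function z = √(1 − x² − y²) ≈ 1 − (x² + y²)/2 over a small disc in the (x, y)-plane, so it can be flattened onto a piece of standard 2-dimensional Euclidean space, exactly as the test demands.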

When confronting heterogeneous data which comes from several different sources or measurements, it may no longer be reasonable to expect that a single underlying manifold will provide a good fit to all the data points. This is because the union of two d-dimensional manifolds (such as a sphere and a plane) along a shared sub-manifold (such as an equator) will not itself be a d-manifold. If you examine small neighbourhoods around points in the common sub-manifold, you can see that they will fail the local test for manifold-ness. For instance, if you zoom in on any point lying in the equator of the figure below, you will see two planes that intersect along a common line rather than a single plane.


Such non-manifold spaces which are built out of manifold pieces are called stratified spaces, and their non-manifold regions, such as the equator in the example above, are called singularities. The study of stratified spaces has been a long and fruitful enterprise across several disciplines in pure mathematics, including algebraic geometry, algebraic topology and representation theory. 

We have recently developed a framework to automatically detect singularities directly from data points even when none of the data points lie exactly on the singular regions. One key advantage is that we are now able to partition a heterogeneous dataset into separate clusters based on their intrinsic dimensionality. Thus, a dataset living on the plane-plus-sphere described above would be decomposed into five clusters, one of which lives near the one-dimensional singular equator, while the other four lie on various parts of the sphere or the plane. 


All five pieces are manifolds, so the standard manifold-fitting techniques which are pervasive in data science can be safely applied to them individually.

The key technique in this singularity-detection framework is persistent cohomology, which assigns a family of combinatorial invariants called barcodes, one for each dimension, to a collection of data points. If these points have been sampled densely from a d-dimensional sphere, then there is a prominent bar in the d-th persistent cohomology barcode and not much else. For each point p in the dataset, one examines the set of all annular neighbours - this consists of all data points q satisfying α < dist(p,q) < β for some small positive distances α and β, where dist denotes Euclidean distance. By the local manifold property, if the central point p lies on a d-dimensional manifold, then for suitably small α < β the set of all annular neighbours will live approximately on a (d-1)-dimensional sphere, which can be detected accurately by the presence of a single dominant bar in the (d-1)-st persistent cohomology barcode.
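
As a rough sketch of that local computation (assuming the open-source ripser package for Vietoris-Rips persistence; the sampled surface, the centre point and the values of α and β are invented for illustration and are not those used in the actual work):

```python
import numpy as np
from ripser import ripser          # pip install ripser

rng = np.random.default_rng(0)

# Sample points densely from a 2-sphere in R^3; around any point the
# annular neighbours should look like a circle, i.e. a 1-sphere.
X = rng.normal(size=(5000, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)

p = X[0]                            # centre point
d = np.linalg.norm(X - p, axis=1)   # Euclidean distances to p
alpha, beta = 0.2, 0.5              # illustrative annulus radii
annulus = X[(d > alpha) & (d < beta)]

# Persistent (co)homology of the annular neighbours: one dominant bar in
# degree 1 signals a circle, so p looks like a point on a 2-manifold.
dgms = ripser(annulus, maxdim=1)["dgms"]
bar_lengths = np.sort(dgms[1][:, 1] - dgms[1][:, 0])
print(bar_lengths[-3:])             # the last bar should dwarf the others
```

Repeating this at every point gives each data point a local signature; the framework described above builds on such signatures, with much more care taken over the choice of annulus and the robustness of the detection.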

We tried this technique on a dataset whose points correspond to configurations of a molecule called cyclo-octane. The data consists of 5000 points in 24-dimensional space, and the points live on the union of two embedded surfaces which intersect along two circles. As expected, points lying near the two singular circles are easily identified by their local persistent cohomology barcodes. These special points are coloured red in the 2-dimensional projection of the data below."



Wednesday, 2 September 2020

COVID-19 incidence is inversely proportional to T-cell production

One of the great puzzles of the current COVID-19 crisis is the observation that older people have a much higher risk of becoming seriously ill. While it is commonly accepted that the immune system fails progressively with age, the actual mechanism leading to this effect was not fully understood. In recent work, Sam Palmer from Oxford Mathematics and his colleagues in Cambridge have proposed a simple and elegant solution to this puzzle. They focussed their attention on the thymus, where T-cells, partially responsible for the body's immune response, develop. Observational data show that the thymus shrinks with age, losing about 4.5% of its volume every year in adulthood. Remarkably, this decay correlates with the increase in risk with age. Indeed, many infectious diseases and cancer types have risk profiles that rise by the same 4.5% every year - that's an exponential increase with a doubling time of about 16 years. In their paper, they showed that COVID-19 hospitalisations follow the same trend, with an increase of about 4.5% per year between age groups, suggesting that the main effect may be due to declining thymic function.
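
For the record, the arithmetic behind that doubling time: a risk growing by 4.5% per year is multiplied by a factor of 1.045 annually, so it doubles after \[ \frac{\ln 2}{\ln 1.045} \approx 15.7 \text{ years}, \] consistent with the quoted figure of about 16 years.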

Another puzzle emerging from the data is that men have a systematically greater risk of hospitalisation and death. Again, the authors show that the answer may lie in the thymus, as it is known that men have lower T-cell production.

What about the children who, thankfully, have been mostly spared? It turns out that the immune system of those under 20 years of age is very different from the one found in adults. It does not follow the same law of exponential decline. The statistical analysis of this younger cohort shows that they are just as likely to get infected, but that they have a much lower probability of disease progression than would be predicted from strong thymic function alone. A possible explanation is that this age group may benefit from cross-protection from common cold viruses, which they catch more often than adults.

This research tying observational data to mechanistic models of the immune system is crucial to our understanding of COVID-19 and to our quest for therapeutic targets. Find out more about this work, which was carried out with Ruairi Donnelly and Nik Cunniffe from the University of Cambridge.

Tuesday, 1 September 2020

Oxford Mathematician and Fantasy Football winner kicks off the new Public Lecture season

The Premier League football season starts on 12 September, and that means so does the Fantasy Premier League. So how are you going to play it this time? Need some tips? Joshua Bull from Oxford Mathematics won last season's competition out of nearly 8 million entrants. He kicks off the new Oxford Mathematics Public Lecture season by telling you how.

Fantasy Football is played by millions of people worldwide, and there are countless strategies that you can choose to try to beat your friends and win the game. But what’s the best way to play? Should you be patient and try to grind out a win, or are you better off taking some risks and going for glory? Should you pick players in brilliant form, or players with a great run of fixtures coming up? And what is this Fantasy Football thing anyway?

As with many of life’s deep questions, maths can help us shed some light on the answers. We’ll explore some classic mathematical problems which help us understand the world of Fantasy Football. We’ll apply some of the modelling techniques that mathematicians use in their research to the problem of finding better Fantasy Football management strategies. And - if we’re lucky - we’ll answer the big question: Can maths tell us how to win at Fantasy Football?

Joshua Bull is a Postdoctoral Research Associate in the Mathematical Institute in Oxford and the winner of the 2019-2020 Premier League Fantasy Football competition.

Watch live (no need to register):
https://twitter.com/OxUniMaths
https://www.facebook.com/OxfordMathematics/
https://livestream.com/oxuni/bull
Oxford Mathematics YouTube Channel

The Oxford Mathematics Public Lectures are generously supported by XTX Markets.

Wednesday, 12 August 2020

Richard Wade and Erik Panzer awarded Royal Society University Research Fellowships

Oxford Mathematicians Richard Wade and Erik Panzer have been awarded Royal Society University Research Fellowships for 2020. The Research Fellowship scheme was established to identify outstanding early career scientists who have the potential to become leaders in their chosen fields and provide them with the opportunity to build an independent research career.

Ric's main research area is geometric group theory, particularly the study of free groups and their automorphisms. He's interested in invariants of groups coming from topology (like cohomology) and rigidity problems. He also looks at trees and their deformation spaces.

Erik's research interests cover the mathematics of perturbative quantum (field) theory, in particular Feynman integrals, deformation quantization and resummation.


Tuesday, 11 August 2020

Oxford Mathematician Josh Bull wins Fantasy Football Premier League (out of 8 million entrants)

You are an Ipswich Town fan, so you need some fantasy in your life (for those of you who are not football fans: they are not very good just now). Oxford Mathematician Josh Bull is an Ipswich fan. So he entered the Fantasy Football Premier League along with 8 million others, some of whom might even have been mathematicians.

Result? He won. 

His secret? Well, yes, he is a mathematician, but his real secret was not to choose any players from Ipswich's local rivals, Norwich. It worked. Norwich came bottom of the real Premier League.

Watch out soon for Josh's Oxford Mathematics Public Lecture on the best strategies for Fantasy Football success.

Friday, 7 August 2020

James Maynard elected to Academia Europaea

Oxford Mathematician James Maynard has been elected to Academia Europaea. He joins 13 other Oxford Mathematicians in the Academy, which boasts 4000 members and 70 Nobel laureates. The Academy seeks the advancement and propagation of excellence in scholarship in the humanities, law, the economic, social, and political sciences, mathematics, medicine, and all branches of natural and technological sciences anywhere in the world, for the public benefit and for the advancement of the education of the public of all ages in the aforesaid subjects in Europe.

Still only 33, James Maynard is one of the brightest stars in world mathematics at the moment, having made dramatic advances in analytic number theory in recent years. A recent interview in Quanta Magazine delves into James's work and his thinking.

Tuesday, 4 August 2020

Bryan Birch awarded the Royal Society's Sylvester Medal for 2020

Oxford Mathematician Bryan Birch has been awarded the Royal Society's Sylvester Medal for 2020 for his work in driving the theory of elliptic curves through the Birch–Swinnerton-Dyer conjecture and the theory of Heegner points. The Birch–Swinnerton-Dyer conjecture is one of the Clay Mathematics Institute Millennium Problems.

The Sylvester Medal is awarded annually to an outstanding researcher in the field of mathematics. The award was created in memory of the mathematician James Joseph Sylvester FRS, who was Savilian Professor of Geometry at the University of Oxford in the 1880s. It was first awarded in 1901. The medal is bronze and is accompanied by a gift of £2,000.

Bryan Birch was educated at Trinity College, Cambridge, where as a doctoral student he proved Birch's theorem, one of the results to come out of the Hardy–Littlewood circle method; it shows that odd-degree rational forms in sufficiently many variables must have non-trivial zeroes.

He then worked with Peter Swinnerton-Dyer on computations relating to the Hasse–Weil L-functions of elliptic curves. They formulated their conjecture relating the rank of an elliptic curve to the order of vanishing of its L-function at s = 1; it has influenced the development of number theory since the mid-1960s. They later introduced modular symbols.

In later work he contributed to algebraic K-theory (the Birch–Tate conjecture). He then formulated ideas on the role of Heegner points (he had been one of those reconsidering Kurt Heegner's original work on the class number one problem, which had not initially gained acceptance), and put together the context in which the Gross–Zagier theorem was proved. He was elected a Fellow of the Royal Society in 1972, and was awarded the Senior Whitehead Prize in 1993 and the De Morgan Medal in 2007. In 2012 he became a Fellow of the American Mathematical Society.
