Thursday, 5 October 2017 
Dame Frances Kirwan has been elected to the Savilian Professorship at the University of Oxford. Frances will be the 20th holder of the Savilian Chair (founded in 1619), and is the first woman to be elected to any of the historic chairs in mathematics.
Frances has received many honours, including election as a Fellow of the Royal Society in 2001 (only the third female mathematician to attain this honour) and serving as President of the London Mathematical Society from 2003 to 2005 (only the second woman ever elected to the role).
Frances' specialisation is algebraic and symplectic geometry, notably moduli spaces in algebraic geometry, geometric invariant theory (GIT), and the link between GIT and moment maps in symplectic geometry.

Wednesday, 4 October 2017 
Oxford Mathematics in partnership with the Science Museum is delighted to announce its first Public Lecture in London. World-renowned mathematician Andrew Wiles will be our speaker. Andrew will be talking about his current work and will also be in conversation with mathematician and broadcaster Hannah Fry after the lecture. Attendance is free.
28th November, 6.30pm, Science Museum, London, SW7 2DD
Please email externalrelations@maths.ox.ac.uk to attend.

Tuesday, 3 October 2017 
Oxford Mathematician Per-Gunnar Martinsson has been awarded the 2017 Germund Dahlquist Prize by the Society for Industrial and Applied Mathematics. The Germund Dahlquist Prize is awarded for original contributions to fields associated with Germund Dahlquist, especially the numerical solution of differential equations and numerical methods for scientific computing.
The prize honors Martinsson for fundamental contributions to numerical analysis and scientific computing that are making a significant impact in data science applications. These include his development of linear-time algorithms for dense matrix operations related to multidimensional elliptic PDEs and integral equations, and his deep and innovative contributions to probabilistic algorithms for the rapid solution of certain classes of large-scale linear algebra problems.
Per-Gunnar is currently Professor of Numerical Analysis at the University of Oxford. Hear more from him in this Q&A.

Monday, 2 October 2017 
QBIOX – Quantitative Biology in Oxford – is a new network that brings together biomedical and physical scientists from across the University who share a commitment to making biology and medicine quantitative. A wide range of bioscience research fields are interested in the behaviour of populations of cells: how they work individually and collectively, how they interact with their environment, how they repair themselves and what happens when these mechanisms go wrong. At the cell and tissue levels, similar processes are at work in areas as diverse as developmental biology, regenerative medicine and cancer, which means that common tools can be brought to bear on them.
QBIOX’s focus is on mechanistic modelling: using maths to model biological processes and refining those models in order to answer a particular biological question. Researchers now have access to more data than ever before, and using the data effectively requires a joined-up approach. It is this challenge that has encouraged Professors Ruth Baker, Helen Byrne and Sarah Waters from the Mathematical Institute to set up QBIOX. The aim is to help researchers with the necessary depth and range of specialist knowledge to open up new collaborations, and share expertise and knowledge, in order to bring about a step-change in understanding in these areas. In regenerative medicine, for example, QBIOX has brought together a team of people from across the sciences and medical sciences in Oxford who are working on problems at the level of basic stem cell science right through to translational medicine that will have real impacts on patients.
A look at the list of QBIOX collaborators demonstrates that Oxford researchers from a wide range of backgrounds are already involved: from maths, statistics, physics, computer science and engineering, through to pathology, oncology, cardiology and infectious disease. QBIOX is encouraging any University researcher with an interest in quantitative biology to join the network. It runs a programme of activities to catalyse interactions between members. For example, QBIOX’s termly colloquia offer opportunities for academics to showcase research that is of interest to network members, and there are regular smaller meetings that look in detail at specific topics. QBIOX also has funding for researchers who would like to run small meetings to scope out the potential for using theoretical and experimental techniques to tackle new problems in the biosciences.
The QBIOX website has details of all the activities run by the network, as well as relevant events taking place across the University. If you have events you would like to feature here, just complete the contact form. You can also sign up to be a collaborator and to receive QBIOX’s termly newsletter.

Sunday, 1 October 2017 
Oxford Mathematician Dmitry Belyaev is interested in the interface between analysis and probability. Here he discusses his latest work.
"There are two areas of mathematics that clearly have nothing to do with each other: projective geometry and conformally invariant critical models of statistical physics. It turns out that the situation is not as simple as it looks and these two areas might be connected.
We start with projective geometry. Let $g:\mathbb{R}^{m+1} \to \mathbb{R}$ be a homogeneous polynomial of degree $n$ in $m + 1$ variables. Although the values of the polynomial are not well defined in homogeneous coordinates $[x_0 : x_1 : \dotsm : x_m]$, the zero locus, the set where $g([x_0 : x_1 : \dotsm : x_m]) = 0$, is well defined. The set $S = \{x \in \mathbb{RP}^m : g(x) = 0\}$ is a projective variety.
We can ask what a typical projective variety looks like. The answer to this question very much depends on the meaning of the word ‘typical’. One possibility is to define some ‘natural’ probability measure on the space of all homogeneous polynomials $g$ and treat ‘typical’ behaviour as almost sure behaviour with respect to this measure. Since the space of polynomials is too large, there is no canonical way to define the most natural uniform measure. The second-best choice is a Gaussian measure. This still does not completely determine the measure, but one Gaussian measure stands out: it is the only Gaussian measure which is the real trace of a complex Gaussian measure on the space of homogeneous polynomials on $\mathbb{CP}^m$ that is invariant with respect to the unitary group. A random polynomial of degree $n$ with respect to this measure can be written as
$$f_n(x) = f_{n;m}(x) = \sum_{|J|=n}\sqrt{\binom{n}{J}}\, a_J x^J,$$
where $J = (j_0, \dots, j_m)$ is a multi-index, $|J| = j_0 + \dotsb + j_m$, $\binom{n}{J} = \frac{n!}{j_0! \dotsb j_m!}$, and $\{a_J\}$ are i.i.d. standard Gaussian random variables. This random function is called the Kostlan ensemble or complex Fubini-Study ensemble. We can think of a ‘typical’ variety of degree $n$ as the nodal set of the Kostlan ensemble of degree $n$. We are mostly interested in the two-dimensional case $m = 2$.
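To make the ensemble concrete, here is a minimal sketch (in Python, with illustrative function names, not code from the paper) that samples a Kostlan polynomial by drawing the i.i.d. Gaussian coefficients $a_J$, and checks the defining homogeneity $f(tx) = t^n f(x)$:

```python
import itertools
import math
import random

def kostlan_sample(n, m, seed=0):
    """Draw one polynomial from the Kostlan ensemble of degree n in m+1
    homogeneous variables: f(x) = sum_{|J|=n} sqrt(C(n,J)) a_J x^J."""
    rng = random.Random(seed)
    coeffs = {}
    # Enumerate all multi-indices J = (j_0, ..., j_m) with |J| = n.
    for J in itertools.product(range(n + 1), repeat=m + 1):
        if sum(J) == n:
            multinomial = math.factorial(n)
            for j in J:
                multinomial //= math.factorial(j)  # exact at each step
            coeffs[J] = math.sqrt(multinomial) * rng.gauss(0.0, 1.0)
    return coeffs

def evaluate(coeffs, x):
    """Evaluate the homogeneous polynomial at a point x in R^{m+1}."""
    total = 0.0
    for J, c in coeffs.items():
        term = c
        for xi, j in zip(x, J):
            term *= xi ** j
        total += term
    return total

# A degree-4 Kostlan polynomial on RP^2 (m = 2): its zero set is a
# 'typical' projective curve of degree 4.
f = kostlan_sample(4, 2, seed=1)
x = (0.3, -0.7, 1.0)
# Homogeneity f(2x) = 2^4 f(x) makes the zero set well defined on RP^2.
print(abs(evaluate(f, [2 * c for c in x]) - 2 ** 4 * evaluate(f, x)) < 1e-9)
```

The nodal set itself could then be visualised by evaluating the sample on a grid of directions and tracing sign changes.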
It has been shown by V. Beffara and D. Gayet that there is a Russo-Seymour-Welsh type estimate for the Bargmann-Fock random function, which is the scaling limit of the Kostlan ensemble. This means that if one fixes a nice domain with two marked boundary arcs, then the probability that there is a nodal line connecting the two arcs inside the domain is bounded from below by a constant which depends on the shape of the domain, but not on its scale. Estimates of this type first appeared in the study of critical percolation models and are a strong indication that the corresponding curves have conformally invariant scaling limits.
In recent work with S. Muirhead and I. Wigman we have extended this result to the Kostlan ensemble on the sphere. Namely, we have obtained a lower bound on the probability of crossing a domain which is uniform in the degree of the polynomial and in the scale of the domain. This suggests that large components of a ‘typical’ projective curve have a scaling limit which is conformally invariant and should be described by the Schramm-Loewner Evolution."
For a fuller explanation of Dmitry and colleagues' work please click here.

Monday, 25 September 2017 
As part of our series of research articles focusing on the rigour and intricacies of mathematics and its problems, Oxford Mathematician Andrew Dancer discusses his work on Ricci Flow.
"A sphere and an ellipsoid (rugby ball) are the same topologically, in that each can be continuously deformed into the other without tearing, but obviously they are not the same geometrically. We can see that the sphere is in some sense uniformly curved, while the curvature of the ellipsoid varies as we move around the surface.
The mathematical gadget that encodes information about curvature, lengths, angles, volumes etc. is called a metric. This concept in fact makes sense not just for surfaces but in higher dimensions as well. The curvature is now not a single function but an object called the Riemann curvature tensor.
A fundamental question in geometry is whether a given manifold has a best or nicest metric. One popular candidate is the notion of an Einstein metric. The equations expressing the Einstein condition are a complicated nonlinear system of partial differential equations, and questions about existence and uniqueness of Einstein metrics in dimensions above three are still not well understood in general.
One strategy to study the existence of Einstein metrics is via the Ricci flow. This is a nonlinear version of heat flow, whose fixed points correspond to Einstein metrics, rather as fixed points of heat flow correspond to harmonic functions (solutions of Laplace's equation). In good situations the Ricci flow may converge to an Einstein metric, but it is also possible for singularities to develop, arising from the nonlinear nature of the flow. I am particularly interested in so-called soliton solutions of the Ricci flow, corresponding to metrics that evolve only by rescaling and coordinate changes under the flow. These give a natural generalisation of the Einstein condition, and are also very important in understanding singularities of the flow via a rescaling of variables.
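In standard notation (the usual textbook formulation, not specific to the papers cited below), the Ricci flow and the gradient Ricci soliton equation read:

```latex
\frac{\partial g}{\partial t} = -2\,\operatorname{Ric}(g),
\qquad
\operatorname{Ric}(g) + \operatorname{Hess}(f) = \lambda\, g,
```

where $f$ is the soliton potential function and $\lambda$ is a constant. An Einstein metric is the special case in which $f$ is constant, so that $\operatorname{Ric}(g) = \lambda g$.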
In collaboration with Mckenzie Wang of McMaster University in Canada, I have produced new examples of solitons by assuming the existence of a large enough symmetry group to reduce the PDEs to ordinary differential equations. With my student Alejandro Betancourt de la Parra, we have even found some cases where the soliton equations may be solved explicitly, due to unexpected integrability structures in certain dimensions."
For fuller explanations of Andrew's work please click on the links below:
On Ricci solitons of cohomogeneity one
Some New Examples of Non-Kähler Ricci Solitons
A Hamiltonian approach to the cohomogeneity one Ricci soliton equations
Image above courtesy of Syafiq Johar

Wednesday, 20 September 2017 
We have an exciting series of Oxford Mathematics Public Lectures this Autumn. Summary below and full details here. All will be podcast and on Facebook Live. We also have a London Lecture by Andrew Wiles on 28 November (details will follow separately). Please email externalrelations@maths.ox.ac.uk to register for the lectures below.
Closing the Gap: the quest to understand prime numbers - Vicky Neale
18 October, 5.00-6.00pm, Lecture Theatre 1, Mathematical Institute, Oxford

Maths v Disease - Julia Gog
1 November, 5.00-6.00pm, Lecture Theatre 1, Mathematical Institute, Oxford

The Seduction of Curves: The Lines of Beauty That Connect Mathematics, Art and The Nude - Allan McRobie
13 November, 5.00-6.00pm, Lecture Theatre 1, Mathematical Institute, Oxford

Oxford Mathematics Christmas Public Lecture - Alex Bellos, title tbc
6 December, 5.00-6.00pm, Lecture Theatre 1, Mathematical Institute, Oxford

Please email externalrelations@maths.ox.ac.uk to register

Thursday, 14 September 2017 
As part of our series of research articles focusing on the rigour and intricacies of mathematics and its problems, Oxford Mathematician David Hume discusses his work on networks and expanders.
"A network is a collection of vertices (points) and edges (lines connecting two vertices). They are used to encode everything from transport infrastructure to social media interactions, and from the behaviour of subatomic particles to the structure of a group of symmetries. A common theme throughout these applications, and therefore of interest to civil engineers, advertisers, physicists, and mathematicians (amongst others), is that it is important to know how well connected a given network is. For example, is it possible that two major road closures make it impossible to drive from London to Oxford? An efficient road network should ensure that there are multiple ways to get between any two important places, but we cannot simply tarmac everything! As another example, if as an advertiser, you post adverts on a social media platform, how do you ensure that you reach as many people as possible, without paying to post to every single account?
Given a network, we say its cut size is the smallest number of vertices you need to remove, so that the remaining pieces have at most half the original number of vertices in them (in our examples: how many roads need to close before half the population are unable to drive to visit the other half, or how many people need to ignore your advert so that less than half of the users of social media will see it).
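As a concrete illustration of the definition above, here is a brute-force sketch in Python (illustrative code with made-up names, practical only for very small networks, since it tries every subset of vertices):

```python
import itertools
from collections import deque

def components(vertices, edges):
    """Connected components of a graph, found by breadth-first search."""
    adj = {v: set() for v in vertices}
    for u, v in edges:
        if u in adj and v in adj:  # ignore edges touching removed vertices
            adj[u].add(v)
            adj[v].add(u)
    seen, comps = set(), []
    for s in vertices:
        if s in seen:
            continue
        comp, queue = {s}, deque([s])
        seen.add(s)
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    comp.add(w)
                    queue.append(w)
        comps.append(comp)
    return comps

def cut_size(vertices, edges):
    """Smallest number of vertices whose removal leaves every remaining
    piece with at most half the ORIGINAL number of vertices."""
    n = len(vertices)
    for k in range(n + 1):
        for removed in itertools.combinations(vertices, k):
            rest = [v for v in vertices if v not in removed]
            if all(len(c) <= n / 2 for c in components(rest, edges)):
                return k
    return n

# A path on 5 vertices: removing the middle vertex leaves two pieces of size 2.
V = [0, 1, 2, 3, 4]
E = [(0, 1), (1, 2), (2, 3), (3, 4)]
print(cut_size(V, E))  # 1
```

A path has tiny cut size relative to its number of vertices; the point of the expander definition below is to demand the opposite behaviour.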
Let us say that a family of networks, with increasing numbers of vertices, is called an expander if the cut size of each network is proportional to the number of vertices (1) , and each vertex in a network is the end of at most a fixed number of edges. In theory this would be an optimal solution for a transport network as we can connect as many cities as we need to without needing to work out how to manage the traffic lights at a junction where 5,000 roads all converge. In practice, expanders are as incompatible with the geometry of our world as it is possible for any collection of networks to be.
(2)
Expanders, however, are still very interesting and naturally occur in diverse areas: in errorcorrecting codes in computer science; number theory; and in group theory, where my personal interest lies.
It is, in general, very difficult to construct a family of expanders, even though randomly choosing larger and larger networks in which every vertex meets exactly three edges will almost surely produce one. The first construction of a family was given by Grigory Margulis: his expanders came from networks encoding the structure of finite groups of symmetries. Other constructions have since been found, most notably a construction of Ramanujan graphs (expanders which, in a particular sense, have the largest possible ratio between their cut size and their number of vertices), and the fantastically named zig-zag product (3), which builds expanders inductively, starting from two very simple networks.
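The random model mentioned above, in which every vertex meets exactly three edges, can be sampled with the standard configuration model: give each vertex three "half-edges" and pair the half-edges up uniformly at random. This is an illustrative sketch of that sampling step only, not any of the constructions cited:

```python
import random

def random_cubic_graph(n, seed=0):
    """Sample a random 3-regular multigraph on n vertices (3n must be even)
    via the configuration model: pair up the 3n half-edges at random.
    The result may contain loops and repeated edges, which is harmless
    for the expansion statement above."""
    assert (3 * n) % 2 == 0, "need an even number of half-edges"
    rng = random.Random(seed)
    stubs = [v for v in range(n) for _ in range(3)]  # three stubs per vertex
    rng.shuffle(stubs)
    return [(stubs[i], stubs[i + 1]) for i in range(0, len(stubs), 2)]

edges = random_cubic_graph(10, seed=42)
degree = [0] * 10
for u, v in edges:
    degree[u] += 1
    degree[v] += 1
print(degree)  # every entry is 3: each vertex meets exactly three edge-ends
```

As $n$ grows, such a sample is almost surely an expander, although certifying the expansion of any particular sample is itself nontrivial.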
One question, which seems to have avoided much attention, is the following: how many different expanders are there? To answer this, we first have to deal with the rather sensitive question of what exactly we mean by 'different'. Does adding one edge change the expander? If so, then the question above is not really very interesting. A more interesting example is provided by Manor Mendel and Assaf Naor: they prove that there are two different expanders such that however you try to associate the vertices of one with the vertices of the other, you must either move vertices close together that were very far apart before, or else move vertices far apart that previously were very close. In mathematical terms, they are not coarsely equivalent: we cannot even approximately preserve how close vertices are.
In my work, I show that there is a collection of expanders (we can even insist that they are Ramanujan graphs) which is impossible to enumerate (it is uncountable), such that no pair of them are coarsely equivalent. The technique is to show that for any coarsely equivalent networks, the largest cut size of any network contained in the first with at most n vertices is proportional to the largest cut size of any network contained in the second with at most n vertices. By constructing expanders where these two values are not proportional, we rule out the possibility of such coarse equivalences between them.
The behaviour of cut sizes which is used above to rule out coarse equivalences is of much interest for networks which are not expanders. In my current work I am exploring how cut sizes behave for networks which are 'negatively curved at large scale': this is an area of particular interest in group theory, and plays a key role in the recent proofs of important conjectures in lowdimensional topology: the virtually Haken and virtually fibred conjectures. For such 'negatively curved' groups, cut sizes seem to be related to the dimension of an associated fractal 'at infinity'. With John Mackay and Romain Tessera, we have established this link for an interesting collection of such networks, and are working on developing the technology needed to generalise our results."
(1) This is not the traditional definition, but one of my results proves that a network is an expander in the sense given here if and only if it contains an expander in the traditional sense.
(2) Two networks with highlighted collections of vertices demonstrating the value of the cut size
(3) The header image of this article is the ZigZag product of a cycle of length 6 with a cycle of length 4

Monday, 11 September 2017 
Medicines are key to disease treatment but are not without risk. Some patients get untoward side effects, some get insufficient relief. The human genome project promises to revolutionise modern healthcare. However, there are 3 billion places where a human’s DNA can be different. Just where are the genes of interest in sufferers of complex chronic conditions? Which genes are implicated the most in which disease in which patients? Which genes are involved in a beneficial response to a medicine? Which genes might be predictive of drug-induced adverse events? Collaborative industrial research by Oxford Mathematics' Clive Bowman seeks to tackle these areas to enable drug discovery companies to develop appropriate treatments.
The Royal Society Industrial Fellowship research at the Oxford Centre for Industrial and Applied Mathematics (OCIAM) extends stochastic insights from communication theory into producing easy-to-interpret visualisations for biotech use. Interacting determinants of the illnesses or adverse syndromes can be displayed as heatmaps or coded networks that highlight potential targets against which chemists can rationally design drugs. All types of measured data can be used simultaneously, and dummy synthetic indicators such as pathways or other ontologies can be added for clarity. Heterogeneity is displayed automatically, allowing understanding of why some people get a severe disease (or drug response) and others a mild syndrome, as well as other variations, for example due to someone’s ethnicity.
Helped by this mathematics the hope is that the right drug can be designed for the right patient and suffering alleviated efficiently with the minimum risk for the individual. For fuller detail on Clive's work please click here.
The image above shows a drug adverse event example (please click on the image). Clockwise from top left: drug molecule (by Fvasconcellos); heat map showing patients with severe (red) or mild (blue) syndrome in multidimensional information space (courtesy of Dr O Delrieu); two aetiological sub-networks leading to the syndrome; 3D animation display of results with dummy indicator variables.

Friday, 1 September 2017 
Researchers from Oxford Mathematics and Imperial College London have provided a 'mathematical thought experiment' to inspire caution in biologists measuring heterogeneity in cell populations.
As technologies for gene sequencing and microscopy improve, biologists and biomedical researchers are increasingly able to distinguish heterogeneity in cell populations. And some of these differences in cellular behaviours can have important implications for biological functions, such as stem cells in embryonic development, or invasive malignant cells in the onset of cancer. But where will this trend of looking for heterogeneity lead? With a good enough microscope, every cell may look different. But is this meaningful?
To illustrate their point, Linus Schumacher and Oxford Mathematicians Ruth Baker and Philip Maini focused on an example of heterogeneity in migrating cell populations. They used statistics relating to delays in the correlation between individual cells' movements to examine whether it is possible to infer heterogeneities in cell behaviours. This idea originally stems from analysing the movements of birds, but has since been applied to cells too. By measuring when the movement of two cells (or birds) is most aligned, we learn if cells (or birds) move and turn simultaneously (no delay in correlations), or follow each other (delays in correlations). This is of importance to biologists interested in understanding if a subset of cells is leading metastatic invasion, for example, or the migration of cells in the developing embryo.
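The delayed-correlation statistic described above can be sketched as follows: a toy leader-follower simulation (illustrative function names and setup, not the authors' analysis code) in which cell B copies cell A's heading three time steps late, so the directional correlation peaks at delay 3:

```python
import math
import random

def velocities(track):
    """Unit direction vectors of the successive steps of a 2-D track."""
    vs = []
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        dx, dy = x1 - x0, y1 - y0
        r = math.hypot(dx, dy) or 1.0  # guard against zero-length steps
        vs.append((dx / r, dy / r))
    return vs

def delayed_correlation(track_a, track_b, delay):
    """Mean dot product of A's direction at time t with B's at t + delay."""
    va, vb = velocities(track_a), velocities(track_b)
    pairs = [(va[t], vb[t + delay]) for t in range(len(va) - delay)]
    return sum(ax * bx + ay * by for (ax, ay), (bx, by) in pairs) / len(pairs)

def walk(headings):
    """Unit-speed random walk visiting one point per heading."""
    x = y = 0.0
    pts = [(x, y)]
    for h in headings:
        x += math.cos(h)
        y += math.sin(h)
        pts.append((x, y))
    return pts

rng = random.Random(0)
headings = [rng.uniform(0, 2 * math.pi) for _ in range(200)]
A = walk(headings)
B = walk(headings[:3] + headings[:-3])  # B repeats A's heading 3 steps later

best = max(range(8), key=lambda d: delayed_correlation(A, B, d))
print(best)  # 3: the correlation peaks at the built-in follower delay
```

The paper's point is precisely that with noisier, shorter tracks such peaks can arise by chance, so an apparent leader-follower structure need not reflect genuinely heterogeneous cell behaviours.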
Using a minimal mathematical model for cell migration, Schumacher, Baker and Maini show that correlations in movement patterns are not necessarily a good indicator of heterogeneity: even a population of identical cells can appear heterogeneous, due to chance correlations and limited sample sizes. What’s more, when the authors explicitly included heterogeneity in their model to describe experimentally measured data, the model of a homogeneous cell population could describe the data just as well (albeit for different parameter values), heavily limiting what can be concluded from such measurements.
Thus, we have learnt that heterogeneity can naively be inferred from cell tracking data, but it may not be so meaningful. And the implications reach further than a particular type of data and specific statistical analysis. In an associated commentary, Paul Macklin of Indiana University illustrates a corollary of the main work: cell populations that divide with a fixed rate, or a distribution of division rates, can have the same distribution of cell cycle times (which could be measured experimentally). In this case, heterogeneity (whether it is real or not) is unimportant in understanding the observed biological phenomenon.
Lead author Linus Schumacher got the idea for this study while finishing his DPhil at the Wolfson Centre for Mathematical Biology in Oxford, and continued working on it with the support of an EPSRC Doctoral Prize award. The research appears on the cover of the August issue of Cell Systems.
