Wednesday, 19 October 2016 
What can fashionable ideas, blind faith, or pure fantasy have to do with the scientific quest to understand the universe? Surely, scientists are immune to trends, dogmatic beliefs, or flights of fancy? In fact, Roger Penrose argues that researchers working at the extreme frontiers of mathematics and physics are just as susceptible to these forces as anyone else.
In this lecture, based on his new book, Roger argues that fashion, faith, and fantasy, while sometimes productive and even essential, may be leading today's researchers astray in three of science's most important areas: string theory, quantum mechanics, and cosmology.

Tuesday, 18 October 2016 
Many elastic structures have two possible equilibrium states: for example, umbrellas that become inverted in a sudden gust of wind, nanoelectromechanical switches, origami patterns, and even the hopper popper, which jumps after being turned inside-out. These systems typically move from one state to the other via a rapid ‘snap-through’. Snap-through allows plants to gradually store elastic energy before releasing it suddenly to generate rapid motions, as in the Venus flytrap. Similarly, the beak of the hummingbird snaps through to catch insects mid-flight, while technological applications are increasingly exploiting snap-through instabilities.
In all of these scenarios, it is the ability to repeatedly generate fast motions that gives snap-through its utility. However, estimates of the speed of snap-through suggest that it should occur more quickly than is usually observed. In their research published in Nature Physics, Oxford Mathematicians Michael Gomez, Dominic Vella and Derek Moulton study the dynamics of snap-through in detail, showing that, even without dissipation, the dynamics slow down close to the snap-through transition. This is reminiscent of the slowing down observed in critical phenomena (for example, the time taken for oscillations in the climate to die down is thought to grow larger as a ‘tipping point’ is approached). As well as providing a hand-held demonstration of such phenomena, the work provides a new tool for tuning dynamic responses in applications of elastic bistability: for example, it shows that to obtain faster snap-through in applications such as robotics, the system needs to be pushed well beyond the snap-through transition.
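The slowing down near the transition can be illustrated with a toy normal-form sketch (an illustration of the general idea only, not the authors' elastica model). Close to a snap-through (saddle-node) transition the undamped dynamics reduce to x'' = mu + x^2, where mu > 0 measures how far past the transition the system is pushed; starting at rest at the former equilibrium x = 0, the escape time grows without bound as mu shrinks:

```python
# Toy saddle-node normal form for snap-through: x'' = mu + x^2.
# Starting from rest at x = 0 (the former equilibrium), measure how long
# the system takes to 'snap' out to x_end. Smaller mu -> slower snap.

def passage_time(mu, x_end=2.0, dt=1e-4):
    """Time for x(t) to reach x_end under x'' = mu + x^2, from x = v = 0."""
    x, v, t = 0.0, 0.0, 0.0
    while x < x_end:
        v += (mu + x * x) * dt   # Euler-Cromer (symplectic) step
        x += v * dt
        t += dt
    return t

# Pushing the system further past the transition (larger mu) gives a much
# faster snap -- the design lever mentioned above for robotics.
for mu in [1.0, 0.1, 0.01, 0.001]:
    print(f"mu = {mu:6.3f}   snap time = {passage_time(mu):6.2f}")
```

The printed times grow as mu decreases, mirroring the critical slowing down described above.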

Tuesday, 4 October 2016 
Across the physical and biological sciences, mathematical models are formulated to capture experimental observations. Often, multiple models are developed to explore alternate hypotheses. It then becomes necessary to choose between different models.
Oxford Mathematician Heather Harrington and colleagues from the United States have explored the problem of model selection by regarding mathematical models as geometric objects in space. In general, model selection is a hard problem, but recasting it in geometric terms allows the authors to give a new methodology for selecting the best explanation of observed phenomena, thereby bringing recent groundbreaking developments in nonlinear algebra to the study of biological and other complex systems.
Specifically, their paper, published today in the Journal of the Royal Society Interface, considers polynomial models (e.g., mass-action chemical reaction networks at steady state) and describes a framework for their analysis based on optimisation using numerical algebraic geometry. The authors use probability-one polynomial homotopy continuation methods to compute all critical points of the objective function, then filter these to recover the global optima. This approach exploits the geometric structures relating models and data, and its utility is demonstrated on examples from cell signalling, synthetic biology, and epidemiology.
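The core idea, find every critical point of a polynomial objective and then keep the global optimum, can be sketched in a one-parameter toy (the paper uses probability-one homotopy continuation for multivariate systems; here the gradient is a univariate polynomial, so `numpy.roots` stands in for the continuation solver, and the data values are made up for illustration):

```python
# Toy version of 'find all critical points, keep the global optimum'.
# Model: y = theta^2 * x (a mass-action-flavoured polynomial model),
# fitted to hypothetical data by least squares.
import numpy as np

x = np.array([1.0, 2.0, 3.0])   # illustrative data, not from the paper
y = np.array([0.9, 4.1, 8.8])

def objective(theta):
    return np.sum((y - theta**2 * x) ** 2)

# dF/dtheta = 4*sum(x^2)*theta^3 - 4*sum(x*y)*theta  (a cubic polynomial),
# so ALL critical points are roots of this polynomial.
grad_coeffs = [4 * np.sum(x**2), 0.0, -4 * np.sum(x * y), 0.0]
critical = np.roots(grad_coeffs)
real_critical = critical[np.abs(critical.imag) < 1e-9].real

# Evaluate the objective at every real critical point; keep the global minimum.
best = min(real_critical, key=objective)
print("critical points:", sorted(real_critical))
print("global optimiser theta =", best, " objective =", objective(best))
```

Here there are three critical points (a local maximum at theta = 0 flanked by two symmetric global minima); filtering by objective value recovers the best fit, which is exactly the role the homotopy continuation step plays in the paper's multivariate setting.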

Tuesday, 4 October 2016 
The Clinical Sciences Centre, based at Imperial College London, has launched a new initiative to celebrate women in maths and computing. A new branch of the existing Suffrage Science scheme, it aims to encourage women into science and to help them reach senior leadership roles.
Women make up no more than four in ten undergraduates studying maths. Suffrage Science aims to make a difference. There are currently two sections, one for women in the Life Sciences, and one for those in Engineering and the Physical Sciences. Now there is a third specialism, for women in Maths and Computing. Twelve women will receive awards to celebrate their scientific achievements and ability to inspire others. Oxford Mathematician Frances Kirwan FRS is one of the first recipients of the award for Mathematics.
The awards themselves are pieces of jewellery, designed by students at the arts college Central Saint Martins, UAL, and inspired by science. One, a silver bangle, holds a secret. Engraved on the inside, and hidden beneath a layer of silver, is what many mathematicians consider the most beautiful equation in mathematics: Euler's identity, e^(iπ) + 1 = 0.

Wednesday, 21 September 2016 
A life belt, a coffee cup, a jumping ball, a beach ball. What do these objects have in common? What sets them apart? Questions like these come under the mathematical umbrella of topology. And the theory of homology enables us to explore and understand them. Find out more in the latest in our Oxford Mathematics Alphabet.

Saturday, 17 September 2016 
Roger Heath-Brown is one of Oxford's foremost mathematicians. His work in analytic number theory has been critical to advances in the subject over the past thirty years and has garnered Roger many prizes.
As he approached retirement, Roger gave this interview to Ben Green, Waynflete Professor of Mathematics in Oxford and himself a leading figure in the field of number theory. In the interview Roger reflects on his influences, his achievements and the pleasures that mathematics has given him.

Thursday, 8 September 2016 
Oxford Mathematician and Charles Simonyi Professor for the Public Understanding of Science in the University of Oxford, Marcus du Sautoy, has been named one of London's most influential mathematicians in the Evening Standard's Progress 1000 awards. The Progress 1000, in partnership with Citi, is an annual event hosted by the London Evening Standard to celebrate the people whose influence across many spheres of London life is felt most keenly by those who live in the city.
Marcus is not only a leading mathematician in his own right, but also a driving force in the promotion and popularisation of mathematics and associated sciences. He has taken mathematics and science around the world, expressing their elegance, pleasures, and occasional pains, through lectures, theatre (his X & Y, a mathematical play written with Victoria Gould, has been widely acclaimed) and, most recently, in his book "What We Cannot Know", in which he tries to identify the frontiers of our knowledge.

Sunday, 4 September 2016 
For over a hundred years, when confronted by swelling in the brain, surgeons have more often than not resorted to decompressive craniectomy, the traditional route to reducing swelling by removing a large part of the skull. However, while this might be the standard procedure, its failure rate has been worryingly high, primarily because the consequences for the rest of the brain have been poorly understood.
This is no longer just a challenge for neurosurgeons. Mathematicians are now able to study and model the impact of surgery at a cellular level and by so doing develop a more specific picture of the impact of surgery across the whole brain. In particular Oxford Mathematician Alain Goriely and colleagues Johannes Weickenmeier and Ellen Kuhl from Stanford University have looked at the issue by studying a standard physical problem: the problem of bulging in soft solids.
Bulging is due to the swelling of a material that is constrained except at an opening, as is the case when the skull is opened during surgery and the brain bulges out, potentially creating deformations in parts of the brain away from the immediate incision. To quantify possible deformations inside the brain, the team created a personalised finite element craniectomy model from high-resolution magnetic resonance imaging. Their study reveals three failure mechanisms which would suggest damage beyond the initial incision point: axonal stretch in the centre of the bulge, axonal compression at the edge of the craniectomy, and axonal shear around the opening. Strikingly, even small swelling volumes of 50 ml can induce axonal strain in excess of 30% above reported damage thresholds in patients.
Such models suggest a possible mechanism for proving and identifying long-term damage. Indeed, this theoretical study is a first step towards gaining better insight into the complex mechanisms underlying craniectomy, and opens the door for systematic personalised studies of craniectomy in patients. The next step is to combine this theoretical work with experimental and clinical work to enable surgeons to provide better-informed and more successful treatments.
A fuller explanation can be found in the following papers:
Physical Review Letters
Journal of the Mechanics and Physics of Solids
Computer Methods in Applied Mechanics and Engineering
Cornell University Library

Monday, 22 August 2016 
There is a wide class of problems in mathematics known as inverse problems. Rather than starting with a mathematical model and analysing its properties, mathematicians start with a set of properties and try to obtain mathematical models which display them. For example, in mathematical chemistry researchers try to construct chemical reaction systems that have certain predefined behaviours. From a mathematical point of view, this can be used to create simplified chemical systems that can be used as test problems for different mathematical fields. From an experimental perspective, it is useful to create chemical systems that can be used as blueprints for constructing physical networks in synthetic biology, for example in the area of DNA computing.
Oxford Mathematicians Tomislav Plesa and Radek Erban, together with their colleague Tomáš Vejchodský of the Institute of Mathematics in the Czech Academy of Sciences, have recently published a paper in the Journal of Mathematical Chemistry developing the theory behind this. They have built on previous work to create an inverse problem framework suitable for constructing a reaction system, and have used this to construct certain two- and three-dimensional systems of particular interest. Their work concentrates primarily on networks modelled by systems of kinetic equations, a class of ordinary differential equations of a certain type. Such equations may display 'exotic phenomena', corresponding to specific biological functions in the underlying biochemical reaction networks, and so the inverse problem framework needs to be able to handle such behaviours.
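One of the 'exotic phenomena' such kinetic equations can display is a sustained oscillation. A classical textbook example (not a system from the paper) of a mass-action network whose kinetic ODEs were constructed to exhibit a limit cycle is the Brusselator, x' = a - (b + 1)x + x²y, y' = bx - x²y, which oscillates whenever b > 1 + a²:

```python
# Brusselator: a classical kinetic (mass-action) ODE system with a limit
# cycle, illustrating the kind of prescribed 'exotic' behaviour an inverse
# framework must handle. Simulated with a simple explicit Euler scheme.
import numpy as np

def simulate_brusselator(a=1.0, b=3.0, x0=1.2, y0=3.1, dt=1e-3, t_end=50.0):
    n = int(t_end / dt)
    xs = np.empty(n)
    x, y = x0, y0
    for i in range(n):
        dx = a - (b + 1) * x + x * x * y
        dy = b * x - x * x * y
        x, y = x + dt * dx, y + dt * dy
        xs[i] = x
    return xs

xs = simulate_brusselator()          # b = 3 > 1 + a^2 = 2, so it oscillates
late = xs[len(xs) // 2:]             # discard the initial transient
print("late-time oscillation range:", late.min(), "to", late.max())
```

The trajectory spirals away from the unstable steady state (a, b/a) onto a persistent oscillation, so the concentration keeps swinging over a wide range even at late times, rather than settling down.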

Sunday, 21 August 2016 
Correctly predicting extinction is critical in ecology. Declare extinction too late, and you may be taking resources away from a species that could actually be saved. Declare extinction too early, and you may cause a true extinction by withdrawing resources, for example by removing protection of the species' habitat.
There is a balance to be sought, and it is clear that we are not quite there, because every year several species that were thought to be extinct are rediscovered. This may seem like good news, but it undermines confidence in the International Union for Conservation of Nature (IUCN) when it categorises a species as extinct.
Rediscovered species are dubbed Lazarus species, after the biblical character who came back to life. We have data for some Lazarus mammals, and for some mammals that are presumed extinct. What can we infer from the Lazarus mammals to help us establish which presumed-extinct mammals are actually still alive?
In a paper published in Global Change Biology, Oxford Mathematician Tamsin Lee and colleagues from Australia use information about the size of each mammal, the search effort it has received, and whether it lived in dense or sparse populations. Some traits, such as body size, may affect the chances of both extinction and rediscovery: large mammals are easier to hunt, but also easier to rediscover. Other factors, such as search effort, affect only the chances of rediscovery. How can we separate and quantify these effects?
To establish which mammals are likely to be rediscovered, and when, the researchers used a model that is commonly used in medicine. Suppose you are conducting a trial of a new medicine which may cure a terminal disease. You give this medicine to 100 subjects and note the proportion who are cured. Among those who are not cured, you note how long it takes each patient to die from the disease. You also have notes on, for example, the age of the patients, their gender, their cholesterol, and whether they smoke. From these traits you can establish which patients are most likely to be cured: perhaps young female non-smokers with low cholesterol, followed by young male non-smokers with low cholesterol, and so on. Among those who are not cured, perhaps the medicine prolonged their life, so again we need to establish which traits created the delay.
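The logic of such a mixture 'cure' model can be sketched as follows, with made-up illustrative numbers rather than the paper's fitted values. A fraction pi of missing species is truly extinct (the analogue of cured patients, who never 'relapse' into being rediscovered); the rest will eventually be rediscovered, with an assumed exponential distribution of rediscovery times. Bayes' rule then gives the probability of true extinction given how long a species has been missing:

```python
# Sketch of mixture cure-model reasoning with illustrative parameters:
# pi   = prior probability the species is truly extinct (made-up value)
# rate = rediscovery rate per year for still-extant species (made-up value)
import math

def prob_extinct_given_still_missing(t, pi=0.6, rate=0.05):
    """P(truly extinct | not rediscovered after t years), via Bayes' rule."""
    still_missing_if_alive = math.exp(-rate * t)  # survival of 'rediscoverable' group
    return pi / (pi + (1 - pi) * still_missing_if_alive)

for years in [0, 10, 40, 79]:
    p = prob_extinct_given_still_missing(years)
    print(f"missing {years:3d} years -> P(extinct) = {p:.2f}")
```

As the years without a sighting accumulate, the probability of extinction climbs towards 100%, which is exactly the behaviour described for the Saudi Gazelle below; the covariates (body size, search effort, population density) enter the real model by shifting pi and the rediscovery-time distribution.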
Applying this model to the mammal data set, the researchers quantified the effect of traits such as body size on extinction and rediscovery. They found that, indeed, large mammals, such as the Tasmanian Tiger, are more likely to go extinct, as are mammals that live in dense populations. The effect is compounded for mammals that are both large and live in dense populations, such as the Saudi Gazelle, which has a 95% chance of being extinct after being missing for 79 years. This chance will keep increasing until it reaches 100% in 2039.
Large mammals that have received a medium search effort (3 to 6 searches) are likely to be rediscovered less than 50 years after they were last seen, whereas small, rodent-sized mammals could be missing for over a hundred years and still be rediscovered. These time limits can be shortened with higher search effort, but search effort has a stronger effect on large mammals. That is, when choosing how to allocate resources, searching for a large mammal will enable us to determine the status of the species sooner than searching for a small mammal. The Saudi Gazelle illustrates this well: it is a large mammal, but it has not reached a 100% chance of extinction despite being missing for 79 years, because it has received a low search effort.
The strong effect of search effort on large mammals bodes poorly for the Tasmanian Tiger, which was last seen in 1933. There has been a huge search effort, but it has not produced any certain sightings. (The question of certain versus uncertain sightings is, as you can imagine, another huge topic in ecology.) This implies that the Tasmanian Tiger has been truly extinct since 1983. However, the Chinese River Dolphin, which has also received a high search effort, has only been missing for 9 years, so it has a 72% chance of being extinct, with this chance not reaching 100% until 2034.
Ultimately this model demonstrates how even ecology, a relatively young scientific field, is advancing by capitalising upon centuries' worth of mathematics.
