Tuesday, 23 February 2016 
Calabi-Yau manifolds have become a topic of study in both mathematics and physics, dissolving the boundaries between the two subjects.
A manifold is a type of geometrical space where each small region looks like normal Euclidean space. For example, an ant on the surface of the Earth sees its world as flat, rather than the curved surface of the sphere. Calabi-Yau manifolds are complex manifolds, that is, they can be disassembled into patches which look like flat complex space. What makes them so special is that these patches can only be joined together by the complex analogue of a rotation.
Proving a conjecture of Eugenio Calabi, Shing-Tung Yau showed that Calabi-Yau manifolds have a property of great interest to physics. Einstein's equations show that spacetime curves according to the distribution of energy and momentum. But what if space is entirely empty? By Yau's theorem, not only is flat space a solution: so are Calabi-Yau manifolds. For this reason, Calabi-Yau spaces are candidates for the shape of the extra spatial dimensions in string theory.
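The link with Einstein's equations can be made precise in a line. The following is a standard sketch (the notation is conventional, not taken from the article): in empty space the equations reduce to Ricci-flatness, and Yau's theorem supplies curved solutions.

```latex
% Einstein's equations in empty space (T_{\mu\nu} = 0):
\[
  R_{\mu\nu} - \tfrac{1}{2} R \, g_{\mu\nu} = 0
  \qquad\Longrightarrow\qquad
  R_{\mu\nu} = 0 ,
\]
% since taking the trace gives R = 0, so the vacuum equations reduce
% to Ricci-flatness. Yau's theorem guarantees that a compact K\"ahler
% manifold $M$ with vanishing first Chern class, $c_1(M) = 0$, carries
% a Ricci-flat K\"ahler metric: a non-flat solution of the vacuum
% equations.
```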
Find out more from Oxford Mathematician Dr Andreas Braun in this latest instalment of our Oxford Mathematics Alphabet.

Monday, 22 February 2016 
How are people, infrastructure and economic activity organised and interrelated? It is an intractable problem, with countless ever-changing factors of history, geography, economy and culture playing their part.
But a paper by Oxford Mathematician Hyejin Youn and colleagues suggests “a mathematical function common to all cities.”
Think of the city as an ecosystem, with types of businesses as the species interacting in that system. Ecosystems in the natural world often share common patterns in their distributions of species. That got the researchers thinking: maybe the same consistency arises in the city too, only instead of a food web it is people, money and businesses that require one another. We usually think of cities as unique (London is very different from Moscow), but it turns out that what governs the distribution of their resources stays the same across the board.
The team analysed more than 32 million establishments in U.S. metro regions. An establishment, the unit of analysis of their study, is “a single physical location where business is conducted”. When the team measured the relative sizes of business types (e.g. agriculture, finance and manufacturing) in each city and compared these distributions across cities, a universal law emerged: despite widely different mixes of business types and widely different city sizes, the shape of the distributions was the same everywhere. Cities have their own underlying dynamics, no matter where they are, how old they are or who is in charge.
This underlying pattern allowed the researchers to build a stochastic model. As a city grows, the total number of establishments is proportional to its population size (more people, more businesses). When an establishment is created, it differentiates from all existing types with a probability that determines how diversified a city is for its size. This probability turns out to be inversely proportional to city size: the more businesses there are, the harder it is to differentiate a new one from those that already exist. Further analysis shows that this process displays an open-ended, never-ending, albeit slowing, diversification of businesses in a statistically predictable way, constituting a human ecosystem.
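The growth process just described can be sketched in a few lines. This is a minimal simulation under stated assumptions: the constant `c` and the rule that a non-innovating establishment joins an existing type in proportion to that type's size (preferential attachment) are illustrative choices, not details taken from the paper.

```python
import random

def grow_city(n_establishments, c=5.0, seed=0):
    """Simulate business-type diversification as establishments are added.

    Each new establishment founds a brand-new type with probability
    min(1, c / n), where n is the current number of establishments
    (so innovation becomes rarer as the city grows); otherwise it
    joins an existing type chosen in proportion to that type's size.
    """
    rng = random.Random(seed)
    type_sizes = [1]  # the first establishment founds the first type
    for n in range(1, n_establishments):
        if rng.random() < min(1.0, c / n):
            type_sizes.append(1)  # a new business type appears
        else:
            # preferential attachment: pick an existing type by size
            i = rng.choices(range(len(type_sizes)), weights=type_sizes)[0]
            type_sizes[i] += 1
    return type_sizes

small = grow_city(1_000)
large = grow_city(100_000)
# Diversity keeps growing, but ever more slowly, with city size.
print(len(small), len(large))
```

With an innovation probability falling like 1/n, the expected number of types grows only logarithmically with the number of establishments, matching the "open-ended but slowing" diversification described above.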
For a fuller explanation of the work also see articles in Forbes and Next Cities.

Friday, 19 February 2016 
Many of us know the feeling of standing in front of a subway map in a strange city, baffled by the multicoloured web staring back at us and seemingly unable to plot a route from point A to point B. Now, a team of physicists and mathematicians has attempted to quantify this confusion and find out whether there is a point at which navigating a route through a complex urban transport system exceeds our cognitive limits.
After analysing the world’s 15 largest metropolitan transport networks, the researchers estimated that the information limit for planning a trip is around 8 bits. (A ‘bit’ is a binary digit, the most basic unit of information.)
Additionally, much as the ‘Dunbar number’ estimates a limit to the size of an individual’s friendship circle, this cognitive limit for transportation suggests that, to be easily readable, maps should contain no more than 250 connection points.
Using journeys with exactly two connections as their basis (that is, visiting four stations in total), the researchers found that navigating transport networks in major cities – including London – can come perilously close to exceeding humans’ cognitive powers.
And when further interchanges or other modes of transport – such as buses or trams – are added to the mix, the complexity of networks can rise well above the 8-bit threshold. The researchers demonstrated this using the multimodal transportation networks from New York City, Tokyo, and Paris.
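To get a feel for the scale of that threshold: 8 bits is the information needed to single out one option from about 2^8 = 256 equally likely alternatives, which is also roughly where the 250-connection-point guideline comes from. The sketch below is a simplified illustration, not the paper's exact information measure, and the route counts are invented.

```python
import math

def trip_information_bits(choices_per_step):
    """Total information (in bits) needed to pick one route, assuming
    each decision point offers the given number of equally likely
    options. Choosing among k alternatives costs log2(k) bits, and
    independent choices add up.
    """
    return sum(math.log2(k) for k in choices_per_step)

# A hypothetical two-connection trip: choose one of 8 lines at the
# origin, one of 8 at the first interchange, one of 4 at the second.
bits = trip_information_bits([8, 8, 4])
print(bits)  # 8.0 - exactly at the estimated cognitive threshold
```

Adding a further interchange, or a bus or tram layer with its own choices, simply appends more factors to the list, which is why multimodal journeys so easily overshoot the limit.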
Mason Porter, Professor of Nonlinear and Complex Systems in the Mathematical Institute at the University of Oxford, said: ‘Human cognitive capacity is limited, and cities and their transportation networks have grown to the point where they have reached a level of complexity that is beyond human processing capability to navigate around them. In particular, the search for a simplest path becomes inefficient when multiple modes of transport are involved and when a transportation system has too many interconnections.’
Professor Porter added: ‘There are so many distractions on these transport maps that it becomes like a game of Where’s Waldo? [Where’s Wally?]
‘Put simply, the maps we currently have need to be rethought and redesigned in many cases. Journeyplanner apps of course help, but the maps themselves need to be redesigned.
‘We hope that our paper will encourage more experimental investigations on cognitive limits in navigation in cities.’
The research – a collaboration between the University of Oxford, Institut de Physique Théorique at CEA-Saclay, and Centre d’Analyse et de Mathématique Sociales at EHESS Paris – is published in the journal Science Advances.

Monday, 15 February 2016 
In celebration of Nigel Hitchin's 70th birthday and in honour of his contributions to mathematics, a group of his former students and his colleague Frances Kirwan, in partnership with the Clay Mathematics Institute, are organising a conference in September 2016. It will begin in Aarhus with a workshop on differential geometry and quantization and end in Madrid with a workshop on Higgs bundles and generalized geometry, with a meeting in Oxford in between aimed at a general audience of geometers.
The three components of the conference are:
Hitchin70: Differential Geometry and Quantization, QGM, Aarhus, 5–8 Sept. 2016
Hitchin70: Mathematical Institute, Oxford, 9–11 Sept. 2016
Hitchin70: Celebrating 30 years of Higgs bundles and 15 years of generalized geometry, Residencia la Cristalera, Miraflores de la Sierra (Madrid), 12–16 Sept. 2016
More information, including registration, can be found at http://projects.au.dk/hitchin70/
The confirmed speakers at the Oxford component of Hitchin70, which is supported by the London Mathematical Society, are:
Sasha Beilinson
Fedor Bogomolov
Philip Candelas
Bill Goldman
Klaus Hulek
Maxim Kontsevich
Marta Mazzocco
Shigefumi Mori
Shing-Tung Yau
A poster can be downloaded here
Nigel Hitchin is one of the most influential figures in the field of differential and algebraic geometry and its relations with the equations of mathematical physics. He has made fundamental contributions, opening entire new areas of research in fields as varied as spin geometry, instanton and monopole equations, twistor theory, symplectic geometry of moduli spaces, integrable systems, Higgs bundles, Einstein metrics, hyperkähler geometry, Frobenius manifolds, Painlevé equations, special Lagrangian geometry and mirror symmetry, generalized geometry and beyond. He is the Savilian Professor of Geometry at the University of Oxford and was previously the Rouse Ball Professor of Mathematics at Cambridge University. He is a Fellow of the Royal Society and has served as President of the London Mathematical Society.

Tuesday, 2 February 2016 
Semantics is the study of meaning as expressed through language, and it provides indirect access to an underlying level of conceptual structure. However, the degree to which this conceptual structure is universal, rather than shaped by cultural history or by the environment a speech community inhabits, is still controversial. Meaning is notoriously difficult to measure, let alone parameterise, for quantitative comparative studies.
Using cross-linguistic dictionaries for languages carefully selected as an unbiased sample reflecting the diversity of human languages, Oxford Mathematician Hyejin Youn and colleagues provide an empirical measure of semantic relatedness between concepts. Their analysis uncovers a universal structure underlying the sampled vocabulary across language groups, independent of the languages' phylogenetic (evolutionary) relations, their speakers’ culture and their geographic environment.
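One way such a measure can be built from dictionaries is to link two concepts whenever a single word covers both of them (polysemy). The sketch below is schematic: the data structure and the toy entries are invented for illustration, and the study's actual sampling and weighting are far more careful.

```python
from collections import defaultdict
from itertools import combinations

def semantic_network(dictionaries):
    """Build a weighted concept graph from cross-linguistic dictionaries.

    `dictionaries` maps each language to {word: set of concepts that
    word can mean}. Two concepts become more strongly related each
    time some word, in some language, covers both of them.
    """
    weight = defaultdict(int)
    for entries in dictionaries.values():
        for concepts in entries.values():
            # every pair of senses of a polysemous word adds one link
            for a, b in combinations(sorted(concepts), 2):
                weight[(a, b)] += 1
    return dict(weight)

# A tiny made-up sample: in several languages one word spans SUN/DAY.
toy = {
    "lang1": {"sol": {"SUN", "DAY"}},
    "lang2": {"hi": {"SUN", "DAY"}, "tsuki": {"MOON", "MONTH"}},
}
print(semantic_network(toy))
```

Aggregating such links over an unbiased sample of languages is what lets the resulting network be compared across language families, cultures and environments.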

Monday, 1 February 2016 
How a complex dynamic network such as the human brain gives rise to consciousness has yet to be established by science. A popular view among neuroscientists is that, through a variety of learning paradigms, the brain builds relationships, and in the context of these relationships a brain state acquires meaning in the form of the relational content of the corresponding experience.
Indeed, whilst it is very difficult to explain why a colour looks the way it does, it is easy to see that consciousness is awash with relationships and associations. For example, red has a stronger relationship to orange than to green; relationships between points in our field of view give rise to geometry; some smells are similar whilst others are very different; and there is a multitude of other relationships involving many senses, such as those between the sound of someone’s name, their visual appearance and the timbre of their voice. Moreover, relationships in various forms are ubiquitous in mathematical structures and have given rise to whole areas of investigation such as graph theory.
It’s perhaps surprising then that mathematicians haven’t rushed to provide a mathematical theory for how the brain defines the relational content of consciousness, and consequently there are few mathematical theories of consciousness. However, in November 2015, Oxford Mathematician Dr Jonathan Mason had a research article published in the journal Complexity that provides an approach to this challenge.
The theory stems from information theory and introduces a version of Shannon entropy that includes relationships as parameters. The resulting entropy value (referred to as float entropy) is a measure of the expected amount of information required to specify the state of the system beyond what is already known from the relationship parameters. It turns out that, for non-random systems, certain choices of the relationship parameters are isolated from the rest, in the sense that they give much lower float entropy values; hence the system defines relationships. One outcome of the work suggests that many relationships are determined in a mutually dependent way, such as the relationships between colours and those that give the geometry of the field of view.
Another mathematical theory of consciousness is Integrated Information Theory. Its initial development prioritised quantifying the level of consciousness of a system, and consequently it is ill-suited to determining relational content. With further development, however, the two theories may turn out to be somewhat complementary.

Wednesday, 27 January 2016 
Understanding how droplets impact surfaces is important for a huge range of different applications. These range from spray painting, inkjet printing, fertiliser application and rainfall to crime-scene blood-splatter analysis and hygiene situations (men’s urinals being a familiar example). High-speed movies show that when droplets hit surfaces fast enough, they often splash, emitting a corona of new, tiny droplets on impact.
However, Rob Style (Oxford Mathematics) and Alfonso Castrejon-Pita (Oxford Engineering) noticed that if you drop a droplet on a soft surface, the splashing seems to disappear. They were curious to know if they could use this to develop an easy way to make splash-proof surfaces. So, over the summer, they joined up with Oxford Mathematics undergraduate Chris Howland to carry out experiments in the Mathematical Institute's new experimental facilities, and in Engineering. Chris performed a range of experiments which involved filming impacting drops using a high-speed camera at up to 60,000 frames per second. These conclusively showed that the softer a surface, the better it is at preventing splashing.
A full description of the work can be found in Physical Review Letters. If you are interested in learning more, Chris’s prize-winning poster (the best vacation bursary poster in the Division this year) is currently on display in the Mathematical Institute in Oxford.

Tuesday, 26 January 2016 
Matjaz Leonardis on Group Theory, Henrique Rui Neves Aguiar on why the Antarctic is so big, Yiliu Wang on Probability, Joe Pollard on Quantum Chaos, Cameron Whitehead on D-modules and Chan Bae on Embedding Graphs demonstrated the range of work going on at undergraduate level. Chan Bae won the GCHQ prize for the best presentation. Matjaz Leonardis was also shortlisted.
This year's event is organised and hosted by the University of Greenwich together with the Institute of Mathematics and its Applications (IMA).

Thursday, 17 December 2015 
Christmas is the time of year when you really need solutions. Presents to buy, who to invite to parties, who to talk to at parties. And of course the biggest dilemma is for Santa himself, traversing the globe in the early hours. So much to do, so little time.
So what you need is something to make Christmas a little easier. And what better than mathematics? Because mathematics can answer all your questions, from the best party configurations to the optimum number of presents to mapping Santa's quickest route.
Or can it? Perhaps there are some things that even mathematics cannot answer.
In the Oxford Mathematics Annual Christmas Lecture Marcus du Sautoy explores the mathematics of the festive season.
The Oxford Mathematics Christmas Lecture is generously sponsored by G-Research – researching investment ideas to predict financial markets.

Wednesday, 16 December 2015 
Everyone knows that Moore’s law says that computers get cheaper at an exponential rate. What is not as well known is that many other technologies that have nothing to do with computers obey a similar law. Costs for DNA sequencing, some forms of renewable energy, chemical processes and consumer goods have also dropped at an exponential rate, even if the rates vary and are typically slower than for computers. Doyne Farmer and Francois Lafond, from Oxford University's Mathematical Institute and the Institute for New Economic Thinking at the Oxford Martin School, have developed a mathematical model explaining how the uncertainties in applying Moore’s law grow over time, showing both how to forecast future costs and how to predict the accuracy of these forecasts across technologies. They analyse historical data on the costs of many different technologies and test their methods by pretending to be in the past and predicting the present. This makes it clear how to combine and compare forecasts for different technologies, using results from forecasting technology A to build confidence in the ability to forecast technology B.
Their analysis makes it clear that technologies differ greatly in the rate at which they improve. Consider, for example, technologies for generating electricity. Once adjusted for inflation, the price of fossil fuel technologies such as coal, oil and natural gas is close to what it was a century ago. In contrast, solar photovoltaic modules have dropped in price at a rate of about 10% per year and are a factor of several thousand cheaper than when they were introduced in the mid-1950s. The analysis of many technologies indicates that while these trends are not ironclad, they do tend to be persistent. Forecasts show that solar photovoltaic modules are very likely to continue to get cheaper, and that if other factors such as distribution costs come down too, their electricity is likely to be cheaper than conventional sources within two decades. That said, there is a small risk that solar photovoltaic modules will be as expensive as, or more expensive than, they are now. The fact that solar energy is very likely to get cheaper is one of the few pieces of good news concerning climate change.
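The flavour of such forecasts can be sketched by modelling log-cost as a random walk with drift: the central forecast falls exponentially while the uncertainty band widens with the forecast horizon. This is a hedged simplification, not Farmer and Lafond's actual method, and the price history below is invented.

```python
import math
import statistics

def forecast_cost(costs, horizon):
    """Forecast a technology's cost by modelling log-cost as a random
    walk with drift. Returns (point, lower, upper): the central
    forecast and an approximate 95% band whose width grows with the
    horizon, reflecting how forecast uncertainty accumulates in time.
    """
    logs = [math.log(c) for c in costs]
    diffs = [b - a for a, b in zip(logs, logs[1:])]
    mu = statistics.mean(diffs)        # average per-period drift
    sigma = statistics.stdev(diffs)    # period-to-period volatility
    centre = logs[-1] + mu * horizon
    spread = 1.96 * sigma * math.sqrt(horizon)  # grows like sqrt(t)
    return math.exp(centre), math.exp(centre - spread), math.exp(centre + spread)

# Hypothetical module prices falling roughly 10% per year, with noise:
history = [100 * 0.9 ** t * (1 + 0.02 * (-1) ** t) for t in range(10)]
point, lo, hi = forecast_cost(history, horizon=10)
# The point forecast continues the decline; lo and hi bracket it.
```

Because the band widens only like the square root of the horizon while the drift compounds linearly, a persistent downward trend can remain "very likely" far into the future even though the exact future price is quite uncertain.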
Doyne and Francois's paper can be found here.
