Past Industrial and Interdisciplinary Workshops

6 December 2019
10:00
Steve Walker
Abstract

This challenge relates to problems (of a mathematical nature) in generating optimal solutions for natural flood management. Natural flood management involves large numbers of small-scale interventions across a much larger catchment, exploiting natural features in place of, for example, large civil engineering construction works. There is an optimisation problem related to the catchment hydrology, and present methods use several unsatisfactory simplifications and assumptions that we would like to improve on.

  • Industrial and Interdisciplinary Workshops
29 November 2019
10:00
Brian Macey
Abstract

Background

The RON test is an engine test that is used to measure the research octane number (RON) of a gasoline. RON is a parameter set in fuel specifications and is an indicator of a fuel's tendency to partially explode (knock) during burning rather than burn smoothly.

The efficiency of a gasoline engine is limited by the RON value of the fuel that it is using. As the world moves towards lower carbon, predicting the RON of a fuel will become more important.

Typical market gasolines are blended from several hundred hydrocarbon components plus alcohols and ethers. Each component has a RON value and therefore, if the composition is known, the RON can be calculated. Unfortunately, components can have antagonistic or complementary effects on each other, and this needs to be taken into account in the calculation.
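
As a toy illustration of that point, the sketch below compares an ideal linear volumetric blend with one that adds hypothetical pairwise interaction terms. The component RON values and the interaction matrix are invented for illustration and are not based on any real fuel data.

```python
# Hypothetical sketch of a blend model with pairwise interaction terms.
# Component RON values, fractions and the interaction matrix are illustrative
# placeholders, not real fuel data.
import numpy as np

component_ron = np.array([120.0, 93.0, 61.0])   # assumed neat RON of 3 components
fractions = np.array([0.3, 0.5, 0.2])           # volume fractions, sum to 1

# Linear (ideal) volumetric blend: valid only if components do not interact.
ron_linear = fractions @ component_ron

# Pairwise interaction corrections: b[i, j] captures antagonistic (<0) or
# complementary (>0) behaviour between components i and j.
b = np.array([
    [ 0.0,  4.0, -2.0],
    [ 4.0,  0.0,  1.0],
    [-2.0,  1.0,  0.0],
])
ron_interaction = ron_linear + fractions @ b @ fractions

print(f"linear blend RON: {ron_linear:.1f}, with interactions: {ron_interaction:.1f}")
```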

Several models have been produced over the years (the RON test has been around for over 60 years) but the accuracy of the models is variable. The existing models are empirically based rather than taking into account the causal links between fuel component properties and RON performance.

Opportunity

BP has developed intellectual property regarding the causal links, and we need to know whether these can be used to build a functional based model. There is also an opportunity to build a better empirically based model using data on individual fuel components (previous models have grouped similar components to lessen the computing effort).

  • Industrial and Interdisciplinary Workshops
15 November 2019
10:00
Michael Hirsch
Abstract

Optical super-resolution microscopy enables the observation of individual bio-molecules. The arrangement and dynamic behaviour of such molecules are studied to gain insights into cellular processes, which in turn lead to various applications such as cancer treatments. STFC's Central Laser Facility provides (among other things) public access to super-resolution microscopy techniques via research grants. The access includes sample preparation, imaging facilities and data analysis support. Data analysis includes single-molecule tracking algorithms that produce molecule traces or tracks from time series of molecule observations. While current algorithms are gradually getting away from "connecting the dots" and using probabilistic methods, they often fail to quantify the uncertainties in the results. We have developed a method that samples a probability distribution of tracking solutions using the Metropolis-Hastings algorithm. Such a method can produce likely alternative solutions together with uncertainties in the results. While the method works well for smaller data sets, it is still inefficient for the amount of data that is commonly collected with microscopes. Given the observations of the molecules, tracking solutions are discrete, which gives the proposal distribution of the sampler a peculiar form. In order for the sampler to work efficiently, the proposal density needs to be well designed. We will discuss the properties of tracking solutions and the problems of proposal function design from the point of view of discrete mathematics, specifically in terms of graphs. Can mathematical theory help to design an efficient proposal function?
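
To make the sampling idea concrete, here is a minimal sketch (not the Central Laser Facility code) of Metropolis-Hastings sampling over discrete tracking solutions for two consecutive frames, using a simple link-swap proposal and a Gaussian step model. The synthetic detections, diffusion scale and proposal are all illustrative assumptions.

```python
# Minimal sketch of Metropolis-Hastings over discrete tracking solutions.
# Detections, the diffusion scale and the swap proposal are assumptions.
import numpy as np

rng = np.random.default_rng(0)
frame_a = rng.uniform(0, 10, size=(6, 2))              # molecule positions at time t
frame_b = frame_a + rng.normal(0, 0.5, size=(6, 2))    # positions at time t+1
sigma = 0.5                                            # assumed diffusion step scale

def log_likelihood(assignment):
    # A tracking solution links frame_a[i] -> frame_b[assignment[i]];
    # displacements are scored under an isotropic Gaussian step model.
    d = frame_b[assignment] - frame_a
    return -np.sum(d ** 2) / (2 * sigma ** 2)

assignment = np.arange(len(frame_a))                   # start from the identity matching
samples = []
for _ in range(5000):
    proposal = assignment.copy()
    i, j = rng.choice(len(proposal), size=2, replace=False)
    proposal[[i, j]] = proposal[[j, i]]                # swap two links: a symmetric proposal
    if np.log(rng.random()) < log_likelihood(proposal) - log_likelihood(assignment):
        assignment = proposal                          # Metropolis accept/reject
    samples.append(assignment.copy())

# Posterior marginal: how often each candidate link appears across the samples.
link_freq = np.mean([s == np.arange(len(s)) for s in samples], axis=0)
print("per-molecule probability of the identity link:", np.round(link_freq, 2))
```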

  • Industrial and Interdisciplinary Workshops
8 November 2019
10:00
Abstract

We will present three problems that we are interested in:

Forecast of volatility, both at the instrument and portfolio level, by combining a model-based approach with data-driven research.
We will deal with additional complications that arise in the case of instruments that are highly correlated and/or have low volumes and open interest.
Test whether the volatility forecast improves metrics or can be used to derive alpha in our trading book.
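
As a rough illustration of combining a model-based estimate with a data-driven correction, the sketch below layers a ridge-regression correction on top of an exponentially weighted moving-average variance. The synthetic returns, the extra feature and the choice of regressor are assumptions, not the actual trading models.

```python
# Illustrative sketch: model-based volatility (EWMA) plus a data-driven correction.
# Synthetic data and the ridge regressor are assumptions for illustration only.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n = 1000
returns = rng.normal(0, 0.01, n)
volume = rng.lognormal(0, 0.3, n)            # stand-in exogenous feature

# Model-based piece: exponentially weighted moving variance (RiskMetrics-style).
lam = 0.94
ewma_var = np.zeros(n)
ewma_var[0] = returns[0] ** 2
for t in range(1, n):
    ewma_var[t] = lam * ewma_var[t - 1] + (1 - lam) * returns[t - 1] ** 2

# Data-driven piece: regress realised squared returns on the EWMA forecast plus
# extra features, to learn a correction the pure model misses.
X = np.column_stack([ewma_var, volume])
y = returns ** 2
model = Ridge(alpha=1.0).fit(X[:-1], y[1:])  # one-step-ahead target
vol_forecast = np.sqrt(np.maximum(model.predict(X), 0.0))
print("last combined volatility forecast:", vol_forecast[-1])
```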

Price prediction using physical oil grades data
Hypothesis:
Physical markets are most reflective of true fundamentals. Derivative markets can deviate from fundamentals (and hence from physical markets) over short time horizons but eventually converge back. These dislocations represent potential trading opportunities.
The problem:
Can we use the rich data from physical market prices to predict price changes in the derivative markets?
A solution would explore lead/lag relationships amongst a dataset of highly correlated features, as well as feature interdependencies and non-linearities.
The prediction could be in the form of a price target for the derivative (‘fair value’), a simple direction without magnitude, or a probabilistic range of outcomes.
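
A toy version of the lead/lag exploration might look like the following, where a synthetic physical price series leads a derivative series and a simple lagged-correlation scan recovers the lag. The data and method are illustrative only.

```python
# Toy lead/lag scan between a physical-grade series and a derivative series.
# Synthetic data and the plain cross-correlation approach are assumptions.
import numpy as np

rng = np.random.default_rng(2)
n, true_lag = 500, 3
physical = np.cumsum(rng.normal(0, 1, n))           # synthetic physical price
derivative = np.roll(physical, true_lag) + rng.normal(0, 0.5, n)  # lags physical by 3 steps

phys_ret = np.diff(physical)
deriv_ret = np.diff(derivative)

def lagged_corr(x, y, lag):
    # Correlation between x at time t and y at time t + lag.
    if lag == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

lags = range(0, 10)
corrs = [lagged_corr(phys_ret, deriv_ret, k) for k in lags]
best = max(lags, key=lambda k: corrs[k])
print("estimated lead of physical over derivative (steps):", best)
```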

Modelling oil balances by satellite data
The flow of oil around the world, as it is extracted, refined, transported and consumed, forms a very large dynamic network. At both regular and irregular intervals, we can make noisy measurements of the amount of oil at certain points in the network.
In addition, we have general macro-economic information about the supply and demand of oil in certain regions.
Combining this with information about the connections between nodes in the network, i.e. typical transfer rates, one can build a general model for how oil flows through the network.
We would like to build a probabilistic model on the network, representing our belief about the amount of oil stored at each of our nodes, which we refer to as balances.
We want to focus on particular parts of the network where our beliefs can be augmented by satellite data, i.e. on a sub-network containing the nodes to which satellite measurements can be applied.
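
One possible building block, sketched below with invented numbers, is a scalar Kalman-style update of a Gaussian belief about the balance at a single node, combining a flow-model prediction with a noisy satellite measurement. The flow model, values and noise levels are illustrative assumptions only.

```python
# Hypothetical sketch: Bayesian (Kalman-style) update of a Gaussian belief about
# the oil balance at one node, given a noisy satellite measurement.
import numpy as np

# Prior belief about the balance (million barrels) at one node, from the flow model.
mean_prior, var_prior = 50.0, 9.0

# Propagation over one time step: inflow minus outflow, with extra variance
# reflecting uncertainty in the flows themselves.
inflow, outflow, flow_var = 4.0, 3.5, 1.0
mean_pred = mean_prior + inflow - outflow
var_pred = var_prior + flow_var

# Noisy satellite measurement of the same balance (e.g. from tank-top imagery).
z, meas_var = 53.0, 4.0

# Standard Bayesian update for a scalar Gaussian state.
gain = var_pred / (var_pred + meas_var)
mean_post = mean_pred + gain * (z - mean_pred)
var_post = (1 - gain) * var_pred

print(f"posterior balance: {mean_post:.1f} +/- {np.sqrt(var_post):.1f}")
```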

  • Industrial and Interdisciplinary Workshops
25 October 2019
10:00
Wayne Arter
Abstract

The challenge is to produce a reduced order model which predicts the maximum temperature rise of a thermally conducting object subjected to a power deposition profile supplied by an external code. The target conducting object is basically cuboidal but with one or more shaped faces, and may have complex internal cooling structures; the deposition profile may be time-dependent and exhibit hot spots and sharp-edged shadows, among other features. An additional feature is the importance of radiation, which makes the problem nonlinear, and investigation of control strategies is also of interest. Overall there appears to be a sequence of problems with degrees of difficulty sufficient to tax the most gifted student, starting with a line profile on a cuboid (quasi-2D) with a linearised radiation term and moving towards increased difficulty.
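
As a sketch of the easiest case in that sequence, the code below steps a 1D slab with a prescribed deposition profile, conduction and a linearised radiation loss term, and reports the peak temperature rise. All material properties and loading numbers are illustrative assumptions, not values from the actual application.

```python
# Minimal 1D sketch: conduction + sharp-edged deposition + linearised radiation,
# stepped explicitly in time to find the peak temperature rise. All numbers assumed.
import numpy as np

L, nx = 0.1, 101                          # slab length [m], grid points
dx = L / (nx - 1)
x = np.linspace(0, L, nx)

k, rho, cp = 100.0, 8000.0, 500.0         # conductivity, density, specific heat (assumed)
alpha = k / (rho * cp)                    # thermal diffusivity

T_env = 300.0
T = np.full(nx, T_env)                    # initial temperature [K]

# Sharp-edged deposition profile: a crude stand-in for a hot spot with shadows.
q = np.where((x > 0.04) & (x < 0.06), 5e6, 0.0)   # volumetric power [W/m^3]

# Linearised radiation loss: h_rad * (T - T_env), with h_rad = 4*eps*sigma*T_env^3
# converted to a volumetric coefficient via an assumed surface-to-volume factor.
eps, sigma_sb, s_over_v = 0.8, 5.67e-8, 100.0
h_rad = 4 * eps * sigma_sb * T_env ** 3 * s_over_v

dt = 0.4 * dx ** 2 / alpha                # explicit stability limit with margin
for _ in range(20000):
    lap = np.zeros_like(T)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx ** 2
    source = (q[1:-1] - h_rad * (T[1:-1] - T_env)) / (rho * cp)
    T[1:-1] += dt * (alpha * lap[1:-1] + source)
    T[0], T[-1] = T_env, T_env            # fixed-temperature (cooled) ends

print("maximum temperature rise [K]:", float(T.max() - T_env))
```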

  • Industrial and Interdisciplinary Workshops
14 June 2019
10:00
Jahangir Mohammed
Abstract

Disruptive drone activity at airports requires an early warning system, and Aveillant make a radar system that can do the job. The main problem is telling the difference between birds and drones, where there may be one or two drones and tens or hundreds of birds. There is plenty of data, including time series of how the targets move, and the aim is to improve the discrimination capability of the tracker using machine learning.

Specifically, the challenge is to understand whether there can be sufficient separability between birds and drones based on different features, such as flight profiles, length of the track, their states, and their dominance/correlation in the overall discrimination. Along with conventional machine learning techniques, the challenge is to consider how different techniques, such as deep neural networks, may perform in the discrimination task.
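
For illustration only, the sketch below builds a classifier on a few per-track features (mean speed, speed variability, track length) using synthetic data. The features, data and choice of logistic regression are assumptions, not Aveillant's actual pipeline.

```python
# Illustrative bird-vs-drone classifier on simple per-track features.
# Synthetic data; features and model choice are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 400

# Per-track features: mean speed [m/s], speed variability, track length [s].
birds = np.column_stack([rng.normal(10, 3, n), rng.normal(3.0, 1.0, n), rng.normal(20, 8, n)])
drones = np.column_stack([rng.normal(12, 4, n), rng.normal(1.0, 0.5, n), rng.normal(60, 20, n)])

X = np.vstack([birds, drones])
y = np.concatenate([np.zeros(n), np.ones(n)])        # 0 = bird, 1 = drone

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
print("feature weights (speed, variability, length):", clf.coef_.round(2))
```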

  • Industrial and Interdisciplinary Workshops
31 May 2019
10:00
Mike Beeson, Matt Davidson and James Rogers
Abstract

In Formula 1 engineers strive to produce the fastest car possible for their drivers. A lap simulation provides an objective evaluation of the performance of the car and the subsequent lap time achieved. Using this information, engineers aim to test new car concepts, determine performance limitations or compromises, and identify the sensitivity of performance to car setup parameters.

The latest state-of-the-art lap simulation techniques use optimal control approaches. Optimisation methods are employed to derive the optimal control inputs of the car that achieve the fastest lap time within the constraints of the system. The resulting state trajectories define the complete behaviour of the car. Such approaches aim to create more robust, realistic and powerful simulation output compared to traditional methods.

In this talk we discuss our latest work in this area. A dynamic vehicle model is used within a free-trajectory solver based on direct optimal control methods. We discuss the reasons behind our design choices, our progress to date, and the issues we have faced during development. Further, we look at the short and long term aims of our project and how we wish to develop our mathematical methods in the future.
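
A minimal sketch in the spirit of direct optimal control, though far simpler than a full free-trajectory lap simulation, is to choose a speed profile along a fixed path that minimises segment time subject to acceleration and corner-speed limits, transcribed as a nonlinear program. The track data and limits below are illustrative assumptions.

```python
# Toy direct transcription: minimum-time speed profile along a fixed path,
# solved as a nonlinear program. Track data and limits are assumptions.
import numpy as np
from scipy.optimize import minimize

n, ds = 50, 10.0                          # 50 segments of 10 m
a_max = 8.0                               # assumed accel/braking limit [m/s^2]
v_limit = np.full(n, 80.0)                # corner speed limit along the path
v_limit[20:30] = 30.0                     # a slow corner mid-way

def lap_time(v):
    return np.sum(ds / v)                 # time = sum of segment length / speed

def accel(v):
    # Longitudinal acceleration between points: (v2^2 - v1^2) / (2 ds).
    return (v[1:] ** 2 - v[:-1] ** 2) / (2 * ds)

cons = [
    {"type": "ineq", "fun": lambda v: a_max - accel(v)},   # traction (accelerating) limit
    {"type": "ineq", "fun": lambda v: a_max + accel(v)},   # braking limit
]
bounds = [(5.0, vl) for vl in v_limit]    # stay below the local speed limit
v0 = np.full(n, 30.0)                     # feasible initial guess

res = minimize(lap_time, v0, bounds=bounds, constraints=cons, method="SLSQP")
print("optimised segment time [s]:", round(res.fun, 2))
```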

  • Industrial and Interdisciplinary Workshops
