Past Industrial and Interdisciplinary Workshops

14 June 2019
10:00
Jahangir Mohammed
Abstract

Disruptive drone activity at airports requires an early warning system, and Aveillant makes a radar system that can do the job. The main problem is telling the difference between birds and drones, where there may be one or two drones and tens or hundreds of birds. There is plenty of data, including time series of how the targets move, and the aim is to improve the discrimination capability of the tracker using machine learning.

Specifically, the challenge is to understand whether there can be sufficient separability between birds and drones based on different features, such as flight profiles, track length, and target states, and to assess the dominance/correlation of these features in the overall discrimination. Alongside conventional machine learning techniques, the challenge is to consider how methods such as deep neural networks may perform in the discrimination task.
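As a minimal illustration of the kind of feature-based separability in question, the sketch below (with invented features and synthetic tracks, not Aveillant's data or method) summarises a radar track as a small feature vector and contrasts a straight drone transit with a circling bird:

```python
import numpy as np

def track_features(xy, dt=0.5):
    """Summarise a 2-D track (N x 2 positions) as a feature vector.

    Features (illustrative only): mean speed, speed variance,
    mean absolute turn rate, and track duration.
    """
    v = np.diff(xy, axis=0) / dt                  # velocity between plots
    speed = np.linalg.norm(v, axis=1)
    heading = np.arctan2(v[:, 1], v[:, 0])
    turn = np.abs(np.diff(np.unwrap(heading))) / dt
    return np.array([speed.mean(), speed.var(), turn.mean(), dt * len(xy)])

# Synthetic examples: a drone on a steady straight transit vs. a wheeling bird.
t = np.arange(60)
drone = np.c_[15.0 * t, np.zeros_like(t, dtype=float)]    # straight line
bird = np.c_[60 * np.cos(0.2 * t), 60 * np.sin(0.2 * t)]  # tight circling

f_drone, f_bird = track_features(drone), track_features(bird)
print(f_drone[2], f_bird[2])   # the drone's mean turn rate is far lower
```

Features like these could then feed any classifier, from logistic regression to a deep network, which is exactly the comparison the challenge asks for.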

  • Industrial and Interdisciplinary Workshops
31 May 2019
10:00
Mike Beeson, Matt Davidson and James Rogers
Abstract

In Formula 1 engineers strive to produce the fastest car possible for their drivers. A lap simulation provides an objective evaluation of the performance of the car and the subsequent lap time achieved. Using this information, engineers aim to test new car concepts, determine performance limitations or compromises, and identify the sensitivity of performance to car setup parameters.

The latest state-of-the-art lap simulation techniques use optimal control approaches. Optimisation methods are employed to derive the optimal control inputs of the car that achieve the fastest lap time within the constraints of the system. The resulting state trajectories define the complete behaviour of the car. Such approaches aim to produce more robust, realistic and powerful simulation output than traditional methods.

In this talk we discuss our latest work in this area. A dynamic vehicle model is used within a free-trajectory solver based on direct optimal control methods. We discuss the reasons behind our design choices, our progress to date, and the issues we have faced during development. Further, we look at the short and long term aims of our project and how we wish to develop our mathematical methods in the future.
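The direct optimal control idea can be illustrated on a toy problem (a sketch only, far simpler than a real vehicle model): transcribe a minimum-time run into a finite-dimensional nonlinear programme, with the final time and the piecewise-constant control inputs as decision variables, and hand it to a generic NLP solver:

```python
import numpy as np
from scipy.optimize import minimize

# Direct transcription of a toy minimum-time problem: a point-mass "car"
# must cover 100 m from rest with |acceleration| <= 10 m/s^2.
# Decision variables: final time T and N piecewise-constant controls a_k;
# the dynamics are integrated with explicit Euler inside the constraint.
N, D, A_MAX = 20, 100.0, 10.0

def final_position(z):
    T, a = z[0], z[1:]
    h, x, v = T / N, 0.0, 0.0
    for ak in a:
        x += v * h          # position update
        v += ak * h         # velocity update
    return x

res = minimize(
    fun=lambda z: z[0],                               # minimise the time itself
    x0=np.concatenate([[10.0], np.full(N, 5.0)]),     # rough initial guess
    bounds=[(0.1, 30.0)] + [(-A_MAX, A_MAX)] * N,
    constraints=[{"type": "eq", "fun": lambda z: final_position(z) - D}],
    method="SLSQP",
)
print(f"minimum time = {res.x[0]:.2f} s")  # analytic optimum sqrt(2D/A_MAX) = 4.47 s
```

A free-trajectory lap solver works on the same principle, but with a dynamic vehicle model, track constraints, and a purpose-built transcription in place of this Euler rollout.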

10 May 2019
10:00
Charlie Fletcher and Dan Haley
Abstract

Atom Probe Tomography is a powerful 3D mass spectrometry technique. By pulsing the sample apex with an electric field, surface atoms are ionised and collected by a detector. A 3D image of estimated initial ion positions is constructed via an image reconstruction protocol. Current protocols assume that ion trajectories follow a stereographic projection. However, this assumption of a hemispherical sample apex fails to account for varying material ionisation rates and introduces severe distortions into atomic distributions for complex material systems.

We aim to develop continuum models and use these to derive a time-dependent mapping describing how initial ion positions on the sample surface correspond to final impact positions on the detector. When correctly calibrated against experiment, such a mapping could be used to perform reconstruction.

Currently we track the sample surface using a level set method, while the electric field is solved via the boundary element method (BEM) or a FEM-BEM coupling. These field calculations must remain accurate close to the boundary. Calibrating unknown evaporation parameters against experiment requires an ensemble of models per experiment, so we are also looking to maximise model efficiency via BEM compression methods, e.g. the fast multipole BEM. Efficiently constructing and reliably interpolating the non-bijective trajectory mapping, while accounting for ion trajectory overlap and instabilities (at sample surface corners), also presents intriguing problems.
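The level-set idea itself can be sketched in a few lines. The snippet below (illustrative only: a constant recession speed stands in for the solved electric field and evaporation physics) evolves a receding circular "apex" as the zero contour of a signed-distance function, using a first-order upwind scheme:

```python
import numpy as np

# Level-set sketch of a receding surface: the interface is the zero contour
# of phi, and a positive normal recession speed F moves it inward via
#   phi_t = F * |grad phi|,  discretised with first-order upwinding.
n, L = 201, 2.0
x = np.linspace(-L, L, n)
dx = x[1] - x[0]
X, Y = np.meshgrid(x, x)
phi = np.hypot(X, Y) - 1.0        # signed distance to a unit-radius circle

F, dt, steps = 0.5, 0.4 * dx, 100
for _ in range(steps):
    # One-sided differences; the min/max choice upwinds for recession (F > 0).
    dxm = (phi - np.roll(phi, 1, 1)) / dx
    dxp = (np.roll(phi, -1, 1) - phi) / dx
    dym = (phi - np.roll(phi, 1, 0)) / dx
    dyp = (np.roll(phi, -1, 0) - phi) / dx
    grad = np.sqrt(np.minimum(dxm, 0)**2 + np.maximum(dxp, 0)**2
                   + np.minimum(dym, 0)**2 + np.maximum(dyp, 0)**2)
    phi += dt * F * grad          # phi grows, so the phi < 0 region shrinks

r = np.abs(phi[n // 2]).argmin()  # zero crossing along the y = 0 line
print(f"radius after recession = {abs(x[r]):.2f}")  # expect 1 - F*dt*steps = 0.60
```

In the real problem the scalar F would be replaced by a spatially varying evaporation rate driven by the BEM field solution, which is where the accuracy-near-the-boundary requirement comes from.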

This project is in collaboration with Cameca, the leading manufacturer of commercial atom probe instruments. If successful in minimising distortions, such a technique could become valuable within the semiconductor industry.

25 January 2019
10:00
Stephane Chretien
Abstract

Clustering is a very important task in data analytics and is usually addressed using (i) statistical tools based on maximum likelihood estimators for mixture models, (ii) techniques based on network models such as the stochastic block model, or (iii) relaxations of the K-means approach based on semi-definite programming (or even simpler spectral approaches). Statistical approaches of type (i) often suffer from not being solvable with sufficient guarantees, because of the non-convexity of the underlying cost function to optimise. The other two approaches, (ii) and (iii), are amenable to convex programming but do not usually scale to large datasets. In the big-data setting, one usually needs to resort to data subsampling, a preprocessing stage also known as "coreset selection". We will present this last approach and the problem of selecting a coreset for the special cases of K-means and spectral-type relaxations.
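One simple instance of coreset selection is the "lightweight coreset" construction of Bachem, Lucic and Krause (a sketch of that general idea, not necessarily the construction presented in the talk): sample points with a mixture of uniform and squared-distance-to-the-mean probabilities, then reweight so that the coreset's weighted K-means cost is an unbiased estimate of the full cost:

```python
import numpy as np

rng = np.random.default_rng(0)

def lightweight_coreset(X, m, rng):
    """Sample m points with importance weights (lightweight-coreset style)."""
    d2 = ((X - X.mean(0)) ** 2).sum(1)
    q = 0.5 / len(X) + 0.5 * d2 / d2.sum()   # sampling distribution
    idx = rng.choice(len(X), size=m, p=q)
    return X[idx], 1.0 / (m * q[idx])        # coreset points and weights

def kmeans_cost(X, centers, w=None):
    """(Weighted) sum of squared distances to the nearest center."""
    d2 = ((X[:, None, :] - centers[None]) ** 2).sum(-1).min(1)
    return np.sum(d2 if w is None else w * d2)

# Three well-separated Gaussian blobs, 3000 points in total.
X = np.concatenate([rng.normal(c, 0.3, size=(1000, 2))
                    for c in ([0, 0], [5, 5], [0, 5])])
C, w = lightweight_coreset(X, m=150, rng=rng)

centers = np.array([[0, 0], [5, 5], [0, 5]], float)
full, approx = kmeans_cost(X, centers), kmeans_cost(C, centers, w)
print(full, approx)   # the 150-point weighted coreset tracks the full cost
```

Any K-means-style solver (including an SDP or spectral relaxation) can then be run on the 150 weighted points instead of the 3000 originals, which is the scaling argument made above.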
