Past Industrial and Interdisciplinary Workshops

4 March 2016
10:00
Mike Newman
Abstract

On the railway network, for example, there is a large base of installed equipment with a useful life of many years.  This equipment has condition monitoring that can flag a fault when a measured parameter goes outside the permitted range.  If existing measurements could be used to predict when this will occur, preventative maintenance could be targeted more effectively and faults reduced.  As an example, we will consider the current supplied to a points motor as a function of time in each operational cycle.
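
As a deliberately simplified illustration of the kind of prediction sought, the sketch below fits a linear trend to a per-cycle quantity (such as the peak points-motor current) and extrapolates it to estimate how many operational cycles remain before the permitted limit is crossed. The data, the threshold and the linear-degradation assumption are all illustrative, not details of any deployed system.

    # Minimal sketch: predict when a monitored parameter will cross its
    # permitted limit by extrapolating a linear trend. Data, threshold and
    # model choice are illustrative assumptions only.
    import numpy as np

    def cycles_until_fault(values, threshold):
        """Fit a linear trend to per-cycle measurements (e.g. peak motor
        current) and estimate the cycles remaining before the permitted
        limit is crossed. Returns None if the trend is flat or improving."""
        cycles = np.arange(len(values))
        slope, intercept = np.polyfit(cycles, values, 1)  # least-squares line
        if slope <= 0:
            return None  # no degradation trend detected
        crossing = (threshold - intercept) / slope        # cycle index at limit
        return max(crossing - (len(values) - 1), 0.0)

    # Synthetic example: peak current drifting upward with noise.
    rng = np.random.default_rng(0)
    current = 4.0 + 0.002 * np.arange(500) + rng.normal(0, 0.05, 500)
    print(cycles_until_fault(current, threshold=5.5))  # approx. cycles to alarm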

26 February 2016
10:00
Abstract
Ionic liquids are salts, composed solely of positive and negative ions, which are liquid under ambient conditions. Despite an increasing range of successful applications, there remain fundamental challenges in understanding the intermolecular forces and propagation of fields in ionic liquids. 
I am an experimental scientist, and in my laboratory we study thin films of liquids. The aim is to discover their molecular and surface interactions and their fluid properties in confinement. In this talk I will describe the experiments and show some results which have led to a better understanding of ionic liquids. I will then show some measurements for which we currently have no explanation!
29 January 2016
10:00
Marco Brambilla
Abstract

“Market-Basket (MB) and Household (HH) data provide a fertile substrate for the inference of association between marketing activity (e.g. prices, promotions, advertising) and customer behaviour (e.g. customers driven to a store, specific product purchases, joint product purchases). The main aspect of MB and HH data that makes them suitable for this type of inference is the large number of variables of interest they contain, at a granularity that is fit for purpose (e.g. which items are bought together, at what frequency items are bought by a specific household).

A large number of methods are available to researchers and practitioners for inferring meaningful networks of associations between variables of interest (e.g. Bayesian networks, association rules). Inferred associations arise from applying statistical inference to the data. In order to use statistical association (correlation) to support an inference of causal association (“which is driving which”), an explicit theory of causality is needed.
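
As a toy illustration of the association-rule approach mentioned above, the sketch below computes support, confidence and lift for item pairs in a handful of invented baskets; production MB analyses would use dedicated tooling at far larger scale.

    # Minimal sketch of association-rule statistics (support, confidence,
    # lift) on toy market-basket data; transactions are invented.
    from itertools import combinations
    from collections import Counter

    baskets = [
        {"bread", "butter", "milk"},
        {"bread", "butter"},
        {"milk", "tea"},
        {"bread", "milk"},
        {"butter", "milk"},
    ]
    n = len(baskets)
    item_counts = Counter(item for b in baskets for item in b)
    pair_counts = Counter(frozenset(p) for b in baskets
                          for p in combinations(sorted(b), 2))

    for pair, count in pair_counts.items():
        a, b = sorted(pair)                       # report the rule a -> b
        support = count / n                       # P(a and b)
        confidence = count / item_counts[a]       # P(b | a)
        lift = confidence / (item_counts[b] / n)  # P(b | a) / P(b)
        print(f"{a} -> {b}: support={support:.2f} "
              f"confidence={confidence:.2f} lift={lift:.2f}")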

Such a theory of causality can be used to design experiments and analyse the resultant data; in such a context certain statistical associations can be interpreted as evidence of causal associations.

For observational data (as opposed to experimental data), the link between statistical and causal associations is less straightforward, and it requires a theory of causality formal enough to support an appropriate calculus (e.g. the do-calculus) of counterfactuals and networks of causation.
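
The distinction can be made concrete with a small simulation, sketched below under invented probabilities: a hidden confounder drives both a marketing action and a purchase, so the observational association P(y | x) overstates the interventional quantity P(y | do(x)) that an experiment, or a causal calculus applied to the network, would target.

    # Minimal simulation of why observational and interventional
    # associations differ under confounding. A hidden variable z drives
    # both the action x and the purchase y; all probabilities are invented.
    import random

    random.seed(1)

    def sample(do_x=None):
        z = random.random() < 0.5                      # confounder
        if do_x is None:
            x = random.random() < (0.8 if z else 0.2)  # observational regime
        else:
            x = do_x                                   # intervention: do(x)
        y = random.random() < 0.3 + 0.1 * x + 0.4 * z  # purchase probability
        return x, y

    obs = [sample() for _ in range(100_000)]
    p_obs = sum(y for x, y in obs if x) / sum(1 for x, y in obs if x)
    intv = [sample(do_x=True) for _ in range(100_000)]
    p_do = sum(y for _, y in intv) / len(intv)
    print(f"P(y=1 | x=1)     = {p_obs:.3f}")  # inflated by confounding
    print(f"P(y=1 | do(x=1)) = {p_do:.3f}")   # causal effect of x alone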

My talk will focus on retail-analytics problems that may motivate an interest in exploring the potential benefits and challenges of causal calculi.”

4 December 2015
10:00
Abstract

Multidimensional single molecule microscopy (MSMM) generates image time series of biomolecules in a cellular environment that have been tagged with fluorescent labels. Initial analysis steps for such images consist of image registration of the multiple channels, feature detection and single particle tracking. Further analysis may involve estimating diffusion rates, measuring separations between molecules that are not optically resolved, and more. The analysis must be carried out under poor signal-to-noise ratios, high feature densities and other adverse conditions. In pushing the boundary of what is measurable, we face, among others, the following challenges: firstly, the correct assessment of the uncertainties and of the significance of the results; secondly, the fast and reliable identification of those features and tracks that fulfil the assumptions of the models used. Simpler models impose more rigid preconditions and therefore limit the usable data, while more complex models are theoretically and, above all, computationally challenging.
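
As a minimal illustration of one of the later analysis steps, the sketch below estimates a two-dimensional diffusion coefficient from a single synthetic track via the mean squared displacement (MSD); the frame interval, diffusion constant and localisation noise are invented, and the linear MSD fit assumes free diffusion.

    # Minimal sketch: estimate a 2-D diffusion coefficient from a
    # single-particle track via the mean squared displacement.
    import numpy as np

    def msd(track, max_lag):
        """Time-averaged MSD of an (N, 2) position array, lags 1..max_lag."""
        return np.array([np.mean(np.sum((track[lag:] - track[:-lag]) ** 2,
                                        axis=1))
                         for lag in range(1, max_lag + 1)])

    rng = np.random.default_rng(0)
    D_true, dt, n_steps = 0.25, 0.03, 2000      # um^2/s, s, frames (invented)
    steps = rng.normal(0, np.sqrt(2 * D_true * dt), size=(n_steps, 2))
    track = np.cumsum(steps, axis=0)
    track += rng.normal(0, 0.02, track.shape)   # localisation error

    lags = np.arange(1, 11) * dt
    slope, intercept = np.polyfit(lags, msd(track, 10), 1)
    print(f"D_est = {slope / 4:.3f} um^2/s (true {D_true})")  # MSD = 4 D t + c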

20 November 2015
10:00
Graeme Clark
Abstract

Lein’s confocal systems make accurate and precise measurements in many different applications. Where the object under test introduces variability and/or optical aberrations into the optical signal, the accuracy and precision may deteriorate. This technical challenge looks for mathematical solutions to improve the accuracy and precision of measurements made in such circumstances.

The presentation will outline the confocal principle, show “perfect” signals, give details of how we analyse such signals, then move on to less perfect signals and the effects on measurement accuracy and precision.
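
As a hedged illustration of the kind of signal analysis involved (not Lein's actual method), the sketch below locates the peak of a synthetic confocal axial response with sub-sample precision using a three-point parabolic fit; with a noisy or aberrated signal, the scatter of this estimate is one way the loss of precision shows up.

    # Minimal sketch: locate the peak of a confocal axial response with
    # sub-sample precision via a parabolic fit around the maximum. The
    # Gaussian response shape and noise level are illustrative assumptions.
    import numpy as np

    def subsample_peak(z, signal):
        """Fit a parabola to the three samples around the maximum and
        return the interpolated peak position."""
        i = int(np.argmax(signal))
        i = min(max(i, 1), len(signal) - 2)   # keep a 3-point neighbourhood
        y0, y1, y2 = signal[i - 1], signal[i], signal[i + 1]
        offset = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)  # parabola vertex
        return z[i] + offset * (z[1] - z[0])

    rng = np.random.default_rng(0)
    z = np.linspace(-5, 5, 201)                    # axial scan positions (um)
    signal = np.exp(-(z - 0.37) ** 2 / 2.0)        # "perfect" response
    noisy = signal + rng.normal(0, 0.02, z.size)   # degraded signal
    print(subsample_peak(z, noisy))                # estimated surface position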

13 November 2015
10:00
Abstract

Parallelizing the time domain in numerical simulations is non-intuitive, but it has been proven possible using various algorithms such as parareal, PFASST and RIDC. Temporal parallelization adds an entirely new dimension along which to parallelize and significantly enhances the use of supercomputing resources. Exploiting this technique is a major step towards exascale computation.

Starting with relatively simple problems, the parareal algorithm (Lions et al., A 'parareal' in time discretization of PDE's, 2001) has been successfully applied to various complex simulations in the last few years (Samaddar et al., Parallelization in time of numerical simulations of fully-developed plasma turbulence using the parareal algorithm, 2010). The algorithm is a predictor-corrector technique: a cheap coarse solver sweeps serially across the time slices, and an accurate fine solver, applied to all slices in parallel, supplies the corrections.
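
A minimal serial sketch of this predictor-corrector structure, for the toy problem dy/dt = -y rather than anything like a plasma code, is given below; in a real implementation the fine solves within each iteration are distributed across processors.

    # Minimal parareal sketch for dy/dt = -y on [0, T]: a cheap coarse
    # solver predicts, an accurate fine solver corrects, iterated K times.
    # Serial here for clarity; the fine sweeps are the parallelisable part.
    import numpy as np

    f = lambda y: -y                      # right-hand side
    T, N, K = 5.0, 20, 5                  # horizon, time slices, iterations
    dT = T / N

    def coarse(y, dt):                    # one explicit Euler step (predictor)
        return y + dt * f(y)

    def fine(y, dt, substeps=100):        # many small Euler steps (corrector)
        h = dt / substeps
        for _ in range(substeps):
            y = y + h * f(y)
        return y

    U = np.empty(N + 1); U[0] = 1.0
    for n in range(N):                    # initial serial coarse sweep
        U[n + 1] = coarse(U[n], dT)

    for k in range(K):                    # parareal iterations
        F = [fine(U[n], dT) for n in range(N)]      # parallelisable fine solves
        G_old = [coarse(U[n], dT) for n in range(N)]
        for n in range(N):                # sequential correction sweep
            U[n + 1] = coarse(U[n], dT) + F[n] - G_old[n]

    print(U[-1], np.exp(-T))              # parareal result vs exact solution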

Numerical studies of the edge of magnetically confined fusion plasma are an extremely challenging task. The physics in this regime is made particularly complex by the presence of neutrals and by the interaction of the plasma with the wall. These simulations are extremely computationally intensive, but they are key to rapidly achieving thermonuclear breakeven on ITER-like machines.

The SOLPS code package (Schneider et al., Plasma Edge Physics with B2-Eirene, 2006) is widely used in the fusion community and has been used to design the ITER divertor. A reduction of the wall-clock time of this code has been a long-standing goal, and recent studies have shown that a computational speed-up greater than 10x is possible for SOLPS (Samaddar et al., Greater than 10x acceleration of fusion plasma edge simulations using the parareal algorithm, 2014), which is highly significant for a code of this complexity.

In this project, the aim is to explore a variety of cases of relevance to ITER, involving more complex physics, in order to study the feasibility of the algorithm. Since the success of the parareal algorithm relies heavily on choosing the optimal coarse solver as a predictor, the project will involve studying various options for this purpose. The tasks will also include scaling studies to optimize the use of computing resources and yield the maximum possible computational gain.

6 November 2015
10:00
Chuck Brunner
Abstract

Blenders and food processors have been around for years.  However, the fluid and particle dynamics within the multiphase flow of the processing chamber, as well as the influence of variables such as vessel geometry, blade geometry, speed and surface properties, are not well understood.  SharkNinja would like Oxford University’s help in developing a model that can be used to gain insight into the fluid dynamics within the food processing chamber, with the goal of developing a system that delivers better food processing performance and predicts loading on food processing elements, enabling data-driven product design.

Many vacuum cleaners on sale claim “no loss of suction”, which is defined as having only a very small reduction in peak air power output over the life of the unit under normal operating conditions.  This is commonly achieved by combining a high-efficiency cyclonic separator with a filter which the user washes at regular intervals (typically every 3 months).  It has been observed that some vacuum cleaners show an increase in peak air-watts output after a small amount of dust is deposited on the filter.  This effect is beneficial since it prolongs the time between filter washes.  SharkNinja are currently working on validating their theory as to why this occurs, and would like Oxford University’s help in developing a model that can be used to better understand the effect and to provide insight towards optimizing future designs.

Although a drip coffee maker is a very simple system from a construction standpoint, creating one that can produce a range of coffee sizes, from a single cup to a multi-cup carafe, presents unique problems.  Challenges within this system result from varying pressure heads on the inlet side, the accurate measurement of relatively low flow rates, the fluid motive force generated by boilers, and the head above the boiler on the outlet side.  Getting all of these parameters right to deliver the proper strength, temperature and volume of coffee requires an in-depth understanding of the fluid dynamics involved in the system.  An ideal outcome from this work would be an adaptive model that enables a fluid system model to be created from building blocks.  This system model would include component models for tubing, boilers, flow meters, filters, pumps, check valves and the like.
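
As a hedged sketch of what such a building-block model might look like (component laws and coefficients invented for illustration), the example below composes simple pressure-drop models in series and solves for the flow rate that a given head can drive.

    # Minimal sketch of a "building block" fluid-system model: each
    # component reports its pressure drop as a function of flow rate, a
    # series circuit sums them, and we solve for the flow a given head
    # can drive. All laws and coefficients are illustrative assumptions.

    def tube(k):            # laminar tubing: dP = k * Q
        return lambda q: k * q

    def restriction(c):     # turbulent restriction: dP = c * Q**2
        return lambda q: c * q * q

    def series(*components):
        return lambda q: sum(dp(q) for dp in components)

    def solve_flow(circuit, head, q_hi=10.0, tol=1e-9):
        """Bisection: find q with circuit(q) == head (circuit monotone)."""
        lo, hi = 0.0, q_hi
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if circuit(mid) < head:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # Hypothetical single-cup path: inlet tube -> meter -> boiler -> outlet.
    circuit = series(tube(2.0), restriction(0.8), tube(5.0), restriction(1.5))
    print(solve_flow(circuit, head=3.0))  # flow rate (arbitrary units)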
