Past Industrial and Interdisciplinary Workshops

26 April 2013
10:00
to
11:15
Charles Offer
Abstract

Suppose there is a system in which certain objects move through a network. The objects are detected only when they pass through a sparse set of points in the network. For example, the objects could be vehicles moving along a road network, observed by a radar or other sensor as they pass through (or originate or terminate at) certain key points, but not observed continuously or tracked as they travel from one point to another. Alternatively, they could be data packets in a computer network. The detections record only the time at which an object passes by, and contain no information about identity that would trivially allow the movement of an individual object from one point to another to be deduced. It is desired to determine the statistics of the movement of the objects through the network: that is, if an object passes through point A at a certain time, to determine the probability density that the same object will pass through a point B at a certain later time.

The system might perhaps be represented by a graph, with a node at each point where detections are made. The detections at each node can be represented by a signal as a function of time, where the signal is a superposition of delta functions (one per detection). The statistics of the movement of objects between nodes must be deduced from the correlations between the signals at each node. The problem is complicated by the possibility that a given object might move between two nodes along several alternative routes (perhaps via other nodes or perhaps not), or might travel along the same route but with several alternative speeds.
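
As a minimal sketch of how such correlations might be estimated (the function names and toy data below are illustrative assumptions, not part of the problem statement): in the sparse regime, a histogram of all positive time differences between detections at node A and detections at node B approximates the A-to-B transition-time density, sitting on a flat background contributed by unrelated pairs.

```python
import numpy as np

def transition_time_histogram(times_a, times_b, max_lag, bin_width):
    """Histogram of time differences t_b - t_a over all detection pairs.

    For sparse, independent transits this approximates (up to a constant
    background from unrelated pairs) the pdf of the A-to-B transition time.
    """
    lags = []
    for ta in times_a:
        for tb in times_b:
            dt = tb - ta
            if 0.0 < dt <= max_lag:  # causality: keep only positive delays
                lags.append(dt)
    bins = np.arange(0.0, max_lag + bin_width, bin_width)
    counts, edges = np.histogram(lags, bins=bins)
    return counts, edges

# Toy usage: objects pass A at random times and reach B ~5 time units later.
rng = np.random.default_rng(0)
t_a = np.sort(rng.uniform(0, 1000, 50))
t_b = np.sort(t_a + rng.normal(5.0, 0.5, 50))  # true transition pdf: N(5, 0.5^2)
counts, edges = transition_time_histogram(t_a, t_b, max_lag=20.0, bin_width=0.5)
```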

What prior knowledge about the network, or constraints on the signals, are needed to make this problem solvable? Is it necessary to know the connections between the nodes, or the pdfs of the transition times between nodes, a priori, or can these be deduced? What conditions are needed on the information content of the signals? (For example, if detections are sparse on the time scale for passage through the network, the transition probabilities can be built up by considering each cascade of detections independently; if detections are dense, it will presumably be necessary to assume that objects do not move through the network independently, but instead tend to form convoys that are apparent as a pattern of detections persisting for some distance on average.) What limits are there on the noise in the signal or the amount of unwanted signal, i.e. false detections, objects that randomly fail to be detected at a particular node, or objects that are detected at one node but do not pass through any other node? Is any special action needed to enforce causality, i.e. positive time delays for transitions between nodes?

8 March 2013
09:45
to
11:00
Nick Hall-Taylor
Abstract
In vertical annular two-phase flow, large-amplitude waves ("disturbance waves") are the most significant means by which the liquid is transported by the action of the gas phase. The presentation describes certain experimental results, with the intention of defining a conceptual model suitable for possible mathematical interpretation. These large waves have been studied for over 50 years, but there has been little corresponding advance in the mathematical understanding of the phenomenon. The aim of the workshop is to discuss what analysis might be possible and how it might contribute to the understanding of the phenomena involved.
22 February 2013
10:00
to
11:37
Abstract

We wish to discuss the role of modelling in health care. Risk factor prevalences vary and change with time, and it is difficult to anticipate the resulting change in disease incidence without accurately modelling the epidemiology. When the prevalences of risk factors such as obesity, tobacco use and salt intake are studied in detail, clear patterns emerge that can be extrapolated into the future. These give rise to estimated probability distributions of the risk factors across age, sex, ethnicity, social class and other groupings, projected into the future. Microsimulation of individuals from defined populations (e.g. England 2012) can then estimate disease incidence, prevalence, death, costs and quality of life. Thus future health and other needs can be estimated, and interventions on these risk factors can be simulated to assess their population effect. Health policy can be better determined by a realistic characterisation of public health. The Foresight microsimulation modelling of the National Heart Forum (UK Health Forum) will be described, and we will emphasise some of the mathematical and statistical issues associated with doing so.
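
As a minimal illustrative sketch of the microsimulation idea (the risk factor, rates and population numbers here are invented for illustration and are not taken from the Foresight model): each simulated individual carries risk factor values, and disease incidence is drawn year by year from a risk-dependent hazard.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical cohort: ages and one binary risk factor (e.g. smoking).
n = 100_000
age = rng.integers(30, 80, size=n)
smoker = rng.random(n) < 0.20  # assumed 20% prevalence

def annual_incidence(age, smoker):
    """Toy incidence model: baseline hazard rising with age, doubled by smoking."""
    base = 1e-4 * np.exp(0.08 * (age - 30))
    return np.where(smoker, 2.0 * base, base)

# Simulate 10 years, one Bernoulli draw per person per year.
cases = np.zeros(n, dtype=bool)
for year in range(10):
    p = annual_incidence(age + year, smoker)
    cases |= (~cases) & (rng.random(n) < p)

print(f"10-year incidence: {cases.mean():.3%}")
# Re-running with a lower smoking prevalence simulates an intervention's
# population effect, which is the policy use described in the abstract.
```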

15 February 2013
10:00
to
11:15
Abstract

InSAR (Interferometric Synthetic Aperture Radar) is an important space geodetic technique (i.e. a technique that uses satellite data to obtain measurements of the Earth) of great interest to geophysicists monitoring slip along fault lines and other changes to the shape of the Earth. InSAR works by using the difference between radar phase returns acquired at two different times to measure displacements of the Earth’s surface. Unfortunately, atmospheric noise and other problems mean that it can be difficult to use the InSAR data to obtain clear measurements of displacement.
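
For orientation, the basic phase-to-displacement relation (standard InSAR geometry, not specific to this workshop): a line-of-sight displacement d changes the two-way radar path by 2d, so the interferometric phase change is Δφ = 4πd/λ. A minimal sketch:

```python
import numpy as np

def los_displacement(delta_phase, wavelength):
    """Line-of-sight displacement from unwrapped interferometric phase.

    Two-way path: delta_phase = 4*pi*d / wavelength,
    hence d = wavelength * delta_phase / (4*pi).
    """
    return wavelength * delta_phase / (4.0 * np.pi)

# E.g. a C-band radar (wavelength ~5.6 cm): one full fringe (2*pi)
# corresponds to ~2.8 cm of line-of-sight motion.
print(los_displacement(2.0 * np.pi, 0.056))  # ~0.028 m
```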

Persistent Scatterer (PS) InSAR is a later adaptation of InSAR that uses statistical techniques to identify pixels within an InSAR image that are dominated by a single backscatterer, producing high-amplitude and stable phase returns (Ferretti et al. 2001, Hooper et al. 2004). PS InSAR has the advantage that it (hopefully) chooses the ‘better’ data points, but it has the disadvantage that it throws away a lot of the data that might have been available in the original InSAR signal.
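
One common first step in PS selection (following Ferretti et al. 2001) screens pixels by their amplitude dispersion index D_A = σ_A/μ_A over the image stack, since low amplitude dispersion correlates with phase stability. A sketch, in which the threshold value is an assumption rather than a fixed rule:

```python
import numpy as np

def ps_candidates(amplitude_stack, threshold=0.25):
    """Select persistent-scatterer candidate pixels by amplitude dispersion.

    amplitude_stack: array of shape (n_images, n_rows, n_cols).
    D_A = std/mean over time; low D_A is a proxy for stable phase
    (Ferretti et al. 2001). The 0.25 threshold is a typical but
    tunable choice, and is exactly the kind of cut-off the workshop
    asks whether we can improve.
    """
    mean = amplitude_stack.mean(axis=0)
    std = amplitude_stack.std(axis=0)
    dispersion = np.where(mean > 0, std / np.maximum(mean, 1e-12), np.inf)
    return dispersion < threshold  # boolean mask of candidate pixels
```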

InSAR and PS InSAR have typically been used in isolation to obtain slip-rates across faults, to understand the roles that faults play in regional tectonics, and to test models of continental deformation. But could they perhaps be combined? Or could PS InSAR be refined so that it doesn’t throw away as much of the original data? Or, perhaps, could the criteria used to determine what data are signal and what are noise be improved?

The key aim of this workshop is to describe and discuss the techniques and challenges associated with InSAR and PS InSAR (particularly the problem of atmospheric noise), and to look at possible methods for improvement, whether by combining InSAR and PS InSAR or by improving how the thresholds are chosen.

23 November 2012
10:00
to
11:30
Andreas Duering
Abstract

The University of Oxford’s modelling4all software is a wonderful tool to simulate early medieval populations and their cemeteries in order to evaluate the influence of palaeodemographic variables, such as mortality, fertility, catastrophic events and disease on settlement dispersal. In my DPhil project I will study archaeological sites in Anglo-Saxon England and the German south-west in a comparative approach. The two regions have interesting similarities in their early medieval settlement pattern and include some of the first sites where both cemeteries and settlements were completely excavated.

An important discovery in bioarchaeology is that an excavated cemetery is not a straightforward representation of the living population. Preservation issues and the limitations of age and sex estimation methods based on skeletal material must be considered. Moreover, the statistical procedures used to calculate the palaeodemographic characteristics of archaeological populations are procrustean. Agent-based models can help archaeologists to virtually bridge the chasm between the excavated dead populations and the living counterparts in which we are really interested.
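
As an illustration of this bridging idea (a minimal sketch; the demographic rates below are invented and are not calibrated to any of the project’s sites): simulate a living population forward in time, record each death as a burial, and compare the simulated cemetery’s age-at-death distribution with the excavated one.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_cemetery(n_start=200, years=300, birth_rate=0.035):
    """Toy agent-based model: each agent has an age; annual mortality
    rises with age (illustrative Gompertz-like hazard). Deaths are
    recorded as burials, yielding a simulated cemetery."""
    ages = rng.integers(0, 60, size=n_start).astype(float)
    burials = []
    for _ in range(years):
        # Age-dependent annual mortality (assumed, not calibrated).
        p_death = np.clip(0.02 * np.exp(0.05 * ages), 0.0, 1.0)
        died = rng.random(len(ages)) < p_death
        burials.extend(ages[died])
        ages = ages[~died] + 1.0
        births = rng.poisson(birth_rate * len(ages))
        ages = np.concatenate([ages, np.zeros(births)])
    return np.array(burials)

burials = simulate_cemetery()
# Comparing the simulated age-at-death histogram with the excavated one
# (e.g. by a chi-squared distance) is one route to problem 1 below:
# finding the best-fitting virtual living populations.
```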

This approach leads far away from the archaeologist’s usual methods and ways of thinking, and the major challenge is therefore to balance innovative ideas with practicability and tangibility.

Some of the problems for the workshop are:

1.) Finding the best-fitting virtual living populations for the excavated cemeteries

2.) Sensitivity analyses of palaeodemographic variables

3.) General methodologies for evaluating the outcome of agent-based models

4.) Presenting data in a way that is statistically correct, up to date, and clear for archaeologists like me

5.) Exploring how to include analytical procedures in the model, so as to present the archaeological community with a user-friendly and not overwhelming toolkit

 

16 November 2012
10:00
to
13:00
Owen Thomas
Abstract

The task is to estimate the approach time (time-to-go, TTG) of non-ballistic threats (e.g. missiles) using passive infrared imagery captured by a sensor on the target platform (e.g. a helicopter). The threat information available in a frame of data is angular position and signal amplitude.

A Kalman filter approach is presented that is applied to example amplitude data to estimate TTG. Angular information alone is not sufficient to allow analysis of missile guidance dynamics to provide a TTG estimate: detection of the launch is required, as is additional information in the form of a terrain database to determine the initial range. Parameters that relate to missile dynamics might include the proportional navigation constant and motor thrust. Differences between actual angular position observations and modelled values can be used to form an estimator for the parameter set, and thence for the TTG.
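
To illustrate why amplitude carries range information (a toy model, not the filter presented in the talk): for a point source of constant intensity closing at constant speed, received amplitude scales as A ∝ 1/r², so with r = v·(T − t) one gets dA/dt = 2A/(T − t) and hence TTG = T − t = 2A/(dA/dt). A minimal sketch:

```python
import numpy as np

def ttg_from_amplitude(t, a):
    """Toy time-to-go estimate from passive amplitude samples.

    Model: A(t) = k / r(t)^2 with r = v*(T - t) (constant closing speed,
    constant source intensity, no atmospheric effects). Then
    dA/dt = 2A/(T - t), so T - t = 2A / (dA/dt). Real data would need
    filtering (e.g. a Kalman filter on amplitude and its rate) rather
    than raw numerical differencing.
    """
    dadt = np.gradient(a, t)
    return 2.0 * a / dadt

# Synthetic closing threat with impact at T = 10 s.
t = np.linspace(0.0, 8.0, 81)
a = 1.0 / (10.0 - t) ** 2
print(ttg_from_amplitude(t, a)[-5:])  # approaches the true time-to-go, 10 - t
```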

The question posed here is, "how can signal amplitude information be employed to establish observability in a state-estimation-based model of the angular data to improve TTG estimate performance without any other source of range information?"
