Forthcoming events in this series


Fri, 20 Apr 2012

10:00 - 11:30
DH 3rd floor SR

CANCELLED

Harry Walton
(Sharp Labs)
Abstract

Sorry, this has been cancelled at short notice!

Fri, 16 Mar 2012

10:00 - 13:00

BP workshop

none
(BP)
Abstract

Topic to be confirmed. (This is the postponed workshop from Michaelmas term!)

Fri, 02 Mar 2012

10:00 - 13:30
DH 1st floor SR

"Pattern of Life" and traffic

Charles Offer
(Thales UK)
Abstract

'Pattern of life' is a current buzzword in sensor systems. One aspect of this is the automatic estimation of traffic flow patterns, perhaps where existing road maps are not available. For example, a sensor might measure the position of a number of vehicles in 2D, with a finite time interval between each observation of the scene. It is desired to estimate the time-averaged spatial density, current density, sources and sinks, etc. Are there practical methods to do this without tracking individual vehicles, given that there may also be false 'clutter' detections, the density of vehicles may be high, and each vehicle may not be detected in every timestep? And what if the traffic flow has periodicity, e.g. variations on the timescale of a day?
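
A minimal sketch of the tracking-free idea: accumulate unlabelled detections over all timesteps into a spatial histogram, which estimates the time-averaged density directly. All numbers and the road geometry below are hypothetical; clutter rejection and periodicity are not addressed.

```python
import numpy as np

def time_average_density(frames, bins=50, extent=((0, 100), (0, 100))):
    """Accumulate unlabelled (x, y) detections from every timestep into a
    2D histogram and divide by the number of frames, giving a time-averaged
    spatial density without tracking individual vehicles."""
    hist = np.zeros((bins, bins))
    for detections in frames:  # detections: array of shape (n_t, 2)
        h, _, _ = np.histogram2d(detections[:, 0], detections[:, 1],
                                 bins=bins, range=extent)
        hist += h
    return hist / len(frames)

# Synthetic scene: vehicles concentrated along a horizontal "road" at y ~ 50
rng = np.random.default_rng(0)
frames = [np.column_stack([rng.uniform(0, 100, 30),
                           rng.normal(50, 2, 30)]) for _ in range(200)]
density = time_average_density(frames)
```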

Fri, 24 Feb 2012

11:00 - 12:30
DH 1st floor SR

computer imaging (producing accurate measurements of an object in front of a camera)

Eleanor Watson
(Poikos)
Abstract

Problem #1: (marker-less scaling) Poikos Ltd. has created algorithms for matching photographs of humans to three-dimensional body scans. Due to variability in camera lenses and body sizes, the resulting three-dimensional data is normalised to have unit height and has no absolute scale. The problem is to assign an absolute scale to normalised three-dimensional data.

Prior Knowledge: A database of similar (but different) reference objects with known scales. An imperfect 1:1 mapping from the input coordinates to the coordinates of each object within the reference database. A projection matrix mapping the three-dimensional data to the two-dimensional space of the photograph (involves a non-linear and non-invertible transform; x=(M*v)_x/(M*v)_z, y=(M*v)_y/(M*v)_z).
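
The projection given in the abstract can be written out directly; the toy camera matrix and point below are hypothetical. The second evaluation shows why absolute scale is lost: doubling all spatial coordinates (including depth) leaves the image point unchanged.

```python
import numpy as np

def project(M, v):
    """The non-linear, non-invertible projection from the abstract:
    x = (M*v)_x / (M*v)_z,  y = (M*v)_y / (M*v)_z."""
    p = M @ v
    return p[0] / p[2], p[1] / p[2]

# Toy pinhole-style camera with focal length f (hypothetical values)
f = 2.0
M = np.array([[f, 0.0, 0.0, 0.0],
              [0.0, f, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
v = np.array([1.0, 2.0, 4.0, 1.0])   # homogeneous 3D point at depth z = 4
x, y = project(M, v)
# Scene scaled by 2 projects to the same image point -> scale ambiguity
x2, y2 = project(M, np.array([2.0, 4.0, 8.0, 1.0]))
```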

Problem #2: (improved silhouette fitting) Poikos Ltd. has created algorithms for converting RGB photographs of humans in (approximate) poses into silhouettes. Currently, a multivariate Gaussian mixture model is used as a first pass. This is imperfect and would benefit from an improved statistical method. The problem is to determine the probability that a given three-component colour at a given two-component location should be considered "foreground" or "background".

Prior Knowledge: A sparse set of colours which are very likely to be skin (foreground), and their locations. May include some outliers. A (larger) sparse set of colours which are very likely to be clothing (foreground), and their locations. May include several distributions in the case of multi-coloured clothing, and will probably include vast variations in luminosity. A (larger still) sparse set of colours which are very likely to be background. Will probably overlap with skin and/or clothing colours. A very approximate skeleton for the subject.

Limitations: Sample colours are chosen "safely"; that is, they are chosen in areas known to be away from edges. This causes two problems: highlights and shadows are not accounted for, and colours from arms and legs are under-represented in the model. All colours may be "saturated"; that is, information is lost about colours which are "brighter than white". All colours are subject to noise; each colour can be considered as a true colour plus a random variable drawn from a Gaussian distribution. The weight of this Gaussian model is constant across all luminosities; that is, darker colours contain more relative noise than brighter colours.
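
A stripped-down sketch of the first-pass classification described above, simplified to a single Gaussian per class rather than a full mixture, and ignoring pixel location, saturation, and luminosity-dependent noise. The colours and sample counts are invented for illustration.

```python
import numpy as np

def gaussian_logpdf(x, mean, cov):
    """Log-density of a multivariate Gaussian (pure NumPy)."""
    d = x - mean
    k = len(mean)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (k * np.log(2 * np.pi) + logdet + d @ np.linalg.solve(cov, d))

def p_foreground(colour, fg_samples, bg_samples, prior_fg=0.5):
    """Posterior probability that an RGB colour is foreground, fitting one
    Gaussian per class to the sparse samples (a single-component
    simplification of the mixture model described in the abstract)."""
    def fit(s):
        return s.mean(axis=0), np.cov(s.T) + 1e-6 * np.eye(3)  # regularised
    mu_f, cov_f = fit(fg_samples)
    mu_b, cov_b = fit(bg_samples)
    lf = gaussian_logpdf(colour, mu_f, cov_f) + np.log(prior_fg)
    lb = gaussian_logpdf(colour, mu_b, cov_b) + np.log(1 - prior_fg)
    m = max(lf, lb)  # log-sum-exp for numerical stability
    return np.exp(lf - m) / (np.exp(lf - m) + np.exp(lb - m))

# Toy data: skin-like foreground vs. dark background (hypothetical colours)
rng = np.random.default_rng(1)
fg = rng.normal([200, 150, 120], 10, size=(50, 3))
bg = rng.normal([40, 40, 60], 10, size=(50, 3))
p = p_foreground(np.array([195.0, 148.0, 118.0]), fg, bg)
```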

Fri, 17 Feb 2012

10:00 - 11:15
DH 1st floor SR

Spectral Marine Energy Converter

Peter Roberts
(VerdErg)
Abstract

A SMEC device is an array of aerofoil-shaped parallel hollow vanes forming linear venturis, perforated at the narrowest point where the vanes most nearly touch. When placed across a river or tidal flow, the water accelerates through the venturis between each pair of adjacent vanes and its pressure drops in accordance with Bernoulli’s Theorem. The low pressure zone draws a secondary flow out through the perforations in the adjacent hollow vanes which are all connected to a manifold at one end. The secondary flow enters the manifold through an axial flow turbine.

SMEC creates a small upstream head uplift of, say, 1.5m–2.5m, thereby converting some of the primary flow's kinetic energy into potential energy. This head difference across the device drives around 80% of the flow between the vanes, which can be seen to act as a no-moving-parts venturi pump, lowering the head on the back face of the turbine through which the other 20% of the flow is drawn. The head drop across this turbine, however, is amplified from, say, 2m up to, say, 8m. So SMEC is analogous to a step-up transformer, converting a high-volume, low-pressure flow into a lower-volume, higher-pressure flow. It has all the functional advantages of a step-up transformer, and the inevitable transformer losses as well.
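
A rough energy-balance check of the transformer analogy, using only the figures quoted above (2m driving head, 20% of the flow through the turbine at an amplified 8m head); the total flow rate is a hypothetical placeholder and cancels out of the ratio.

```python
# Hydraulic power is rho * g * Q * h. Compare power available from the
# full flow at the device head with power delivered to the turbine.
rho, g = 1000.0, 9.81    # water density (kg/m^3), gravity (m/s^2)
Q = 100.0                # total volumetric flow, m^3/s (hypothetical)
head_total = 2.0         # head difference across the device, m
head_turbine = 8.0       # amplified head across the turbine, m
frac_turbine = 0.2       # fraction of flow drawn through the turbine

p_available = rho * g * Q * head_total                 # power in the flow
p_turbine = rho * g * frac_turbine * Q * head_turbine  # power at turbine
ratio = p_turbine / p_available  # 0.2 * 8 / 2 = 0.8, i.e. 80% recovery
```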

The key benefit is that a conventional turbine (or Archimedes screw) designed to work efficiently at a 1.5m–2.5m driving head has to be of very large diameter, with a large step-up gearbox. In many real-world locations this makes it too expensive or simply impractical, in shallow water for example.

The work we did in 2009-10 for DECC on a SMEC across the Severn Estuary concluded that compared to a conventional barrage, SMEC would output around 80% of the power at less than half the capital cost. Crucially, however, this greatly superior performance is achieved with minimal environmental impact as the tidal signal is preserved in the upstream lagoon, avoiding the severe damage to the feeding grounds of migratory birdlife that is an unwelcome characteristic of a conventional barrage.

To help successfully commercialise the technology, however, we will eventually want to build a reliable (CFD?) computer model of SMEC which, even if partly parametric, would benefit hugely from an improved understanding of the small-scale turbulence and momentum transfer mechanisms in the mixing section.

Fri, 27 Jan 2012

10:00 - 11:15
DH 1st floor SR

a kinetic–dynamic modeling approach to understand the effect of a new radiotherapeutic agent on DNA damage repair

Vallis, Cornelissen, Able
(Oxford)
Abstract

DNA double strand breaks (DSB) are the most deleterious type of DNA damage induced by ionizing radiation and cytotoxic agents used in the treatment of cancer. When DSBs are formed, the cell attempts to repair the DNA damage through activation of a variety of molecular repair pathways. One of the earliest events in response to the presence of DSBs is the phosphorylation of a histone protein, H2AX, to form γH2AX. Many hundreds of copies of γH2AX form, occupying several megabases of DNA at the site of each DSB. These large collections of γH2AX can be visualized using a fluorescence microscopy technique and are called ‘γH2AX foci’. γH2AX serves as a scaffold to which other DNA damage repair proteins adhere and so facilitates repair. Following re-ligation of the DNA DSB, the γH2AX is dephosphorylated and the foci disappear.

We have developed a contrast agent, 111In-anti-γH2AX-Tat, for nuclear medicine (SPECT) imaging of γH2AX which is based on an anti-γH2AX monoclonal antibody. This agent allows us to image DNA DSB in vitro in cells, and in in vivo model systems of cancer. The ability to track the spatiotemporal distribution of DNA damage in vivo would have many potential clinical applications, including as an early read-out of tumour response or resistance to particular anticancer drugs or radiation therapy.

The imaging tracer principle states that a contrast agent should not interfere with the physiology of the process being imaged. We have therefore investigated the influence of the contrast agent itself on the kinetics of DSB formation and repair, and on γH2AX foci formation and resolution, and now wish to synthesise these data into a coherent kinetic-dynamic model.
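
An illustrative two-compartment kinetic sketch of the rise-and-fall behaviour described above (not the authors' model): DSBs form at t=0 and are repaired at rate k_rep; foci form in proportion to unrepaired DSBs and resolve after re-ligation. All rate constants and counts are hypothetical.

```python
import numpy as np

# Simple forward-Euler integration of:
#   dDSB/dt  = -k_rep * DSB
#   dFoci/dt =  k_on * DSB - k_off * Foci
k_rep, k_on, k_off = 0.5, 2.0, 0.3   # per hour (hypothetical rates)
dt, t_end = 0.01, 24.0               # step and horizon, hours
dsb, foci = 100.0, 0.0               # initial DSB count, initial foci
history = []
for _ in range(int(t_end / dt)):
    d_dsb = -k_rep * dsb
    d_foci = k_on * dsb - k_off * foci
    dsb += d_dsb * dt
    foci += d_foci * dt
    history.append(foci)
peak_time = np.argmax(history) * dt  # foci rise, peak, then resolve
```

With these rates the foci count peaks a few hours after irradiation and has almost fully resolved by 24 hours, matching the qualitative picture in the abstract.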

Fri, 09 Dec 2011

14:30 - 16:00
DH 1st floor SR

applying loads in bone tissue engineering problems

Junjie Wu
(Durham)
Abstract

Please note that this is taking place in the afternoon - partly to avoid a clash with the OCCAM group meeting in the morning.

Fri, 02 Dec 2011

10:00 - 11:15
DH 3rd floor SR

Arguing about risks: a request for assistance

John Fox
(Department of Engineering Science, University of Oxford)
Abstract

The standard mathematical treatment of risk combines numerical measures of uncertainty (usually probabilistic) and loss (money and other natural estimators of utility). There are significant practical and theoretical problems with this interpretation. A particular concern is that the estimation of quantitative parameters is frequently problematic, particularly when dealing with one-off events such as political, economic or environmental disasters. Practical decision-making under risk, therefore, frequently requires extensions to the standard treatment.


An intuitive approach to reasoning under uncertainty has recently become established in computer science and cognitive science in which general theories (formalised in a non-classical first-order logic) are applied to descriptions of specific situations in order to construct arguments for and/or against claims about possible events. Collections of arguments can be aggregated to characterize the type or degree of risk, using the logical grounds of the arguments to explain, and assess the credibility of, the supporting evidence for competing claims. Discussions about whether a complex piece of equipment or software could fail, the possible consequences of such failure and their mitigation, for example, can be based on the balance and relative credibility of all the arguments. This approach has been shown to offer versatile risk management tools in a number of domains, including clinical medicine and toxicology (e.g. www.infermed.com; www.lhasa.com). Argumentation frameworks are also being used to support open discussion and debates about important issues (e.g. see debate on environmental risks at www.debategraph.org).


Despite the practical success of argument-based methods for risk assessment and other kinds of decision making, they typically ignore the measurement of uncertainty even if some quantitative data are available, or combine logical inference with quantitative uncertainty calculations in ad hoc ways. After a brief introduction to the argumentation approach, I will demonstrate medical risk management applications of both kinds and invite suggestions for solutions which are mathematically more satisfactory.


Definitions (Hubbard: http://en.wikipedia.org/wiki/Risk)

Uncertainty: The lack of complete certainty, that is, the existence of more than one possibility. The "true" outcome/state/result/value is not known.

Measurement of uncertainty: A set of probabilities assigned to a set of possibilities. Example: "There is a 60% chance this market will double in five years."

Risk: A state of uncertainty where some of the possibilities involve a loss, catastrophe, or other undesirable outcome.

Measurement of risk: A set of possibilities each with quantified probabilities and quantified losses. Example: "There is a 40% chance the proposed oil well will be dry with a loss of $12 million in exploratory drilling costs".
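
On this definition, a measurement of risk is just a set of (probability, loss) pairs, from which an expected loss follows directly; the numbers below are the ones in the example.

```python
# Hubbard-style measurement of risk: quantified probabilities and losses.
# 40% chance the well is dry (loss $12M), 60% chance it is not (no loss).
outcomes = [(0.4, 12_000_000), (0.6, 0)]  # (probability, loss in $)
expected_loss = sum(p * loss for p, loss in outcomes)  # $4.8M
```

Note that the expected loss alone discards information the argumentation approach tries to retain, such as the grounds for believing each probability.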


The conceptual background to the argumentation approach to reasoning under uncertainty is reviewed in the attached paper “Arguing about the Evidence: a logical approach”.

Fri, 11 Nov 2011

09:45 - 11:00
DH 1st floor SR

Animal Behaviour

Marian Dawkins
(Dept of Zoology, University of Oxford)
Abstract

The following two topics are likely to be discussed.

A) Modelling the collective behaviour of chicken flocks. Marian Dawkins has a joint project with Steve Roberts in Engineering studying the patterns of optical flow in large flocks of commercial broiler chickens. They have found that various measurements of flow (such as skew and kurtosis) are predictive of future mortality. Marian would be interested in seeing whether we can model these effects.
B) Asymmetrical prisoners’ dilemma games. Despite massive theoretical interest, there are very few (if any) actual examples of animals showing the predicted behaviour of reciprocity with delayed reward. Marian Dawkins suspects that the reason for this is that the assumptions made are unrealistic and she would like to explore some ideas about this.

Please note the slightly early start to accommodate the OCCAM group meeting that follows.

Fri, 04 Nov 2011

10:00 - 11:15
DH 1st floor SR

Industrial MSc project proposals

Various
(Industry)
Abstract

10am Radius Health - Mark Evans

10:30am NAG - Mick Pont and Lawrence Mulholland

Please note that Thales are also proposing several projects, but the academic supervisors have already been allocated.

Fri, 21 Oct 2011

11:15 - 12:30
DH 1st floor SR

Bio-film initiation

Ian Thompson
(Department of Engineering Science, University of Oxford)
Mon, 15 Aug 2011

10:00 - 14:00

TBA

TBA
(BP)
Abstract

This workshop will probably take place at BP's premises.

Fri, 24 Jun 2011

10:00 - 13:00
DH 1st floor SR

Medium-PRF Radar Waveform Design and Understanding

Andy Stove
(Thales UK)
Abstract

Many radar designs transmit trains of pulses to estimate the Doppler shift from moving targets, in order to distinguish them from the returns from stationary objects (clutter) at the same range. The design of these waveforms is a compromise: when the radar's pulse repetition frequency (PRF) is high enough to sample the Doppler shift without excessive ambiguity, the range measurements often become ambiguous. Low-PRF radars are designed to be unambiguous in range but are highly ambiguous in Doppler. High-PRF radars are, conversely, unambiguous in Doppler but highly ambiguous in range. Medium-PRF radars have a moderate degree of ambiguity (say, five-fold) in both range and Doppler and give better overall performance.

Multiple PRFs must therefore be used to resolve the ambiguities (using the principle of the Chinese Remainder Theorem). A more serious issue, however, is that each PRF is now 'blind' at certain ranges, where the received signal arrives at the same time as the next pulse is transmitted, and at certain Doppler shifts (target speeds), when the return is 'folded' in Doppler so that it is hidden under the much larger clutter signal.
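
A minimal sketch of the Chinese-Remainder principle at work: each PRF observes the true range bin only modulo its own unambiguous interval, and a brute-force search for a range consistent with every residue recovers it. The bin counts and target range are hypothetical, and real systems must also cope with measurement error and missing detections.

```python
def resolve_range(residues, intervals, max_range):
    """Return all candidate true range bins consistent with every
    (measured residue, unambiguous interval) pair."""
    return [r for r in range(max_range)
            if all(r % m == a for a, m in zip(residues, intervals))]

true_range = 137                 # true target range bin (hypothetical)
intervals = [17, 19]             # unambiguous range bins for each PRF
residues = [true_range % m for m in intervals]  # what each PRF measures
candidates = resolve_range(residues, intervals, 300)
```

Because the intervals are coprime, the solution is unique up to their product (17 * 19 = 323 bins), so within a 300-bin window the target is recovered unambiguously.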

A practical radar therefore transmits successive bursts of pulses at different PRFs to overcome the 'blindness' and to resolve the ambiguities. Analysing the performance, although quite complex if done in detail, is possible using modern computer models, but the inverse problem of synthesising waveforms with a given performance remains difficult. Even more difficult is the problem of gaining intuitive insights into the likely effect of altering the waveforms. Such insights would be extremely valuable for the design process.

This problem is well known within the radar industry, but it is hoped that by airing it to an audience with a wider range of skills, some new ways of looking at the problem might be found.

Fri, 17 Jun 2011

09:30 - 11:30
DH 1st floor SR

Student Transfer of Status presentations

Emma Warneford, Georgios Anastasiades - and on Monday 27th June, Mohit Dalwadi, Sofia Piltz - DH Common Room from 11:15
(OCIAM)
Abstract

Emma Warneford: "Formation of Zonal Jets and the Quasigeostrophic Theory of the Thermodynamic Shallow Water Equations"

Georgios Anastasiades: "Quantile forecasting of wind power using variability indices"

Fri, 27 May 2011

10:00 - 11:15
DH 1st floor SR

POSTPONED

John Fox
(Department of Engineering Science, University of Oxford)
Abstract

Due to illness the speaker has been forced to postpone at short notice. A new date will be announced as soon as possible.

Fri, 20 May 2011

10:00 - 11:15
DH 1st floor SR

Decision making on the fly

Gero Miesenboeck and Shamik DasGupta
(Physiology, Anatomy and Genetics)