# Past Industrial and Interdisciplinary Workshops

24 February 2012, 11:00–12:30
Eleanor Watson
Abstract

Problem #1 (marker-less scaling): Poikos Ltd has developed algorithms for matching photographs of humans to three-dimensional body scans. Because of variability in camera lenses and body sizes, the resulting three-dimensional data are normalised to unit height and carry no absolute scale. The problem is to assign an absolute scale to the normalised three-dimensional data.

Prior Knowledge: A database of similar (but different) reference objects with known scales. An imperfect 1:1 mapping from the input coordinates to the coordinates of each object within the reference database. A projection matrix M mapping the three-dimensional data to the two-dimensional space of the photograph; the projection involves a non-linear, non-invertible transform: x = (Mv)_x / (Mv)_z, y = (Mv)_y / (Mv)_z.
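The projection above can be sketched numerically; the matrix values below are hypothetical illustrations, not Poikos's actual camera calibration. The example also shows why absolute scale is lost: the depth division maps scaled copies of a point to the same image coordinates.

```python
# Perspective projection of a 3-D point: the division by depth makes the
# map non-linear and non-invertible, so absolute scale cannot be recovered
# from the image coordinates alone.
def project(M, v):
    """Apply a 3x4 projection matrix M to a 3-D point v = (x, y, z)."""
    x, y, z = v
    # Homogeneous multiply: each row of M dotted with (x, y, z, 1).
    px = M[0][0]*x + M[0][1]*y + M[0][2]*z + M[0][3]
    py = M[1][0]*x + M[1][1]*y + M[1][2]*z + M[1][3]
    pz = M[2][0]*x + M[2][1]*y + M[2][2]*z + M[2][3]
    return (px / pz, py / pz)   # divide by depth

# A hypothetical camera: focal length 2, no principal-point offset.
M = [[2, 0, 0, 0],
     [0, 2, 0, 0],
     [0, 0, 1, 0]]

# Two points differing only by an overall scale factor project to the
# same image point, which is exactly why the normalised data has no scale.
a = project(M, (1.0, 1.0, 4.0))   # (0.5, 0.5)
b = project(M, (2.0, 2.0, 8.0))   # (0.5, 0.5)
```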

Problem #2 (improved silhouette fitting): Poikos Ltd has developed algorithms for converting RGB photographs of humans in (approximate) poses into silhouettes. Currently a multivariate Gaussian mixture model is used as a first pass. This is imperfect, and would benefit from an improved statistical method. The problem is to determine the probability that a given three-component colour at a given two-component location should be classified as "foreground" or "background".

Prior Knowledge: A sparse set of colours which are very likely to be skin (foreground), and their locations; may include some outliers. A (larger) sparse set of colours which are very likely to be clothing (foreground), and their locations; may include several distributions in the case of multi-coloured clothing, and will probably include vast variations in luminosity. A (larger still) sparse set of colours which are very likely to be background; will probably overlap with skin and/or clothing colours. A very approximate skeleton for the subject.

Limitations: Sample colours are chosen "safely"; that is, they are chosen in areas known to be away from edges. This causes two problems: highlights and shadows are not accounted for, and colours from arms and legs are under-represented in the model. All colours may be "saturated"; that is, information is lost about colours which are "brighter than white". All colours are subject to noise; each colour can be considered as a true colour plus a random variable drawn from a Gaussian distribution. The magnitude of this Gaussian noise is constant across all luminosities, so darker colours contain more relative noise than brighter colours.
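A minimal sketch of the first-pass idea: model the foreground and background colour samples as Gaussians in RGB space and apply Bayes' rule to get a foreground probability. The class means, variance, and prior below are hypothetical, and a single isotropic Gaussian per class stands in for the full multivariate mixture described above.

```python
import math

def gauss_pdf(c, mean, var):
    """Isotropic Gaussian density of a 3-component colour c."""
    d2 = sum((ci - mi) ** 2 for ci, mi in zip(c, mean))
    return math.exp(-d2 / (2 * var)) / ((2 * math.pi * var) ** 1.5)

def p_foreground(c, fg_mean, bg_mean, var=200.0, prior_fg=0.5):
    """Posterior probability that colour c is foreground (Bayes' rule)."""
    fg = prior_fg * gauss_pdf(c, fg_mean, var)
    bg = (1 - prior_fg) * gauss_pdf(c, bg_mean, var)
    return fg / (fg + bg)

# Hypothetical class means: a skin-like foreground vs. a grey background.
skin = (200.0, 160.0, 140.0)
wall = (120.0, 120.0, 120.0)

p_skin = p_foreground((195.0, 158.0, 138.0), skin, wall)  # near 1
p_wall = p_foreground((118.0, 122.0, 119.0), skin, wall)  # near 0
```

In practice each class would be a mixture of several such Gaussians (multi-coloured clothing, shadowed skin), with the spatial location and the approximate skeleton feeding into the prior.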

• Industrial and Interdisciplinary Workshops
17 February 2012, 11:30–13:00
TBA
Abstract
• Industrial and Interdisciplinary Workshops
17 February 2012, 10:00–11:15
Peter Roberts
Abstract

A SMEC device is an array of aerofoil-shaped parallel hollow vanes forming linear venturis, perforated at the narrowest point where the vanes most nearly touch. When placed across a river or tidal flow, the water accelerates through the venturis between each pair of adjacent vanes and its pressure drops in accordance with Bernoulli’s Theorem. The low pressure zone draws a secondary flow out through the perforations in the adjacent hollow vanes which are all connected to a manifold at one end. The secondary flow enters the manifold through an axial flow turbine.

SMEC creates a small upstream head uplift of, say, 1.5–2.5 m, thereby converting some of the primary flow's kinetic energy into potential energy. This head difference across the device drives around 80% of the flow between the vanes, which can be seen to act as a no-moving-parts venturi pump, lowering the head on the back face of the turbine through which the other 20% of the flow is drawn. The head drop across this turbine, however, is amplified from, say, 2 m up to, say, 8 m. So SMEC is analogous to a step-up transformer, converting a high-volume, low-pressure flow into a lower-volume, higher-pressure flow. It has the same functional advantages as a step-up transformer, and the inevitable transformer losses as well.
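The Bernoulli mechanism in the venturis can be sketched with a back-of-envelope calculation; the flow speed and area ratio below are hypothetical illustrations, not SMEC design figures.

```python
RHO = 1000.0  # density of water, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def venturi_pressure_drop(v1, a1, a2):
    """Pressure drop between the inlet and the throat of a venturi.

    Continuity gives the throat speed v2 = v1 * a1 / a2, and Bernoulli
    gives dp = 0.5 * rho * (v2**2 - v1**2).
    """
    v2 = v1 * a1 / a2
    return 0.5 * RHO * (v2 ** 2 - v1 ** 2)

# Hypothetical tidal stream: 2 m/s entering a 3:1 area contraction.
dp = venturi_pressure_drop(2.0, 3.0, 1.0)   # 16,000 Pa
head = dp / (RHO * G)                        # ~1.6 m equivalent head
```

A modest contraction thus produces a head difference of the same order as the 1.5–2.5 m uplift quoted above, which is what lets the secondary flow see an amplified head drop across the turbine.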

The key benefit is that a conventional turbine (or Archimedes screw) designed to work efficiently at a 1.5–2.5 m driving head has to be of very large diameter, with a large step-up gearbox. In many real-world locations this makes it too expensive or simply impractical, in shallow water for example.

The work we did in 2009-10 for DECC on a SMEC across the Severn Estuary concluded that compared to a conventional barrage, SMEC would output around 80% of the power at less than half the capital cost. Crucially, however, this greatly superior performance is achieved with minimal environmental impact as the tidal signal is preserved in the upstream lagoon, avoiding the severe damage to the feeding grounds of migratory birdlife that is an unwelcome characteristic of a conventional barrage.

To help commercialise the technology successfully, however, we will eventually want to build a reliable computer model of SMEC (CFD, perhaps) which, even if partly parametric, would benefit hugely from an improved understanding of the small-scale turbulence and momentum-transfer mechanisms in the mixing section.

• Industrial and Interdisciplinary Workshops
27 January 2012, 10:00–11:15
Vallis, Cornelissen, Able
Abstract
DNA double strand breaks (DSBs) are the most deleterious type of DNA damage induced by ionizing radiation and by the cytotoxic agents used in the treatment of cancer. When DSBs are formed, the cell attempts to repair the damage through activation of a variety of molecular repair pathways. One of the earliest events in response to the presence of DSBs is the phosphorylation of a histone protein, H2AX, to form γH2AX. Many hundreds of copies of γH2AX form, occupying several megabases of DNA at the site of each DSB. These large collections of γH2AX can be visualized using a fluorescence microscopy technique and are called ‘γH2AX foci’. γH2AX serves as a scaffold to which other DNA damage repair proteins adhere, and so facilitates repair. Following re-ligation of the DNA DSB, the γH2AX is dephosphorylated and the foci disappear.

We have developed a contrast agent, 111In-anti-γH2AX-Tat, for nuclear medicine (SPECT) imaging of γH2AX, based on an anti-γH2AX monoclonal antibody. This agent allows us to image DNA DSBs in vitro in cells, and in in vivo model systems of cancer. The ability to track the spatiotemporal distribution of DNA damage in vivo would have many potential clinical applications, including as an early read-out of tumour response or resistance to particular anticancer drugs or radiation therapy. The imaging tracer principle states that a contrast agent should not interfere with the physiology of the process being imaged. We have therefore investigated the influence of the contrast agent itself on the kinetics of DSB formation and repair, and on γH2AX foci formation and resolution, and now wish to synthesise these data into a coherent kinetic-dynamic model.
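One possible starting point for such a kinetic-dynamic model is a two-compartment scheme: DSBs give rise to foci at a first-order rate, and foci resolve (dephosphorylation after repair) at another. The rate constants and initial break count below are hypothetical, chosen only to show the characteristic rise-and-decay of a foci time course.

```python
def foci_kinetics(n0, k_form, k_res, dt=0.01, t_end=10.0):
    """Euler integration of a simple two-compartment model:

        dd/dt = -k_form * d          (unrepaired DSBs, d)
        df/dt =  k_form * d - k_res * f   (visible foci, f)

    Returns the foci time course as a list of (t, f) samples.
    """
    d, f, t = n0, 0.0, 0.0
    out = [(t, f)]
    while t < t_end:
        d_new = d - k_form * d * dt
        f_new = f + (k_form * d - k_res * f) * dt
        d, f, t = d_new, f_new, t + dt
        out.append((t, f))
    return out

# Hypothetical rates: fast foci formation, slower resolution.
course = foci_kinetics(n0=100.0, k_form=2.0, k_res=0.3)
peak = max(f for _, f in course)   # transient maximum of the foci count
final = course[-1][1]              # foci remaining at the end
```

A real model would need to couple this to the contrast agent's binding kinetics, since the abstract's point is precisely that the agent may perturb the process being imaged.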
• Industrial and Interdisciplinary Workshops
20 January 2012, 09:30–11:30
none
Abstract
• Industrial and Interdisciplinary Workshops
9 January 2012, 10:00–11:30
Dave Rugg and Dave Wright
Abstract
• Industrial and Interdisciplinary Workshops
9 December 2011, 14:30–16:00
Junjie Wu
Abstract
Please note that this is taking place in the afternoon, partly to avoid a clash with the OCCAM group meeting in the morning.
• Industrial and Interdisciplinary Workshops
2 December 2011, 10:00–11:15
Abstract

The standard mathematical treatment of risk combines numerical measures of uncertainty (usually probabilistic) and loss (money and other natural estimators of utility). There are significant practical and theoretical problems with this interpretation. A particular concern is that the estimation of quantitative parameters is frequently problematic, particularly when dealing with one-off events such as political, economic or environmental disasters. Practical decision-making under risk, therefore, frequently requires extensions to the standard treatment.

An intuitive approach to reasoning under uncertainty has recently become established in computer science and cognitive science, in which general theories (formalised in a non-classical first-order logic) are applied to descriptions of specific situations in order to construct arguments for and/or against claims about possible events. Collections of arguments can be aggregated to characterise the type or degree of risk, using the logical grounds of the arguments to explain, and assess the credibility of, the supporting evidence for competing claims. Discussions about whether a complex piece of equipment or software could fail, the possible consequences of such failure and their mitigation, for example, can be based on the balance and relative credibility of all the arguments. This approach has been shown to offer versatile risk management tools in a number of domains, including clinical medicine and toxicology (e.g. www.infermed.com; www.lhasa.com). Argumentation frameworks are also being used to support open discussion and debate about important issues (e.g. see the debate on environmental risks at www.debategraph.org).

Despite the practical success of argument-based methods for risk assessment and other kinds of decision making, they typically ignore measurement of uncertainty even when some quantitative data are available, or combine logical inference with quantitative uncertainty calculations in ad hoc ways. After a brief introduction to the argumentation approach, I will demonstrate medical risk management applications of both kinds and invite suggestions for solutions which are mathematically more satisfactory.
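To make the "ad hoc" combination concrete, here is a minimal sketch of aggregating arguments for and against a claim by summed credibility weights. The scheme, the function name, and the example safety case are all hypothetical, and this is exactly the kind of informal weighting the talk argues deserves a mathematically more satisfactory treatment.

```python
def weigh_claim(arguments):
    """Aggregate (stance, credibility) pairs into a net support score.

    stance is +1 (for the claim) or -1 (against); credibility is in [0, 1].
    Returns a score in [-1, 1]: positive means the balance of arguments
    supports the claim.
    """
    if not arguments:
        return 0.0
    total = sum(stance * cred for stance, cred in arguments)
    return total / sum(cred for _, cred in arguments)

# Hypothetical safety case: two supporting arguments, one rebuttal.
args = [(+1, 0.9),   # test evidence suggests the component is reliable
        (+1, 0.6),   # field data shows no failures to date
        (-1, 0.4)]   # a theoretical failure mode remains unmitigated
score = weigh_claim(args)   # positive: balance of arguments supports the claim
```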

Definitions (Hubbard:  http://en.wikipedia.org/wiki/Risk)

Uncertainty: The lack of complete certainty, that is, the existence of more than one possibility. The "true" outcome/state/result/value is not known.

Measurement of uncertainty: A set of probabilities assigned to a set of possibilities. Example: "There is a 60% chance this market will double in five years."

Risk: A state of uncertainty where some of the possibilities involve a loss, catastrophe, or other undesirable outcome.

Measurement of risk: A set of possibilities, each with quantified probabilities and quantified losses. Example: "There is a 40% chance the proposed oil well will be dry, with a loss of $12 million in exploratory drilling costs."
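Under the standard treatment, Hubbard's dry-well example reduces to an expected-loss calculation:

```python
def expected_loss(outcomes):
    """Expected loss over (probability, loss) pairs."""
    return sum(p * loss for p, loss in outcomes)

# The dry-well example: a 40% chance of a $12 million loss,
# and a 60% chance of no loss.
el = expected_loss([(0.4, 12_000_000), (0.6, 0)])   # $4.8 million
```

The abstract's point is that such single numbers are often the problem: for one-off events the probability 0.4 itself may be impossible to estimate credibly.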

The conceptual background to the argumentation approach to reasoning under uncertainty is reviewed in the attached paper “Arguing about the Evidence: a logical approach”.

• Industrial and Interdisciplinary Workshops
18 November 2011, 10:00–13:05
Glen Davidson
Abstract
• Industrial and Interdisciplinary Workshops
11 November 2011, 09:45–11:00
Abstract

The following two topics are likely to be discussed.

A) Modelling the collective behaviour of chicken flocks. Marian Dawkins has a joint project with Steve Roberts in Engineering studying the patterns of optical flow in large flocks of commercial broiler chickens. They have found that various measurements of flow (such as skew and kurtosis) are predictive of future mortality. Marian would be interested in seeing whether we can model these effects.
B) Asymmetrical prisoners’ dilemma games. Despite massive theoretical interest, there are very few (if any) actual examples of animals showing the predicted behaviour of reciprocity with delayed reward. Marian Dawkins suspects that the reason for this is that the assumptions made are unrealistic and she would like to explore some ideas about this.
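The flow statistics mentioned in (A) can be sketched directly; skew and kurtosis are standard moments of the distribution of optical-flow magnitudes, and the sample data below is made up for illustration.

```python
def skew_kurtosis(xs):
    """Sample skewness and excess kurtosis of a list of values,
    e.g. per-frame optical-flow magnitudes from a flock video."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n   # variance
    m3 = sum((x - mean) ** 3 for x in xs) / n   # third central moment
    m4 = sum((x - mean) ** 4 for x in xs) / n   # fourth central moment
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2 - 3.0   # excess kurtosis: 0 for a Gaussian
    return skew, kurt

# A perfectly symmetric sample has zero skewness.
s, k = skew_kurtosis([1.0, 2.0, 3.0, 4.0, 5.0])
```

The modelling question is then why these particular moments of the flow distribution should carry predictive information about future mortality, which is where a collective-behaviour model would come in.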

Please note the slightly early start to accommodate the OCCAM group meeting that follows.

• Industrial and Interdisciplinary Workshops