11:00
Rossby wave dynamics of the extra-tropical response to El Niño. Part 2
Ocean forcing of ice sheet change in West Antarctica
Abstract
The part of the West Antarctic Ice Sheet that drains into the Amundsen Sea is currently thinning at such a rate that it contributes nearly 10 percent of the observed rise in global mean sea level. Acceleration of the outlet glaciers means that the sea level contribution has grown over recent decades, while the likely future contribution remains a key unknown. The synchronous response of several independent glaciers, coupled with the observation that thinning is most rapid at their downstream ends, where the ice goes afloat, hints at an oceanic driver. The general assumption is that the changes are a response to an increase in submarine melting of the floating ice shelves, driven in turn by an increase in the transport of ocean heat towards the ice sheet. Understanding the causes of these changes and their relationship with climate variability is imperative if we are to make quantitative projections of future sea level.
Observations made since the mid‐1990s on the Amundsen Sea continental shelf have revealed that the seabed troughs carved by previous glacial advances guide seawater around 3‐4°C above the freezing point from the deep ocean to the ice sheet margin, fuelling rapid melting of the floating ice. This talk summarises the results of several pieces of work that investigate the chain of processes linking large‐scale atmospheric processes with ocean circulation over the continental shelf and beneath the floating ice shelves and the eventual transfer of heat to the ice. While our understanding of the processes is far from complete, the pieces of the jigsaw that have been put into place give us insight into the potential causes of variability in ice shelf melting, and allow us to at least formulate some key questions that still need to be answered in order to make reliable projections of future ice sheet evolution in West Antarctica.
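As a rough illustration of how ocean temperature translates into ice loss, bulk models often take the basal melt rate to increase faster than linearly with thermal forcing (the temperature excess above the in-situ freezing point). The sketch below assumes a quadratic scaling with a placeholder coefficient; it is not the speakers' model, only a guide to the sensitivity involved.

    # A minimal sketch, assuming a quadratic bulk melt law; the coefficient
    # GAMMA is a placeholder, not a value from the talk or the literature.
    GAMMA = 3.0  # m/yr per (degree C)^2, illustrative only

    def basal_melt_m_per_yr(thermal_forcing_C):
        # Quadratic sensitivity: doubling the thermal forcing roughly
        # quadruples the melt rate.
        return GAMMA * thermal_forcing_C ** 2

    for T in (1.0, 2.0, 3.0, 4.0):
        print(f"thermal forcing {T:.1f} C -> melt ~ {basal_melt_m_per_yr(T):.0f} m/yr")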
13:15
Turbidity current dynamics - modelling sediment avalanches in the ocean
Abstract
Turbidity currents are fast-moving streams of sediment in the ocean that have the power to erode the sea floor and damage man-made infrastructure anchored to the bed. They can travel for hundreds of kilometres from the continental shelf to the deep ocean, but they are unpredictable and can occur with little warning, making them hard to observe and measure. Our main aim is to determine the distance downstream at which the current becomes extinct. We consider the fluid model of Parker et al. [1986] and derive a simple shallow-water description of the current, which we examine numerically and analytically to identify supercritical and subcritical flow regimes. We then focus on the solution of the complete model and provide a new description of the turbulent kinetic energy. This extension of the model involves switching from a turbulent to a laminar flow regime and provides an improved description of the extinction process.
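The extinction distance can be illustrated with a toy version of such a model. The sketch below integrates a steady three-equation shallow-water balance loosely based on Parker et al. [1986], marching the volume, momentum and sediment fluxes downstream until the sediment load is exhausted; the parameter values, the neglect of the pressure-gradient term and the pure-settling sediment balance are simplifying assumptions made here for illustration, not choices from the talk.

    import numpy as np

    # Illustrative parameters (assumptions, not values from the talk)
    R, g = 1.65, 9.81   # submerged specific gravity of sediment; gravity (m/s^2)
    S = 0.05            # bed slope
    cD = 0.004          # bed drag coefficient
    vs = 5e-4           # sediment settling velocity (m/s)

    def water_entrainment(Ri):
        # Empirical entrainment law of the form used by Parker and co-workers
        return 0.075 / np.sqrt(1.0 + 718.0 * Ri ** 2.4)

    def step(q, m, f, dx):
        # Recover primitives from the fluxes q = U*h, m = U^2*h, f = U*C*h
        U, h, C = m / q, q * q / m, f / q
        Ri = R * g * C * h / U ** 2           # bulk Richardson number (Ri < 1: supercritical)
        dq = water_entrainment(Ri) * U        # entrainment of ambient water
        dm = R * g * C * h * S - cD * U ** 2  # buoyancy driving minus bed drag
        df = -vs * C                          # settling only; erosion neglected
        return q + dx * dq, m + dx * dm, f + dx * df

    U0, h0, C0 = 1.0, 2.0, 0.01  # inflow velocity (m/s), thickness (m), concentration
    q, m, f = U0 * h0, U0 ** 2 * h0, U0 * C0 * h0
    x, dx = 0.0, 10.0
    while f > 1e-6 * q and m > 0.0 and x < 5e5:
        q, m, f = step(q, m, f, dx)
        x += dx
    print(f"current effectively extinct after ~{x / 1e3:.0f} km")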
Probability Forecasting: Looking Under the Hood and at the Road Ahead
Abstract
Probability does not exist. At least no more so than "mass", "spin", or "charm" exist. Yet probability forecasts are common, and there are fine reasons for deprecating point forecasts, as they require an unscientific certainty in exactly what the future holds. What roles do our physical understanding and the laws of physics play in the construction of probability forecasts in support of decision making and science-based policy? Will probability forecasting more likely accelerate or retard the advancement of our scientific understanding?
Model-based probability forecasts can vary significantly with alterations in the method of data assimilation, ensemble formation, ensemble interpretation, and forecast evaluation, not to mention questions of model structure, parameter selection and the available forecast-outcome archive. The role of each of these aspects of forecasting, in the context of interpreting the forecast as a real-world probability, is considered and contrasted in the cases of weather forecasting, climate forecasting, and economic forecasting. The notion of what makes a probability forecast "good" will be discussed, including the goals of "sharpness given calibration" and "value".
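The "sharpness given calibration" goal is easy to state operationally: among forecasts that are reliable (events forecast with probability p happen a fraction p of the time), prefer those whose probabilities sit furthest from the base rate. The sketch below, with synthetic data standing in for a real forecast-outcome archive, checks reliability by binning and computes the Brier score as a summary measure.

    import numpy as np

    rng = np.random.default_rng(0)
    p = rng.uniform(0.0, 1.0, 10_000)                      # forecast probabilities
    o = (rng.uniform(0.0, 1.0, p.size) < p).astype(float)  # outcomes consistent with calibration

    # Reliability: within each bin, observed frequency should match the forecast
    edges = np.linspace(0.0, 1.0, 11)
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = (p >= lo) & (p < hi)
        if sel.any():
            print(f"forecast {lo:.1f}-{hi:.1f}: observed frequency {o[sel].mean():.2f}")

    print(f"Brier score: {np.mean((p - o) ** 2):.3f}")  # lower is better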
For a probability forecast to be decision-relevant as such, it must be reasonable to interpret it as a basis for rational action, that is, as reflecting the probability of the outcomes it forecasts. This rather obvious-sounding requirement proves to be the source of major discomfort once the distinct roles of uncertainty (imprecision) and error (structural mathematical "misspecification") are clarified. Probabilistic forecasts can be of value to decision makers even when it is irrational to interpret them as probability forecasts. A similar statement, of course, can be made for point forecasts, or for spin. In this context we explore the question: do decision-relevant probability forecasts exist?
Arguing about risks: a request for assistance
Abstract
The standard mathematical treatment of risk combines numerical measures of uncertainty (usually probabilistic) and loss (money and other natural estimators of utility). There are significant practical and theoretical problems with this interpretation. A particular concern is that the estimation of quantitative parameters is frequently problematic, particularly when dealing with one-off events such as political, economic or environmental disasters. Practical decision-making under risk, therefore, frequently requires extensions to the standard treatment.
An intuitive approach to reasoning under uncertainty has recently become established in computer science and cognitive science in which general theories (formalised in a non-classical first-order logic) are applied to descriptions of specific situations in order to construct arguments for and/or against claims about possible events. Collections of arguments can be aggregated to characterize the type or degree of risk, using the logical grounds of the arguments to explain, and assess the credibility of, the supporting evidence for competing claims. Discussions about whether a complex piece of equipment or software could fail, the possible consequences of such failure and their mitigation, for example, can be based on the balance and relative credibility of all the arguments. This approach has been shown to offer versatile risk management tools in a number of domains, including clinical medicine and toxicology (e.g. www.infermed.com; www.lhasa.com). Argumentation frameworks are also being used to support open discussion and debates about important issues (e.g. see debate on environmental risks at www.debategraph.org).
Despite the practical success of argument-based methods for risk assessment and other kinds of decision making they typically ignore measurement of uncertainty even if some quantitative data are available, or combine logical inference with quantitative uncertainty calculations in ad hoc ways. After a brief introduction to the argumentation approach I will demonstrate medical risk management applications of both kinds and invite suggestions for solutions which are mathematically more satisfactory.
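As a concrete, if deliberately skeletal, illustration of aggregating arguments, the sketch below implements the grounded semantics of a Dung-style abstract argumentation framework; the logic-based frameworks discussed in the talk are far richer, and the argument names here are invented for illustration.

    def grounded_extension(arguments, attacks):
        # attacks is a set of (attacker, target) pairs
        attackers = {a: {x for (x, y) in attacks if y == a} for a in arguments}
        accepted, changed = set(), True
        while changed:
            changed = False
            for a in arguments - accepted:
                # a is acceptable if every attacker of a is itself attacked
                # by some already-accepted argument (reinstatement)
                if all(any((d, b) in attacks for d in accepted) for b in attackers[a]):
                    accepted.add(a)
                    changed = True
        return accepted

    args = {"equipment_fails", "redundancy_mitigates", "maintenance_lapsed"}
    atk = {("redundancy_mitigates", "equipment_fails"),
           ("maintenance_lapsed", "redundancy_mitigates")}
    # "maintenance_lapsed" undercuts the mitigation, reinstating "equipment_fails"
    print(grounded_extension(args, atk))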
Definitions (Hubbard: http://en.wikipedia.org/wiki/Risk)
Uncertainty: The lack of complete certainty, that is, the existence of more than one possibility. The "true" outcome/state/result/value is not known.
Measurement of uncertainty: A set of probabilities assigned to a set of possibilities. Example: "There is a 60% chance this market will double in five years."
Risk: A state of uncertainty where some of the possibilities involve a loss, catastrophe, or other undesirable outcome.
Measurement of risk: A set of possibilities each with quantified probabilities and quantified losses. Example: "There is a 40% chance the proposed oil well will be dry with a loss of $12 million in exploratory drilling costs".
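Under this definition, the measured risk supports simple expected-loss arithmetic; using the illustrative numbers from the example above:

    # Expected loss from the oil-well example: probability times quantified loss
    p_dry, loss_dry = 0.40, 12e6
    print(f"expected loss: ${p_dry * loss_dry / 1e6:.1f} million")  # $4.8 million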
The conceptual background to the argumentation approach to reasoning under uncertainty is reviewed in the attached paper “Arguing about the Evidence: a logical approach”.
11:00
Rossby wave dynamics of the extra-tropical response to El Niño
Ocean Eddy Parameterisation and Conservation Principles
Abstract
Ocean climate models are unlikely to routinely have sufficient resolution to resolve the turbulent ocean eddy field, and the development of improved mesoscale eddy parameterisation schemes therefore remains an important task. The current dominant mesoscale eddy closure is the Gent and McWilliams scheme, which enforces the down-gradient mixing of buoyancy. While motivated by the action of baroclinic instability on the mean flow, this closure neglects the horizontal fluxes of horizontal momentum. The down-gradient mixing of potential vorticity is frequently discussed as an alternative parameterisation paradigm. However, such a scheme, without careful treatment, violates fundamental conservation principles, and in particular violates conservation of momentum.
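The momentum problem can be made concrete in the quasigeostrophic setting, where the Taylor-Bretherton identity relates the zonally averaged eddy potential vorticity flux to divergences of eddy momentum and buoyancy fluxes (standard notation assumed here: f₀ the Coriolis parameter, N² the background stratification):

\[
\overline{v'q'} \;=\; -\frac{\partial}{\partial y}\,\overline{u'v'}
\;+\; \frac{\partial}{\partial z}\!\left(\frac{f_0}{N^2}\,\overline{v'b'}\right).
\]

A down-gradient closure imposed directly on the potential vorticity flux therefore acts as a momentum source or sink unless the implied fluxes are constrained so that their divergences integrate to zero over the domain.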
A new parameterisation framework is presented which preserves conservation of momentum by construction, and further allows for conservation of energy. The framework has a single dimensional parameter, the total eddy energy, and five dimensionless and bounded geometric parameters. The popular Gent and McWilliams scheme exists as a limiting case of this framework. Hence the new framework enables the extension of the Gent and McWilliams scheme in a manner consistent with key physical conservation principles.
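For reference, the Gent and McWilliams limit can be written as an eddy-induced (bolus) velocity derived from a streamfunction proportional to the neutral slope; the notation below is standard (b buoyancy, κ a transfer coefficient), and the specific five-parameter construction of the talk is not reproduced here:

\[
\boldsymbol{\Psi}^{*} = \kappa\,\mathbf{s},
\qquad
\mathbf{s} = -\frac{\nabla_h b}{\partial b/\partial z},
\qquad
\mathbf{u}^{*} = -\frac{\partial}{\partial z}\left(\kappa\,\mathbf{s}\right),
\qquad
w^{*} = \nabla_h \cdot \left(\kappa\,\mathbf{s}\right),
\]

so that the parameterised eddies flatten isopycnals and release available potential energy. In an energy-constrained framework of the kind described, one published choice (an assumption here, following "GEOMETRIC"-type scalings) sets κ = α E N / M², where E is the total eddy energy, M² = |∇ₕb| and N² = ∂b/∂z, with the dimensionless parameter bounded by |α| ≤ 1.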