Mon, 06 Nov 2006
12:00
L3

Quantizing BPS Black Holes in 4 Dimensions

Boris Pioline
(Universite Paris VI et VII and ENS)

Fri, 03 Nov 2006
16:30
L2

Three Eras of Aggregation Kinetics

Professor John Neu
(Berkeley, USA)
Abstract

Aggregation refers to the thermodynamically favoured coalescence of individual molecular units (monomers) into dense clusters. The formation of liquid drops in oversaturated vapour, or the precipitation of solids from liquid solutions, are 'everyday' examples. A more exotic example, the crystallization of hydrophobic proteins in lipid bilayers, comes from current biophysics.

This talk begins with the basic physics of the simplest classical model, in which clusters grow by absorbing or expelling monomers, and the free monomers are transported by diffusion. Next comes a description of three successive 'eras' of the aggregation process: NUCLEATION is the initial creation of clusters whose sizes are sufficiently large that they will most likely continue to grow, instead of dissolving back into monomers.

The essential physical idea is growth by unlikely fluctuations past a high free energy barrier. The GROWTH of the clusters after nucleation depletes the initial oversaturation of monomer. The free energy barrier against nucleation increases, effectively shutting off any further nucleation. Finally, the oversaturation is so depleted that the largest clusters grow only by dissolution of the smallest. This final era is called COARSENING.

The initial rate of nucleation and the evolution of the cluster size distribution during coarsening are the subjects of classical, well-known models. The 'new meat' of this talk is a 'global' model of aggregation that quantifies the nucleation era and provides an effective initial condition for the evolution of the cluster size distribution during growth and coarsening. One by-product is the determination of explicit scales of time and cluster size for all three eras. In particular, if G_* is the initial free energy barrier against nucleation, then the characteristic time of the nucleation era is proportional to exp(2G_*/5k_B T), and the characteristic number of monomers in a cluster during the nucleation era is exp(3G_*/5k_B T). Finally, the 'global' model of aggregation informs the selection of the self-similar cluster size distribution that characterizes 'mature' coarsening.
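As a back-of-the-envelope illustration (mine, not part of the talk), the two scalings quoted above can be evaluated directly; the barrier value below is a hypothetical example, and dimensionless prefactors are omitted, so these are relative scales only.

```python
import math

def nucleation_scales(barrier_over_kT):
    """Characteristic scales of the nucleation era, using the scalings
    quoted in the abstract: time ~ exp(2 G_*/5 k_B T), cluster size
    (in monomers) ~ exp(3 G_*/5 k_B T).  The argument is G_*/k_B T,
    the barrier in thermal units; prefactors are omitted."""
    t_scale = math.exp(2.0 * barrier_over_kT / 5.0)
    n_scale = math.exp(3.0 * barrier_over_kT / 5.0)
    return t_scale, n_scale

# hypothetical barrier of 30 k_B T, chosen only for illustration
t_scale, n_scale = nucleation_scales(30.0)
```

Note that both scales grow exponentially in the barrier height, which is why a modest change in oversaturation (and hence in G_*) changes the nucleation era dramatically.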

Thu, 02 Nov 2006
16:30
DH 1st floor SR

Granular Mechanics

George Mullenger
(University of Canterbury, NZ)
Thu, 02 Nov 2006

14:00 - 15:00
Comlab

Multivariate highly oscillatory integration

Mr Sheehan Olver
(University of Cambridge)
Abstract

The aim of this talk is to describe several methods for numerically approximating the integral of a multivariate highly oscillatory function. We begin with a review of the asymptotic and Filon-type methods developed by Iserles and Nørsett. Using a method developed by Levin as a point of departure, we will construct a new method that uses the same information as the Filon-type method and obtains the same asymptotic order, while not requiring moments. This allows us to integrate over nonsimplicial domains and with complicated oscillators.
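To make the role of asymptotic information concrete, here is a small illustration (my own, not from the talk) for the univariate model integral ∫₀¹ f(x) e^{iωx} dx: the leading term of the integration-by-parts asymptotic expansion uses only the endpoint values of f, and its accuracy improves as ω grows, while conventional quadrature needs ever more points.

```python
import cmath

def brute_force(f, omega, n=200000):
    # composite trapezoidal rule; needs very many points once omega is large
    h = 1.0 / n
    s = 0.5 * (f(0.0) + f(1.0) * cmath.exp(1j * omega))
    for k in range(1, n):
        s += f(k * h) * cmath.exp(1j * omega * k * h)
    return h * s

def asymptotic_leading_term(f, omega):
    # first term of integration by parts:
    #   I(omega) = (f(1) e^{i omega} - f(0)) / (i omega) + O(omega^-2)
    return (f(1.0) * cmath.exp(1j * omega) - f(0.0)) / (1j * omega)

f = lambda x: 1.0 / (1.0 + x)      # smooth, non-oscillatory amplitude
exact = brute_force(f, 500.0)
approx = asymptotic_leading_term(f, 500.0)
```

Filon-type methods improve on this by interpolating f and integrating the interpolant against the oscillator exactly, which is what requires moments; the Levin-type construction described in the abstract is designed to achieve the same asymptotic order without them.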

Tue, 31 Oct 2006
17:00
L1

Phan theory

Prof. S. Shpectorov
(University of Birmingham)
Mon, 30 Oct 2006
15:45
L3

Topology of moduli spaces I

Ulrike Tillmann
Abstract

1. Introduction and survey of the cohomological results

This will be a relatively gentle introduction to the topologist's point of view of Riemann's moduli space followed by a description of its rational and torsion cohomology for large genus.

Mon, 30 Oct 2006
14:15
DH 3rd floor SR

The ensemble Kalman filter: a state estimation method for hazardous weather prediction

Dr Sarah Dance
(University of Reading)
Abstract
Numerical weather prediction models require an estimate of the current state of the atmosphere as an initial condition. Observations only provide partial information, so they are usually combined with prior information, in a process called data assimilation. The dynamics of hazardous weather such as storms are highly nonlinear, with only a short predictability timescale, so it is important to use a nonlinear, probabilistic filtering method to provide the initial conditions.

Unfortunately, the state space is very large (about 10^7 variables), so approximations have to be made.

The Ensemble Kalman filter (EnKF) is a quasi-linear filter that has recently been proposed in the meteorological and oceanographic literature to solve this problem. The filter uses a forecast ensemble (a Monte Carlo sample) to estimate the prior statistics. In this talk we will describe the EnKF framework and some of its strengths and weaknesses. In particular we will demonstrate a new result that not all filters of this type bear the desired relationship to the forecast ensemble: there can be a systematic bias in the analysis ensemble mean and consequently an accompanying shortfall in the spread of the analysis ensemble as expressed by the ensemble covariance matrix. This points to the need for a restricted version of the notion of an EnKF. We have established a set of necessary and sufficient conditions for the scheme to be unbiased. Whilst these conditions are not a cure-all and cannot deal with independent sources of bias such as modelling errors, they should be useful to designers of EnKFs in the future.
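As a point of reference for the discussion above, here is a minimal sketch of one standard variant, the perturbed-observation ('stochastic') EnKF analysis step with a linear observation operator. It is illustrative only: it uses the full sample covariance (impractical at 10^7 variables) and does not incorporate the unbiasedness conditions the talk establishes.

```python
import numpy as np

def enkf_analysis(X_f, y, H, R, rng):
    """Perturbed-observation EnKF analysis step.
    X_f : (n, N) forecast ensemble (n state variables, N members)
    y   : (m,)   observation vector
    H   : (m, n) linear observation operator
    R   : (m, m) observation-error covariance
    Returns the (n, N) analysis ensemble."""
    n, N = X_f.shape
    x_mean = X_f.mean(axis=1, keepdims=True)
    A = X_f - x_mean                        # ensemble anomalies
    P_f = A @ A.T / (N - 1)                 # sample forecast covariance
    K = P_f @ H.T @ np.linalg.inv(H @ P_f @ H.T + R)   # Kalman gain
    # perturb the observation independently for each member
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return X_f + K @ (Y - H @ X_f)

# toy usage: observe the first of two state variables, accurately
rng = np.random.default_rng(0)
X_f = rng.normal(5.0, 1.0, size=(2, 200))   # forecast ensemble
H = np.array([[1.0, 0.0]])
y = np.array([0.0])
R = np.array([[0.01]])
X_a = enkf_analysis(X_f, y, H, R, rng)
```

The observation perturbation step is one place where variants differ; deterministic ('square-root') filters avoid it, and it is exactly the relationship between the analysis ensemble and the forecast ensemble that the bias result above concerns.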


Thu, 26 Oct 2006

14:00 - 15:00
Comlab

Supercomputing at Oxford

Dr Anne Trefethen
(OeRC)
Abstract

High-performance computing is an important tool for computational science. Oxford University has recently decided to invest £3M in a new supercomputing facility, which is under development now. In this seminar I will give an overview of some activities in Oxford and provide a vision for supercomputing here. I will discuss some of the numerical analysis software and tools, such as Distributed Matlab, and indicate some of the challenges at the intersection of numerical analysis and high-performance computing.

Mon, 23 Oct 2006
14:15
DH 3rd floor SR

Dual Nonlinear Filters and Entropy Production

Dr Nigel Newton
(University of Essex)
Abstract
The talk will describe recent collaborative work between the speaker and Professor Sanjoy Mitter of MIT on connections between continuous-time nonlinear filtering theory and nonequilibrium statistical mechanics. The study of nonlinear filters from a (Shannon) information-theoretic viewpoint reveals two flows of information, dubbed 'supply' and 'dissipation'. These characterise, in a dynamic way, the dependencies between the past, present and future of the signal and observation processes. In addition, signal and nonlinear filter processes exhibit a number of symmetries (in particular, they are jointly and marginally Markov), and these allow the construction of dual filtering problems by time reversal. The information supply and dissipation processes of a dual problem have rates equal to those of the original, but with supply and dissipation exchanging roles. The joint (signal-filter) process of a nonlinear filtering problem is unusual among Markov processes in that it exhibits one-way flows of information between components.

The concept of entropy flow in the stationary distribution of a Markov process is at the heart of a modern theory of nonequilibrium statistical mechanics, based on stochastic dynamics. In this, a rate of entropy flow is defined by means of time averages of stationary ergodic processes. Such a definition is inadequate in the dynamic theory of nonlinear filtering. Instead a rate of entropy production can be defined, which is based on only the (current) local characteristics of the Markov process. This can be thought of as an 'entropic derivative'. The rate of entropy production of the joint process of a nonlinear filtering problem contains an 'interactive' component equal to the sum of the information supply and dissipation rates.

These connections between nonlinear filtering and statistical mechanics allow a certain degree of cross-fertilisation between the fields. For example, the nonlinear filter, viewed as a statistical mechanical system, is a type of perpetual motion machine, and provides a precise quantitative example of Landauer's Principle. On the other hand, the theory of dissipative statistical mechanical systems can be brought to bear on the study of sub-optimal filters. On a more philosophical level, we might ask what a nonlinear filter can tell us about the direction of thermodynamic time.
Mon, 23 Oct 2006
12:00
L3

Einstein Geometry and Conformal Field Theory

James Sparks
(Oxford)
Abstract
I shall describe two recent results in Sasaki-Einstein geometry, which is the odd-dimensional cousin of Kahler-Einstein geometry, and how they are related to four-dimensional superconformal field theory (SCFT) via the AdS/CFT correspondence. The first is a proof that the volumes of such Einstein manifolds are always algebraic numbers, which reflects a similar statement about central charges in SCFTs due to Intriligator and Wecht. The second describes two simple holomorphic obstructions to the existence of such Einstein metrics. In such obstructed cases the non-existence of the dual superconformal fixed point may be proven by a simple application of the unitarity bound and the “a-theorem”, respectively, and these may be related directly to the geometrical obstructions via AdS/CFT arguments. On the mathematical side, these are new simple obstructions to the existence of Kahler-Einstein metrics on Fano orbifolds.
