Mon, 09 Mar 2020
16:00
L4

A Minkowski problem and the Brunn-Minkowski inequality for nonlinear capacity

Murat Akman
(University of Essex)
Abstract


The classical Minkowski problem consists in finding a convex polyhedron from data consisting of the normals to its faces and their surface areas. In the smooth case, the corresponding problem for convex bodies is to find the convex body given the Gauss curvature of its boundary as a function of the unit normal. The solution of the problem consists of three parts: existence, uniqueness and regularity.
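
For orientation, the classical result can be stated as follows (a standard formulation, not part of the abstract): unit vectors $u_1,\dots,u_k$ spanning $\mathbb{R}^n$ and positive numbers $f_1,\dots,f_k$ are the outer face normals and face areas of a convex polytope, unique up to translation, if and only if

$$\sum_{i=1}^{k} f_i\, u_i = 0.$$

In measure form: a finite Borel measure $\mu$ on $\mathbb{S}^{n-1}$ is the surface area measure of some convex body precisely when it is not concentrated on any great subsphere and $\int_{\mathbb{S}^{n-1}} \xi \, d\mu(\xi) = 0$.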

 

In this talk, we study a Minkowski problem for a certain measure, called the $p$-capacitary surface area measure, associated to a compact convex set $E$ with nonempty interior and its $p$-harmonic capacitary function (the solution to the $p$-Laplace equation in the complement of $E$). If $\mu_p$ denotes this measure, then the Minkowski problem we consider in this setting is the following: given a finite positive Borel measure $\mu$ on $\mathbb{S}^{n-1}$, find necessary and sufficient conditions under which there exists a convex body $E$ with $\mu_p = \mu$. We will discuss existence, uniqueness, and regularity for this problem, which has deep connections with the Brunn-Minkowski inequality for $p$-capacity and the Monge-Ampère equation.
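
For context (standard definitions, not taken from the abstract): for $1 < p < n$ the $p$-capacitary function $U_E$ of $E$ solves

$$\nabla \cdot \big( |\nabla U_E|^{p-2} \nabla U_E \big) = 0 \quad \text{in } \mathbb{R}^n \setminus E, \qquad U_E = 1 \text{ on } \partial E, \qquad U_E(x) \to 0 \text{ as } |x| \to \infty,$$

and, roughly speaking, in the smooth case $\mu_p$ is the push-forward of $|\nabla U_E|^{p}\, d\mathcal{H}^{n-1}$ on $\partial E$ under the Gauss map. The Brunn-Minkowski inequality for $p$-capacity referred to above states that for convex bodies $E_0, E_1$ and $0 \le \lambda \le 1$,

$$\mathrm{Cap}_p\big( (1-\lambda) E_0 + \lambda E_1 \big)^{\frac{1}{n-p}} \;\ge\; (1-\lambda)\, \mathrm{Cap}_p(E_0)^{\frac{1}{n-p}} + \lambda\, \mathrm{Cap}_p(E_1)^{\frac{1}{n-p}},$$

with equality precisely when $E_0$ and $E_1$ are translates and dilates of one another.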

 

Fri, 04 Feb 2011

14:00 - 15:00
L1

Modelling and analysis of animal movement behaviour

Dr Edward Codling
(University of Essex)
Abstract

Mathematical modelling of the movement of animals, micro-organisms and cells is of great relevance in the fields of biology, ecology and medicine. Movement models can take many different forms, but the most widely used are based on extensions of simple random walk processes. In this talk, I will review some of the basic ideas behind the theory of random walks and diffusion processes and discuss how these models are used in the context of modelling animal movement. I will present several case studies, each of which is an extension or application of some of the simple random walk ideas discussed previously. Specifically, I will consider problems related to biased and correlated movements, path analysis of movement data, sampling and processing issues, and the problem of determining movement processes from observed patterns. I will also discuss some biological examples of how these models can be used, including chemosensory movements and the interaction between zooplankton and the movements of fish.
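
As a concrete illustration of the biased and correlated movements mentioned above, here is a minimal simulation sketch (not code from the talk; all parameter values are illustrative assumptions) of a biased correlated random walk, in which each step's direction blends persistence in the previous heading with attraction towards a preferred direction:

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not from the talk)
n_steps = 1000                   # number of movement steps
step_length = 1.0                # constant step length
w = 0.7                          # weight on persistence (1 = pure correlated walk, 0 = pure biased walk)
bias_dir = np.array([1.0, 0.0])  # preferred direction of travel
turn_sd = 0.3                    # std. dev. of the random turning angle (radians)

pos = np.zeros((n_steps + 1, 2))
heading = rng.uniform(-np.pi, np.pi)
direction = np.array([np.cos(heading), np.sin(heading)])

for t in range(n_steps):
    # Blend the previous direction (correlation) with the preferred direction (bias) ...
    mean_dir = w * direction + (1.0 - w) * bias_dir
    mean_dir /= np.linalg.norm(mean_dir)
    # ... then perturb it by a random turning angle.
    phi = turn_sd * rng.standard_normal()
    c, s = np.cos(phi), np.sin(phi)
    direction = np.array([c * mean_dir[0] - s * mean_dir[1],
                          s * mean_dir[0] + c * mean_dir[1]])
    pos[t + 1] = pos[t] + step_length * direction

print("net displacement:", np.linalg.norm(pos[-1] - pos[0]))

Plotting the track, or the mean squared displacement against time, shows the usual diagnostic behaviour: nearly ballistic motion over short times, where persistence dominates, tending towards drift plus diffusion over long times.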

Mon, 23 Oct 2006
14:15
DH 3rd floor SR

Dual Nonlinear Filters and Entropy Production

Dr Nigel Newton
(University of Essex)
Abstract
The talk will describe recent collaborative work between the speaker and Professor Sanjoy Mitter of MIT on connections between continuous-time nonlinear filtering theory and nonequilibrium statistical mechanics. The study of nonlinear filters from a (Shannon) information-theoretic viewpoint reveals two flows of information, dubbed 'supply' and 'dissipation'. These characterise, in a dynamic way, the dependencies between the past, present and future of the signal and observation processes. In addition, signal and nonlinear filter processes exhibit a number of symmetries (in particular, they are jointly and marginally Markov), and these allow the construction of dual filtering problems by time reversal. The information supply and dissipation processes of a dual problem have rates equal to those of the original, but with supply and dissipation exchanging roles. The joint (signal-filter) process of a nonlinear filtering problem is unusual among Markov processes in that it exhibits one-way flows of information between components.

The concept of entropy flow in the stationary distribution of a Markov process is at the heart of a modern theory of nonequilibrium statistical mechanics, based on stochastic dynamics. In this, a rate of entropy flow is defined by means of time averages of stationary ergodic processes. Such a definition is inadequate in the dynamic theory of nonlinear filtering. Instead, a rate of entropy production can be defined, which is based only on the (current) local characteristics of the Markov process. This can be thought of as an 'entropic derivative'. The rate of entropy production of the joint process of a nonlinear filtering problem contains an 'interactive' component equal to the sum of the information supply and dissipation rates.

These connections between nonlinear filtering and statistical mechanics allow a certain degree of cross-fertilisation between the fields. For example, the nonlinear filter, viewed as a statistical mechanical system, is a type of perpetual motion machine, and provides a precise quantitative example of Landauer's Principle. On the other hand, the theory of dissipative statistical mechanical systems can be brought to bear on the study of sub-optimal filters. On a more philosophical level, we might ask what a nonlinear filter can tell us about the direction of thermodynamic time.
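
As a toy illustration of the information flows described above (not the continuous-time nonlinear setting of the talk, and all model parameters are illustrative assumptions), one can take a discrete-time linear-Gaussian signal observed in noise and track how much information each new observation supplies about the current state; in the Gaussian case this is half the log-ratio of the prior to the posterior variance:

import numpy as np

rng = np.random.default_rng(1)

# Discrete-time linear-Gaussian toy model (assumed for illustration):
# signal  x_t = a * x_{t-1} + process noise,   observation  y_t = x_t + observation noise.
a, q, r = 0.95, 0.1, 0.5      # state coefficient, process noise variance, observation noise variance
T = 500

x = 0.0
m, P = 0.0, q / (1 - a**2)    # Kalman filter mean and variance, started from the stationary prior
info_per_step = []

for t in range(T):
    # Simulate the signal and its noisy observation
    x = a * x + np.sqrt(q) * rng.standard_normal()
    y = x + np.sqrt(r) * rng.standard_normal()

    # Kalman prediction step
    m_pred = a * m
    P_pred = a * a * P + q

    # Kalman update step
    K = P_pred / (P_pred + r)
    m = m_pred + K * (y - m_pred)
    P = (1.0 - K) * P_pred

    # Information (in nats) supplied by y_t about x_t, given past observations
    info_per_step.append(0.5 * np.log(P_pred / P))

print("mean per-step information supply (nats):", np.mean(info_per_step))

In the stationary regime this per-step quantity settles to a constant, a crude discrete analogue of the steady information supply rate discussed in the talk; the continuous-time, nonlinear version is considerably more delicate.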