Nucleation, Bubble Growth and Coalescence
Abstract
In gas-liquid two-phase pipe flows, flow regime transition is associated with changes in the micro-scale geometry of the flow. In particular, the bubbly-slug transition is associated with the coalescence and break-up of bubbles in a turbulent pipe flow. We consider a sequence of models designed to facilitate an understanding of this process. The simplest such model is a classical coalescence model in one spatial dimension. This is formulated as a stochastic process involving nucleation and subsequent growth of ‘seeds’, which coalesce as they grow. We study the evolution of the bubble size distribution both analytically and numerically. We also present some ideas concerning ways in which the model can be extended to more realistic two- and three-dimensional geometries.
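As a rough numerical illustration of the one-dimensional model (a minimal sketch only; the Poisson nucleation law, the constant growth speed and the suppression of nucleation inside existing bubbles are assumptions made here, not necessarily the choices analysed in the talk), one can simulate seeds nucleating on a line, growing at constant speed and merging on contact:

```python
import numpy as np

def simulate_1d_coalescence(L=1000.0, nucleation_rate=0.01, growth=1.0,
                            t_end=5.0, dt=0.01, rng=None):
    """Simulate nucleation, growth and coalescence of 'seeds' on [0, L).

    Seeds nucleate as a space-time Poisson process (intensity `nucleation_rate`
    per unit length and time) and grow at speed `growth` in both directions;
    intervals that touch are merged.  Returns the interval lengths at t_end.
    """
    rng = np.random.default_rng() if rng is None else rng
    intervals = []                       # covered intervals as (left, right)
    t = 0.0
    while t < t_end:
        # grow every interval on both sides
        intervals = [(l - growth * dt, r + growth * dt) for l, r in intervals]
        # nucleate new seeds, ignoring those falling inside existing bubbles
        for x in rng.uniform(0.0, L, size=rng.poisson(nucleation_rate * L * dt)):
            if not any(l <= x <= r for l, r in intervals):
                intervals.append((x, x))
        # coalescence: merge overlapping intervals
        intervals.sort()
        merged = []
        for l, r in intervals:
            if merged and l <= merged[-1][1]:
                merged[-1] = (merged[-1][0], max(merged[-1][1], r))
            else:
                merged.append((l, r))
        intervals = merged
        t += dt
    return np.array([r - l for l, r in intervals])

sizes = simulate_1d_coalescence()        # empirical bubble size distribution
```

The returned interval lengths give an empirical bubble size distribution whose evolution can be tracked by repeating the simulation for increasing values of t_end.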
Using signatures to predict amyotrophic lateral sclerosis progression
Abstract
Medical data often comes as multi-modal, streamed data. The challenge is to extract useful information from such data in a setting where gathering it is expensive. In this talk, I show how signatures can be used to predict the progression of ALS.
Detection of Transient Data Using Signature Features
Abstract
In this talk, we consider the supervised learning problem in which the explanatory variable is a data stream. We provide an approach based on identifying carefully chosen features of the stream that allow linear regression to be used to characterise the functional relationship between the explanatory variables and the conditional distribution of the response. The methods used to develop and justify this approach, such as the signature of a stream and the shuffle product of tensors, are standard tools in the theory of rough paths; they provide a unified, non-parametric approach with potentially significant dimension reduction. We apply the method to the example of detecting transient datasets and demonstrate its superior effectiveness when benchmarked against supervised learning methods applied to the raw data.
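As a toy illustration of the signature-feature approach (a minimal sketch under assumed data, not the pipeline of the talk), the snippet below computes the level-2 truncated signature of a piecewise-linear stream via Chen's identity and fits a linear model on the resulting features. The response, the product of the total increments of the two coordinates, is by the shuffle identity an exact linear functional of the level-2 signature, so ordinary least squares can recover it:

```python
import numpy as np

def signature_level2(path):
    """Level-1 and level-2 iterated integrals of a piecewise-linear path
    (shape (n_points, d)), assembled segment by segment with Chen's identity."""
    d = path.shape[1]
    s1, s2 = np.zeros(d), np.zeros((d, d))
    for delta in np.diff(path, axis=0):
        s2 += np.outer(s1, delta) + 0.5 * np.outer(delta, delta)
        s1 += delta
    return np.concatenate([s1, s2.ravel()])

rng = np.random.default_rng(0)
X, y = [], []
for _ in range(200):
    path = np.cumsum(rng.normal(size=(50, 2)), axis=0)       # a toy 2-d stream
    inc = path[-1] - path[0]
    X.append(signature_level2(path))
    y.append(inc[0] * inc[1] + 0.01 * rng.normal())          # linear in the level-2 signature
coef, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)
```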
Topological adventures in neuroscience
Abstract
Over the past decade, and particularly over the past five years, research at the interface of topology and neuroscience has grown remarkably fast. In this talk I will briefly survey a few quite different applications of topology to neuroscience in which members of my lab have been involved over the past four years: the algebraic topology of brain structure and function, topological characterization and classification of neuron morphologies, and (if time allows) topological detection of network dynamics.
Uniqueness and stability for the shock reflection problem
Abstract
We discuss the shock reflection problem for compressible gas dynamics, the von Neumann conjectures on the transition between regular and Mach reflections, and the existence of regular reflection solutions for the potential flow equation. We then present recent results on the uniqueness and stability of regular reflection solutions for the potential flow equation in a natural class of self-similar solutions. The approach is to reduce the shock reflection problem to a free boundary problem for a nonlinear elliptic equation, and to prove uniqueness by a version of the method of continuity. A property of solutions that is important for the proof of uniqueness is the convexity of the free boundary.
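For orientation (standard background on the self-similar formulation, not a statement of the new results), the potential flow equation in self-similar coordinates $\xi = x/t$ can be written for the pseudo-velocity potential $\varphi$ as $$ \mathrm{div}\big(\rho\, D\varphi\big) + 2\rho = 0, \qquad \rho = \Big(\rho_0^{\gamma-1} - (\gamma-1)\big(\varphi + \tfrac{1}{2}|D\varphi|^2\big)\Big)^{\frac{1}{\gamma-1}}, $$ which is elliptic in the pseudo-subsonic region; in the regular reflection configuration the reflected shock plays the role of the free boundary.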
This talk is based on joint works with G.-Q. Chen and W. Xiang.
Higher Regularity of the p-Poisson Equation in the Plane
Abstract
In recent years it has been discovered that non-linear, degenerate equations like the $p$-Poisson equation $$ -\mathrm{div}(A(\nabla u)) = -\mathrm{div}(|\nabla u|^{p-2}\nabla u) = -\mathrm{div}\, F $$ also allow for optimal regularity. This equation is similar to the one arising for power-law fluids. In particular, the non-linear mapping $F \mapsto A(\nabla u)$ surprisingly satisfies the linear, optimal estimate $\|A(\nabla u)\|_X \le c\, \|F\|_X$ for several choices of the space $X$. For instance, this estimate holds for Lebesgue spaces $L^q$ (with $q \geq p'$), spaces of bounded mean oscillation and Hölder spaces $C^{0,\alpha}$ (for some $\alpha>0$).
In this talk we show that this theory can be extended to Sobolev and Besov spaces of (almost) one derivative. Our results are restricted to the case of the plane, since we use complex analysis in our proof. Moreover, we are restricted to the super-linear case $p \geq 2$, since the result fails for $p < 2$. This is joint work with Anna Kh. Balci and Markus Weimar.
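Purely as a numerical illustration of the equation itself (unrelated to the complex-analytic arguments of the talk; the grid, boundary values and step size are ad hoc assumptions), one can approximate a solution of $-\mathrm{div}(|\nabla u|^{p-2}\nabla u) = -\mathrm{div}\, F$ on the unit square by explicit gradient descent on the energy $\int \tfrac{1}{p}|\nabla u|^p - F\cdot\nabla u\,dx$:

```python
import numpy as np

def p_poisson_descent(F, p=3.0, h=1.0, n_iter=20000, tau=None):
    """Explicit gradient descent on J(u) = sum_h ( |grad u|^p / p - F . grad u ),
    whose Euler-Lagrange equation is -div(|grad u|^{p-2} grad u) = -div F.
    F has shape (ny, nx, 2); zero Dirichlet values are imposed on the boundary."""
    tau = 0.2 * h**2 if tau is None else tau            # small step for stability
    u = np.zeros(F.shape[:2])
    for _ in range(n_iter):
        ux = (np.roll(u, -1, axis=1) - u) / h            # forward differences
        uy = (np.roll(u, -1, axis=0) - u) / h
        norm = np.sqrt(ux**2 + uy**2) + 1e-12
        qx = norm**(p - 2) * ux - F[..., 0]              # A(grad u) - F
        qy = norm**(p - 2) * uy - F[..., 1]
        div = (qx - np.roll(qx, 1, axis=1)) / h + (qy - np.roll(qy, 1, axis=0)) / h
        u += tau * div                                   # descent: grad J(u) = -div(A(grad u) - F)
        u[0, :] = u[-1, :] = u[:, 0] = u[:, -1] = 0.0    # Dirichlet boundary condition
    return u

ny = nx = 64
yy, xx = np.mgrid[0:1:ny * 1j, 0:1:nx * 1j]
F = np.stack([np.sin(2 * np.pi * xx), np.cos(2 * np.pi * yy)], axis=-1)
u = p_poisson_descent(F, p=3.0, h=1.0 / (nx - 1))
```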
Nonlinear low-rank matrix completion
Abstract
The talk introduces the problem of completing a partially observed matrix whose columns obey a nonlinear structure. This is an extension of classical low-rank matrix completion, where the structure is linear. Such matrices are in general full rank, but it is often possible to exhibit a low-rank structure when the data is lifted to a higher-dimensional space of features. The presence of a nonlinear lifting makes it impossible to write the problem using common low-rank matrix completion formulations. We investigate formulations as a nonconvex optimisation problem and as optimisation on Riemannian manifolds.
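A small numerical illustration of the lifting idea (not one of the completion formulations investigated in the talk; the circle data and quadratic monomial features are assumptions made for the example): columns lying on a quadratic variety form a full-rank data matrix, yet the matrix of lifted columns is rank deficient, which is the low-rank structure the nonlinear formulations exploit.

```python
import numpy as np

def monomial_lift(X):
    """Lift each column of X (shape 2 x n) to its degree-2 monomial features
    [1, x, y, x^2, xy, y^2]."""
    x, y = X
    return np.vstack([np.ones_like(x), x, y, x**2, x * y, y**2])

rng = np.random.default_rng(1)
theta = rng.uniform(0.0, 2.0 * np.pi, size=200)
X = np.vstack([np.cos(theta), np.sin(theta)])        # columns on the unit circle
L = monomial_lift(X)                                 # lifted 6 x 200 matrix
# the relation x^2 + y^2 = 1 makes the lifted matrix rank deficient
print(np.linalg.matrix_rank(X), np.linalg.matrix_rank(L))   # prints: 2 5
```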