We propose a new, flexible, unified framework for studying the time consistency property, suited for a large class of maps that are defined on the set of all cash flows and postulated to satisfy only two properties -- monotonicity and locality. This framework integrates the existing forms of time consistency for dynamic risk measures and dynamic performance measures (also known as acceptability indices). Time consistency is defined in terms of an update rule, a novel notion that will be discussed in detail and illustrated through various examples. Finally, we will present some connections between existing popular forms of time consistency.

This is a joint work with Tomasz R. Bielecki and Marcin Pitera.

# Past Mathematical and Computational Finance Seminar

We develop a Bayesian methodology for systemic risk assessment in financial networks such as the interbank market. Nodes represent participants in the network and weighted directed edges represent liabilities. Often, for every participant, only the total liabilities and total assets within this network are observable. However, systemic risk assessment needs the individual liabilities. We propose a model for the individual liabilities, which, following a Bayesian approach, we then condition on the observed total liabilities and assets and, potentially, on certain observed individual liabilities. We construct a Gibbs sampler to generate samples from this conditional distribution. These samples can be used in stress testing, giving probabilities for the outcomes of interest. As one application we derive default probabilities of individual banks and discuss their sensitivity with respect to the prior information used to model the network. An R package implementing the methodology is provided. (This is joint work with Axel Gandy (Imperial College London).)
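The abstract's Gibbs sampler targets a model-based conditional distribution that is not spelled out here. As a loud simplification, the sketch below only illustrates the combinatorial core of sampling liabilities matrices with fixed margins: resampling random 2x2 blocks so that every row sum (total liabilities) and column sum (total assets) stays fixed. The function name and the 4x4 example matrix are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def resample_blocks(L, n_sweeps=500):
    """Toy Gibbs-type sampler (illustrative, not the paper's model):
    repeatedly resample a random 2x2 block of the nonnegative
    liabilities matrix L, uniformly over all values that preserve
    every row sum and column sum. Blocks touching the zero diagonal
    (no self-liabilities) are skipped."""
    L = L.copy()
    n = L.shape[0]
    for _ in range(n_sweeps):
        i, j = rng.choice(n, size=2, replace=False)   # two rows
        k, l = rng.choice(n, size=2, replace=False)   # two columns
        if i in (k, l) or j in (k, l):                # block would touch diagonal
            continue
        # a shift of d keeps all margins fixed; nonnegativity bounds d
        lo = -min(L[i, k], L[j, l])
        hi = min(L[i, l], L[j, k])
        d = rng.uniform(lo, hi)
        L[i, k] += d; L[i, l] -= d
        L[j, k] -= d; L[j, l] += d
    return L

# hypothetical 4x4 interbank liabilities matrix with zero diagonal
L0 = np.array([[0., 1., 2., 1.],
               [2., 0., 1., 1.],
               [1., 1., 0., 2.],
               [1., 2., 1., 0.]])
sample = resample_blocks(L0)
```

Each accepted move is a Gibbs step on the polytope of nonnegative matrices with the given margins; the paper's sampler additionally conditions on the liability model and any observed entries.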

We propose a randomised version of the Heston model--a widely used stochastic volatility model in mathematical finance--assuming that the starting point of the variance process is a random variable. In such a system, we study the small- and large-time behaviours of the implied volatility, and show that the proposed randomisation generates a short-maturity smile much steeper ('with explosion') than in the standard Heston model, thereby palliating the deficiency of classical stochastic volatility models in short time. We precisely quantify the speed of explosion of the smile for short maturities in terms of the right tail of the initial distribution, and in particular show that an explosion rate of $t^\gamma$ ($\gamma \in [0, 1/2]$) for the squared implied volatility--as observed on market data--can be obtained by a suitable choice of randomisation. The proofs are based on large deviations techniques and the theory of regular variations. Joint work with Fangwei Shi (Imperial College London).
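A randomised starting variance is easy to simulate: draw $v_0$ from a chosen distribution per path, then run a standard Heston discretisation. The sketch below prices a call by Monte Carlo with $v_0$ drawn from an exponential distribution (whose right tail is the kind of object the abstract's explosion rate depends on). All parameter values are illustrative assumptions, not taken from the talk, and the full-truncation Euler scheme is just one common discretisation choice.

```python
import numpy as np

rng = np.random.default_rng(42)

# illustrative Heston parameters (assumed, not from the talk)
kappa, theta, xi, rho = 2.0, 0.04, 0.3, -0.7
S0, K, T = 1.0, 1.0, 0.5
n_paths, n_steps = 20000, 100
dt = T / n_steps

# randomised start: v0 drawn per path from an exponential distribution;
# its right tail governs the short-maturity smile behaviour
v0 = rng.exponential(scale=theta, size=n_paths)

logS = np.full(n_paths, np.log(S0))
v = v0.copy()
for _ in range(n_steps):
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)
    vp = np.maximum(v, 0.0)                    # full-truncation Euler
    logS += -0.5 * vp * dt + np.sqrt(vp * dt) * z1
    v += kappa * (theta - vp) * dt + xi * np.sqrt(vp * dt) * z2

price = np.mean(np.maximum(np.exp(logS) - K, 0.0))  # MC call price
```

Repeating this across strikes and small maturities, and inverting to implied volatility, is how one would inspect the steepened short-maturity smile numerically.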

We introduce sufficient conditions for the solution of a multi-dimensional, Markovian BSDE to have a density. We show that the solution of a system of BSDEs possesses a density if the corresponding semilinear PDE exhibits certain regularity properties, which we verify in the case of several examples.

Backward SDEs have proven to be a useful tool in mathematical finance. Their applications include the solution to various pricing and equilibrium problems in complete and incomplete markets, the estimation of value adjustments in the presence of funding costs, and the solution to many utility/risk optimisation problems.

In this work, we prove an explicit error expansion for the approximation of BSDEs. We focus on the cubature method of solution. To profit fully from these expansions, e.g. to design high-order approximation methods, we additionally need to control the complexity growth of the base algorithm. In our work, this is achieved by using a sparse grid representation. We present several numerical results that confirm the efficiency of our new method. Based on joint work with J.F. Chassagneux.
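To fix ideas on what "approximation of BSDEs" means, here is a deliberately minimal backward scheme -- not the cubature/sparse-grid method of the talk -- for the linear BSDE $dY_t = rY_t\,dt + Z_t\,dW_t$, $Y_T = g(W_T)$, whose solution satisfies $Y_0 = e^{-rT}\,\mathbb{E}[g(W_T)]$. The conditional expectations are computed exactly on a recombining binomial tree; all parameter values are illustrative.

```python
import numpy as np

# explicit backward Euler scheme for dY = r*Y dt + Z dW, Y_T = g(W_T),
# so that Y_0 should approximate exp(-r*T) * E[g(W_T)]
r, T, n = 0.1, 1.0, 100
dt = T / n
dw = np.sqrt(dt)                       # binomial increment +/- sqrt(dt)

w = dw * np.arange(-n, n + 1, 2)       # reachable values of W_T on the tree
Y = w**2                               # terminal condition g(x) = x^2
for _ in range(n):
    # exact conditional expectation over the two equally likely moves,
    # followed by one explicit Euler step for the driver f(y) = -r*y
    Y = 0.5 * (Y[1:] + Y[:-1]) * (1.0 - r * dt)
Y0 = Y[0]                              # approximates exp(-r*T) * T
```

For $g(x)=x^2$ the tree computes $\mathbb{E}[W_T^2]=T$ exactly, so the remaining error is the $(1-r\,dt)^n$ vs. $e^{-rT}$ discount mismatch; error expansions of the kind proved in the talk describe exactly how such errors behave, for far more sophisticated schemes.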

We show how to adapt methods originally developed in model-independent finance / martingale optimal transport to give a geometric description of optimal stopping times tau of Brownian motion subject to the constraint that the distribution of tau is a given distribution. The methods work for a large class of cost processes. (At a minimum we need the cost process to be adapted. Continuity assumptions can be used to guarantee existence of solutions.) We find that for many of the cost processes one can come up with, the solution is given by the first hitting time of a barrier in a suitable phase space. As a by-product we thus recover Anulova's classical solution of the inverse first passage time problem.
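The simplest instance of such a barrier-type stopping rule is the first time Brownian motion hits a flat level $b$, where the reflection principle gives $P(\tau \le t) = 2(1 - \Phi(b/\sqrt{t}))$. The sketch below checks this by Monte Carlo; the barriers of the talk live in a richer phase space, so this only illustrates the flat special case, with assumed parameter values.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(7)

# tau = first hitting time of the flat barrier b by Brownian motion;
# reflection principle: P(tau <= t) = 2 * (1 - Phi(b / sqrt(t)))
b, T = 1.0, 1.0
n_paths, n_steps = 20000, 2000
dt = T / n_steps

W = np.zeros(n_paths)
hit = np.zeros(n_paths, dtype=bool)
for _ in range(n_steps):
    W += sqrt(dt) * rng.standard_normal(n_paths)
    hit |= W >= b                      # has this path crossed b yet?
p_hit = hit.mean()                     # empirical P(tau <= T)

Phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
exact = 2.0 * (1.0 - Phi(b / sqrt(T)))
```

The discrete monitoring slightly undercounts crossings, so the empirical probability sits a little below the exact value; refining the time grid shrinks this bias.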