Cover’s celebrated theorem states that the long-run yield of a properly chosen “universal” portfolio is as good as the long-run yield of the best retrospectively chosen constant rebalanced portfolio. We formulate an abstract principle behind such a universality phenomenon, valid for general optimization problems in the long run. This allows us to obtain new results on model-free portfolio optimization, in particular in continuous time, involving larger classes of investment strategies. These model-free results are complemented by a comparison with the log-optimal numeraire portfolio obtained when fixing a stochastic model for the asset prices. The talk is based on joint work with Walter Schachermayer and Leonard Wong.
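Cover's construction can be illustrated with a short simulation: the wealth of the uniform mixture over all constant rebalanced portfolios (the universal portfolio) versus the best such portfolio chosen in hindsight. This is only a sketch with made-up return data; the variable names and parameters below are our own illustrative assumptions, not from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 250
# hypothetical daily gross returns for two assets (illustrative data only)
returns = np.exp(rng.normal([0.0003, 0.0002], [0.01, 0.02], size=(T, 2)))

def crp_wealth(b, returns):
    """Terminal wealth of the constant rebalanced portfolio keeping a
    fraction b in asset 1 and 1 - b in asset 2, rebalancing every period."""
    return float(np.prod(b * returns[:, 0] + (1 - b) * returns[:, 1]))

grid = np.linspace(0.0, 1.0, 101)
wealths = np.array([crp_wealth(b, returns) for b in grid])

universal = wealths.mean()   # universal portfolio: uniform mixture over CRPs
best_crp = wealths.max()     # best constant rebalanced portfolio in hindsight
```

The mixture's wealth trails the hindsight optimum only by a factor polynomial in $T$, so the exponential growth rates agree in the long run, which is the universality phenomenon the talk abstracts.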

# Past Mathematical and Computational Finance Seminar

Thomas Piketty's influential book “Capital in the Twenty-First Century” documents the marked and unequivocal rise of income and wealth inequality observed across the developed world in the last three decades. His extrapolations into the distant future are much more controversial and have been subject to various criticisms from both mainstream and heterodox economists. This motivates the search for an alternative standpoint incorporating heterodox insights such as endogenous money and the lessons from the Cambridge capital controversies. We argue that the Goodwin-Keen approach paves the road towards such an alternative.

We first consider a modified Goodwin-Keen model driven by consumption by households, instead of investment by firms, leading to the same qualitative features as the original Keen 1995 model, namely the existence of an undesirable equilibrium characterized by an infinite private debt ratio and zero employment, in addition to a desirable one with finite debt and non-zero employment. By further subdividing the household sector into workers and investors, we are able to investigate their relative income and wealth ratios in the context of these two long-run equilibria, providing a testable link between asymptotic inequality and private debt accumulation.
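For readers unfamiliar with this family of models, the classical Goodwin growth cycle (without the private-debt dynamics that the Keen extension adds) can be simulated in a few lines. All functional forms and parameter values below are illustrative textbook choices, not taken from the talk's model.

```python
import numpy as np

# classical Goodwin model: u = wage share, v = employment rate
# (illustrative parameters; the talk's variant adds household debt dynamics)
alpha, beta = 0.02, 0.01   # productivity and labour-force growth rates
sigma = 3.0                # capital-output ratio
gamma, r = 0.5, 0.55       # linear Phillips curve: wage growth = -gamma + r * v

def simulate(u0=0.9, v0=0.93, dt=0.01, n_steps=10_000):
    """Euler-discretised Goodwin cycle around its interior equilibrium."""
    u, v = u0, v0
    path = np.empty((n_steps, 2))
    for t in range(n_steps):
        du = u * (-gamma + r * v - alpha)          # real wages vs productivity
        dv = v * ((1 - u) / sigma - alpha - beta)  # accumulation vs labour growth
        u += du * dt
        v += dv * dt
        path[t] = u, v
    return path

path = simulate()
```

The system cycles around the interior equilibrium $u^* = 1 - \sigma(\alpha + \beta)$, $v^* = (\gamma + \alpha)/r$; the Keen extension appends a debt-ratio equation whose divergence produces the second, undesirable equilibrium described above.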

We consider decision makers who become impatient when their assets' prices remain in undesirable low regions for a significant amount of time, and who are risk averse to negative price jumps. We wish to study the unusual reactions of investors under such adverse market conditions. In mathematical terms, we study the optimal exercising of an American call option over a random time horizon under spectrally negative Lévy models. The random time horizon is modeled by an alarm of the so-called Omega default clock from insurance, which goes off when the cumulative amount of time spent by the asset price in an undesirable low region exceeds an independent exponential random time. We show that the optimal exercise strategies vary both quantitatively and qualitatively with the levels of impatience and nervousness of the investors, and we give a complete characterization of all optimal exercising thresholds.
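The Omega clock mechanism is straightforward to simulate. The sketch below substitutes Brownian motion for the talk's spectrally negative Lévy process, and every parameter (barrier, clock rate, horizon) is an illustrative assumption.

```python
import numpy as np

def omega_alarm_time(rng, x0=0.5, barrier=0.0, lam=2.0,
                     horizon=5.0, n_steps=5000):
    """Simulate the Omega default clock for a Brownian price proxy: the
    alarm rings once the occupation time of {X_t < barrier} exceeds an
    independent Exp(lam) budget (Brownian motion stands in for the talk's
    spectrally negative Levy model; all parameters are illustrative)."""
    dt = horizon / n_steps
    budget = rng.exponential(1.0 / lam)   # Exp(lam) occupation-time budget
    x, occupation = x0, 0.0
    for step in range(1, n_steps + 1):
        x += rng.normal(0.0, np.sqrt(dt))
        if x < barrier:
            occupation += dt
            if occupation > budget:
                return step * dt          # alarm rings: random horizon ends
    return np.inf                         # no alarm before the fixed horizon

rng = np.random.default_rng(4)
alarms = [omega_alarm_time(rng) for _ in range(200)]
```

Note that, unlike a first-passage default time, the clock tolerates brief excursions below the barrier: only the accumulated occupation time matters.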

The share of market making conducted by high-frequency trading (HFT) firms has been rising steadily. A distinguishing feature of HFTs is that they trade intraday, ending the day flat. To shed light on the economics of HFTs, and in a departure from existing market making theories, we model an HFT that has access to unlimited leverage intraday but must fund any end-of-day inventory at an exogenously determined cost. Even though the inventory costs occur only at the end of the day, they impact intraday price and liquidity dynamics. This gives rise to an endogenous intraday price impact mechanism. As time approaches the end of the trading day, the sensitivity of prices to inventory levels intensifies, making price impact stronger and widening bid-ask spreads. Moreover, imbalances of buy and sell orders may catalyze price hikes and drops, even under fixed supply and demand functions. Empirically, we show that these predictions are borne out in the U.S. Treasury market, where bid-ask spreads and price impact tend to rise towards the end of the day. Furthermore, price movements are negatively correlated with changes in inventory levels as measured by the cumulative net trading volume.

(based on joint work with Tobias Adrian, Erik Vogt, and Hongzhong Zhang)

We propose a new flexible unified framework for studying the time consistency property, suited for a large class of maps that are defined on the set of all cash flows and postulated to satisfy only two properties: monotonicity and locality. This framework integrates the existing forms of time consistency for dynamic risk measures and dynamic performance measures (also known as acceptability indices). Time consistency is defined in terms of an update rule, a novel notion that will be discussed in detail and illustrated through various examples. Finally, we will present some connections between existing popular forms of time consistency.

This is a joint work with Tomasz R. Bielecki and Marcin Pitera.

We develop a Bayesian methodology for systemic risk assessment in financial networks such as the interbank market. Nodes represent participants in the network and weighted directed edges represent liabilities. Often, for every participant, only the total liabilities and total assets within this network are observable. However, systemic risk assessment needs the individual liabilities. We propose a model for the individual liabilities, which, following a Bayesian approach, we then condition on the observed total liabilities and assets and, potentially, on certain observed individual liabilities. We construct a Gibbs sampler to generate samples from this conditional distribution. These samples can be used in stress testing, giving probabilities for the outcomes of interest. As one application we derive default probabilities of individual banks and discuss their sensitivity with respect to prior information included to model the network. An R-package implementing the methodology is provided. (This is joint work with Axel Gandy (Imperial College London).)
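The core sampling difficulty is generating liabilities matrices consistent with the observed row sums (total liabilities) and column sums (total interbank assets). The sketch below shows one margin-preserving local update of the kind such a Gibbs sampler can build on; it is not the paper's actual sampler (which also weights moves by the prior on individual liabilities), and all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
# hypothetical interbank liabilities: L[i, j] = amount bank i owes bank j
L = rng.exponential(1.0, size=(n, n))
np.fill_diagonal(L, 0.0)
row = L.sum(axis=1)   # observed total liabilities of each bank
col = L.sum(axis=0)   # observed total interbank assets of each bank

def margin_preserving_step(L, rng):
    """Resample a 2x2 off-diagonal submatrix while keeping its row and
    column sums fixed, so the observed margins are conserved exactly."""
    i, k = rng.choice(n, size=2, replace=False)
    j, l = [m for m in range(n) if m not in (i, k)][:2]  # avoid the diagonal
    a, b, c, d = L[i, j], L[i, l], L[k, j], L[k, l]
    lo, hi = max(-a, -d), min(b, c)        # keep all four entries nonnegative
    delta = rng.uniform(lo, hi)
    L[i, j] += delta; L[i, l] -= delta
    L[k, j] -= delta; L[k, l] += delta

for _ in range(1000):
    margin_preserving_step(L, rng)
```

Running many such steps produces samples from the conditional distribution over networks; stress tests and default probabilities are then Monte Carlo averages over these samples.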

We propose a randomised version of the Heston model (a widely used stochastic volatility model in mathematical finance), assuming that the starting point of the variance process is a random variable. In such a system, we study the small- and large-time behaviour of the implied volatility, and show that the proposed randomisation generates a short-maturity smile much steeper (`with explosion') than in the standard Heston model, thereby palliating the deficiency of classical stochastic volatility models in short time. We precisely quantify the speed of explosion of the smile for short maturities in terms of the right tail of the initial distribution, and in particular show that an explosion rate of $t^\gamma$ ($\gamma \in [0, 1/2]$) for the squared implied volatility, as observed in market data, can be obtained by a suitable choice of randomisation. The proofs are based on large deviations techniques and the theory of regular variation. Joint work with Fangwei Shi (Imperial College London).
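The qualitative effect of randomising the initial variance can be seen in a short Monte Carlo experiment: mixing over starting variances with a heavy right component inflates short-maturity out-of-the-money option prices relative to the fixed-start model. All parameters and the mixing distribution below are illustrative assumptions, not the talk's specification.

```python
import numpy as np

# illustrative Heston parameters (not taken from the talk)
kappa, theta, xi, rho = 2.0, 0.04, 0.5, -0.7
n_paths, n_steps, T = 20_000, 50, 0.1

def heston_terminal(v0, seed=3):
    """Full-truncation Euler scheme for the Heston log-price started at
    variance v0; a fixed seed gives common random numbers across v0."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    log_s = np.zeros(n_paths)                  # S_0 = 1
    v = np.full(n_paths, float(v0))
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(n_paths)
        v_plus = np.maximum(v, 0.0)            # truncate negative variance
        log_s += -0.5 * v_plus * dt + np.sqrt(v_plus * dt) * z1
        v += kappa * (theta - v_plus) * dt + xi * np.sqrt(v_plus * dt) * z2
    return np.exp(log_s)

def otm_call(v0, strike=1.1):
    """Monte Carlo price of a short-maturity out-of-the-money call."""
    return float(np.maximum(heston_terminal(v0) - strike, 0.0).mean())

price_fixed = otm_call(0.04)
# randomised start: a three-point mixture with a heavy right component
price_mixed = 0.5 * otm_call(0.02) + 0.3 * otm_call(0.04) + 0.2 * otm_call(0.25)
```

The heavier the right tail of the initial-variance distribution, the larger the out-of-the-money wing for small $T$, which is the smile-explosion effect the talk quantifies via large deviations.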