Forthcoming events in this series


Fri, 04 Jun 2010
14:15
DH 1st floor SR

An overview of some recent progress in incomplete-market equilibria

Gordan Zitkovic
(UT Austin)
Abstract

In addition to existence, the excess-demand approach allows us to establish uniqueness and provide efficient computational algorithms for various complete- and incomplete-market stochastic financial equilibria.

Particular attention will be paid to the case where the agents exhibit constant absolute risk aversion. An overview of recent results (including those obtained jointly with M. Anthropelos and with Y. Zhao) will be given.
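
For illustration (not part of the abstract), a schematic version of the excess-demand formulation under CARA preferences, in placeholder notation rather than the authors': agent $i$ has exponential utility with risk tolerance $\delta_i$, and an equilibrium is a zero of the aggregate excess demand,

$$U_i(x) = -e^{-x/\delta_i}, \qquad Z(p) = \sum_i \hat\pi_i(p), \qquad Z(p^*) = 0,$$

where $\hat\pi_i(p)$ denotes agent $i$'s optimal demand for the (zero-net-supply) traded claims given a candidate price system $p$.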

Fri, 21 May 2010
14:15
Oxford-Man Institute

A Non-Zero-Sum Game Approach to Convertible Bonds: Tax Benefit, Bankrupt Cost and Early/Late Calls

Nan Chen
(CUHK)
Abstract

Convertible bonds are hybrid securities that embody the characteristics of both straight bonds and equities. The conflict of interests between bondholders and shareholders affects the security prices significantly. In this paper, we investigate how to use a non-zero-sum game framework to model the interaction between bondholders and shareholders and to evaluate the bond accordingly. Mathematically, this problem can be reduced to a system of variational inequalities. We explicitly derive a unique Nash equilibrium to the game.

Our model shows that credit risk and tax benefit have considerable impacts on the optimal strategies of both parties. The shareholder may issue a call when the debt is in-the-money or out-of-the-money. This is consistent with the empirical findings of "late and early calls" (Ingersoll (1977), Mikkelson (1981), Cowan et al. (1993) and Ederington et al. (1997)). In addition, the optimal call policy under our model offers an explanation for certain stylized patterns related to the returns of company assets and stock on calls.
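
To fix ideas (a generic sketch, not the authors' exact formulation), the bondholder-shareholder interaction can be cast as a non-zero-sum stopping game: the shareholder chooses a call time $\sigma$, the bondholder a conversion time $\tau$, and a Nash equilibrium $(\sigma^*,\tau^*)$ satisfies

$$J_b(\sigma^*,\tau^*) \ge J_b(\sigma^*,\tau) \quad \text{and} \quad J_s(\sigma^*,\tau^*) \ge J_s(\sigma,\tau^*) \qquad \text{for all } \tau,\ \sigma,$$

where $J_b$ and $J_s$ are the bondholder's and shareholder's expected discounted payoffs (placeholder notation); the system of variational inequalities mentioned above is the analytic counterpart of this pair of optimality conditions.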

 

Fri, 21 May 2010
12:45
Oxford-Man Institute

Forced Sales and House Prices

John Campbell
(Harvard University)
Abstract

This paper uses data on house transactions in the state of Massachusetts over the last 20 years to show that houses sold after foreclosure, or close in time to the death or bankruptcy of at least one seller, are sold at lower prices than other houses. Foreclosure discounts are particularly large, on average 27% of the value of a house. The pattern of death-related discounts suggests that they may result from poor home maintenance by older sellers, while foreclosure discounts appear to be related to the threat of vandalism in low-priced neighborhoods. After aggregating to the zipcode level and controlling for regional price trends, the prices of forced sales are mean-reverting, while the prices of unforced sales are close to a random walk. At the zipcode level, this suggests that unforced sales take place at approximately efficient prices, while forced-sales prices reflect time-varying illiquidity in neighborhood housing markets. At a more local level, however, we find that foreclosures that take place within a quarter of a mile, and particularly within a tenth of a mile, of a house lower the price at which it is sold. Our preferred estimate of this effect is that a foreclosure at a distance of 0.05 miles lowers the price of a house by about 1%.

Fri, 14 May 2010
14:15
DH 1st floor SR

Hybrid Switching Diffusions and Applications to Stochastic Controls

George Yin
(Wayne State)
Abstract

In this talk, we report some of our recent work on hybrid switching diffusions in which continuous dynamics and discrete events coexist. Motivational examples in singularly perturbed Markovian systems, manufacturing, and financial engineering will be mentioned. After presenting criteria for recurrence and ergodicity, we consider numerical methods for controlled switching diffusions and related game problems. Rates of convergence of Markov chain approximation methods will also be studied.
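
For readers unfamiliar with the object, a hybrid switching diffusion couples a diffusion with a finite-state jump process; schematically (generic notation, not necessarily the speaker's),

$$dX(t) = b\big(X(t),\alpha(t)\big)\,dt + \sigma\big(X(t),\alpha(t)\big)\,dW(t),$$

where $\alpha(\cdot)$ takes values in a finite set and has a generator $Q(x) = (q_{ij}(x))$ that may depend on the continuous state, so the continuous dynamics and the discrete events are fully coupled.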

Fri, 07 May 2010
14:15
DH 1st floor SR

Efficiency for the Concave Order and Multivariate Comonotonicity

Rose-Anne Dana (joint with OMI)
(Dauphine)
Abstract

This talk is based on joint work with Carlier and Galichon. The paper studies efficient risk-sharing rules for the concave dominance order. For a univariate risk, it follows from a \emph{comonotone dominance principle}, due to Landsberger and Meilijson, that efficiency is characterized by a comonotonicity condition. The goal of the paper is to generalize the comonotone dominance principle, as well as the equivalence between efficiency and comonotonicity, to the multi-dimensional case. The multivariate case is more involved (in particular because there is no immediate extension of the notion of comonotonicity) and it is addressed using techniques from convex duality and optimal transportation.
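
For orientation, the standard univariate notions behind the talk (textbook definitions, not the paper's precise statements): an allocation $(X_1,\dots,X_n)$ of an aggregate risk $X=\sum_i X_i$ is comonotone if each share is a non-decreasing function of the total,

$$X_i = f_i(X), \qquad f_i \text{ non-decreasing}, \qquad \sum_{i=1}^n f_i(x) = x,$$

and the comonotone dominance principle of Landsberger and Meilijson asserts that every allocation is dominated in the concave order by a comonotone one, which is why efficiency and comonotonicity coincide in dimension one.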

Fri, 30 Apr 2010
14:15
DH 1st floor SR

Numerical Approximation and BSDE representation for Switching Problems

Romuald Elie
(Dauphine)
Abstract

Hamadène and Jeanblanc provided a BSDE representation for the resolution of two-dimensional continuous-time optimal switching problems. For example, an energy producer has the option to switch a power plant on or off depending on the current prices of electricity and of the corresponding commodity. A BSDE representation via multidimensional reflected BSDEs for this type of problem in dimension larger than 2 has been derived by Hu and Tang as well as Hamadène and Zhang. Keeping the same example in mind, one can imagine that the energy producer can use different modes of electricity production and switch between them depending on the commodity prices. We propose here an alternative BSDE representation via the addition of constraints and artificial jumps. This allows us in particular to reinterpret the solution of multidimensional reflected BSDEs in terms of one-dimensional constrained BSDEs with jumps. We provide and study numerical schemes for the approximation of these two types of BSDEs.
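
As a point of reference, a standard form of the reflected system (generic notation, not necessarily the one used in the talk): for modes $i=1,\dots,m$ with switching costs $c_{ij}$,

$$Y^i_t = g_i(X_T) + \int_t^T f_i(s,X_s,Y^i_s,Z^i_s)\,ds - \int_t^T Z^i_s\,dW_s + K^i_T - K^i_t, \qquad Y^i_t \ge \max_{j\ne i}\big(Y^j_t - c_{ij}\big),$$

with each $K^i$ non-decreasing and increasing only when the obstacle is attained; the alternative representation discussed here replaces this system by a single constrained BSDE with artificial jumps.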

Fri, 12 Mar 2010
14:15
DH 1st floor SR

Financial Markets with Uncertain Volatility

Mete Soner
Abstract

Even in simple models in which the volatility is only known to stay within two bounds, it is quite hard to price and hedge derivatives which are not Markovian. The main reason for this difficulty emanates from the fact that the probability measures are singular to each other. In this talk we will prove a martingale representation theorem for this market. This result provides a complete answer to the questions of hedging and pricing. The main tools are the theory of nonlinear G-expectations as developed by Peng, the quasi-sure stochastic analysis of Denis and Martini, and second order backward stochastic differential equations.

This is joint work with Nizar Touzi from Ecole Polytechnique and Jianfeng Zhang from the University of Southern California.
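
For background, the classical Markovian case (which the talk goes beyond): if the volatility is only known to lie in $[\underline\sigma,\overline\sigma]$ and the claim is $g(S_T)$, the superhedging price $u(t,x)$ solves the Black-Scholes-Barenblatt equation (zero interest rate for simplicity)

$$\partial_t u + \sup_{\sigma\in[\underline\sigma,\overline\sigma]} \tfrac12\,\sigma^2 x^2\,\partial_{xx} u = 0, \qquad u(T,x) = g(x).$$

No such PDE is available for non-Markovian payoffs, which is precisely the difficulty the martingale representation theorem addresses.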

 

Fri, 05 Mar 2010
14:15
L1

Finite Resource Valuations: Myths, Theory and Practice

Geoff Evatt
Abstract

The valuation of a finite resource, be it a copper mine, timber forest or gas field, has received surprisingly little attention from the academic literature. The fact that a robust, defensible and accurate valuation methodology has not been derived is due to a mixture of difficulty in modelling the numerous stochastic uncertainties involved and the complications with capturing real day-to-day mining operations. The goal of producing such valuations is not just for accounting reasons, but also so that optimal extraction regimes and procedures can be devised in advance for use at the coal-face. This paper shows how one can begin to bring all these aspects together using contingent claims financial analysis, geology, engineering, computer science and applied mathematics.
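
As a rough indication of the contingent-claims machinery involved (a schematic Brennan-Schwartz-type formulation in placeholder notation, ignoring taxes, switching options and the operational features the paper emphasises): with $S$ the resource price, $Q$ the remaining reserve, $q$ the extraction rate and $c$ the unit extraction cost, the value $V(S,Q)$ satisfies an HJB equation of the form

$$\sup_{0\le q\le \bar q}\Big\{\tfrac12\sigma^2 S^2 V_{SS} + (r-\delta)S V_S - q V_Q - rV + q(S-c)\Big\} = 0.$$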

Tue, 23 Feb 2010
14:15
DH 1st floor SR

Stopping with Multiple Priors and Variational Expectations in Continuous Time

Frank Riedel
(Bielefeld University)
Abstract

We develop a theory of optimal stopping problems under ambiguity in continuous time. Using results from (backward) stochastic calculus, we characterize the value function as the smallest (nonlinear) supermartingale dominating the payoff process. For Markovian models, we derive a Hamilton–Jacobi–Bellman equation involving a nonlinear drift term that describes the agent’s ambiguity aversion. We show how to use these general results for search problems and American Options.
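
In symbols (a schematic statement in generic notation): the value of stopping a payoff process $X$ under a set $\mathcal{Q}$ of priors is

$$U_t = \operatorname*{ess\,sup}_{\tau \ge t}\ \operatorname*{ess\,inf}_{Q\in\mathcal{Q}}\ \mathbb{E}^Q\big[X_\tau \mid \mathcal{F}_t\big],$$

and, as stated above, $U$ is the smallest process dominating $X$ that is a supermartingale under the associated nonlinear expectation; under suitable conditions it is optimal to stop the first time $U_t = X_t$.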

Fri, 12 Feb 2010
14:15
L1

Order book resilience, price manipulation, and Fredholm integral equations

Alexander Schied
Abstract

The viability of a market impact model is usually considered to be equivalent to the absence of price manipulation strategies in the sense of Huberman & Stanzl (2004). By analyzing a model with linear instantaneous, transient, and permanent impact components, we discover a new class of irregularities, which we call transaction-triggered price manipulation strategies. Transaction-triggered price manipulation is closely related to the non-existence of measure-valued solutions to a Fredholm integral equation of the first kind. We prove that price impact must decay as a convex decreasing function of time to exclude these market irregularities along with standard price manipulation. We also prove some qualitative properties of optimal strategies and provide explicit expressions for the optimal strategy in several special cases of interest. Joint work with Aurélien Alfonsi, Jim Gatheral, and Alla Slynko.
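
To indicate where the Fredholm equation enters (a schematic version in generic notation, assuming linear transient impact with decay kernel $G$): the expected cost of a strategy with trading measure $dx(t)$ on $[0,T]$ is

$$C(x) = \tfrac12\int_0^T\!\!\int_0^T G(|t-s|)\,dx(s)\,dx(t),$$

and minimizing $C$ subject to a fixed total quantity leads to the first-order condition

$$\int_0^T G(|t-s|)\,dx^*(s) = \lambda \qquad \text{for all } t\in[0,T],$$

a Fredholm integral equation of the first kind with Lagrange multiplier $\lambda$; convex decay of $G$ is what rules out oscillatory, transaction-triggered solutions.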

Fri, 05 Feb 2010

11:00 - 12:00
Oxford-Man Institute

Rollover Risk and Credit Risk

Wei Xiong
(Princeton University)
Abstract

This paper models a firm's rollover risk generated by the conflict of interest between debt and equity holders. When the firm faces losses in rolling over its maturing debt, its equity holders are willing to absorb the losses only if the option value of keeping the firm alive justifies the cost of paying off the maturing debt. Our model shows that both deteriorating market liquidity and shorter debt maturity can exacerbate this externality and cause costly firm bankruptcy at higher fundamental thresholds. Our model has implications for liquidity-spillover effects, the flight-to-quality phenomenon, and optimal debt maturity structures.

Fri, 22 Jan 2010
14:15
DH 1st floor SR

Optimal Control Under Stochastic Target Constraints

Bruno Bouchard
(University Paris Dauphine)
Abstract

We study a class of Markovian optimal stochastic control problems in which the controlled process $Z^\nu$ is constrained to satisfy an a.s. constraint $Z^\nu(T)\in G\subset \mathbb{R}^{d+1}$ at some final time $T>0$. When the set is of the form $G:=\{(x,y)\in \mathbb{R}^d\times \mathbb{R} : g(x,y)\ge 0\}$, with $g$ non-decreasing in $y$, we provide a Hamilton-Jacobi-Bellman characterization of the associated value function. It gives rise to a state constraint problem where the constraint can be expressed in terms of an auxiliary value function $w$ which characterizes the set $D:=\{(t,Z^\nu(t))\in [0,T]\times\mathbb{R}^{d+1} : Z^\nu(T)\in G \text{ a.s. for some } \nu\}$. Contrary to standard state constraint problems, the domain $D$ is not given a priori and we do not need to impose conditions on its boundary. It is naturally incorporated in the auxiliary value function $w$, which is itself a viscosity solution of a non-linear parabolic PDE. Applying ideas recently developed in Bouchard, Elie and Touzi (2008), our general result also allows us to consider optimal control problems with moment constraints of the form $\mathbb{E}[g(Z^\nu(T))]\ge 0$ or $\mathbb{P}[g(Z^\nu(T))\ge 0]\ge p$.

Fri, 04 Dec 2009
14:15
Eagle House

Robust utility maximization from terminal wealth and consumption in a model with jumps: a BSDE approach

Anis Matoussi
(Le Mans)
Abstract

We study a stochastic control problem in the context of utility maximization under model uncertainty. The problem is formulated as a max-min problem: max over strategies and consumption, and min over the set of models (measures).

For the minimization problem, we showed in Bordigoni G., Matoussi, A., Schweizer, M. (2007) that there exists a unique optimal measure equivalent to the reference measure. Moreover, in the context of a continuous filtration, we characterize the dynamic value process of our stochastic control problem as the unique solution of a generalized backward stochastic differential equation with a quadratic driver. We first extend this result to a discontinuous filtration. Moreover, we obtain a comparison theorem and regularity properties for the associated generalized BSDE with jumps, which are the key points in our approach to solving the utility maximization problem over terminal wealth and consumption. The talk is based on joint work with M. Jeanblanc and A. Ngoupeyou (2009).
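
Schematically (generic notation, with an entropic penalty in the spirit of Bordigoni, Matoussi and Schweizer; the talk's setting differs in its details), the max-min problem has the form

$$\sup_{\pi,c}\ \inf_{Q\ll P}\Big\{\mathbb{E}^Q\Big[\int_0^T U_1(c_t)\,dt + U_2(X^{\pi,c}_T)\Big] + \beta\,\mathcal{H}(Q\,|\,P)\Big\},$$

where $\mathcal{H}(Q|P)$ is the relative entropy and $\beta>0$ measures confidence in the reference model $P$; the inner minimization is the part characterized by the generalized BSDE with quadratic driver.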

Fri, 27 Nov 2009
14:15
DH 1st floor SR

Pricing without equivalent martingale measures under complete and incomplete observation

Wolfgang Runggaldier
(Padova)
Abstract

Traditional arbitrage pricing theory is based on martingale measures. Recent studies show that some form of arbitrage may exist in real markets, implying that there does not exist an equivalent martingale measure, and so the question arises: what can one do about pricing and hedging in this situation? We mention two approaches to this effect that have appeared in the literature, namely the "Fernholz-Karatzas" approach and Platen's "benchmark approach", and discuss their relationship both in models where all relevant quantities are fully observable and in models where this is not the case and, furthermore, not all observables are also investment instruments.

[The talk is based on joint work with former student Giorgia Galesso]
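
For context, the real-world pricing rule of the benchmark approach (standard formula, stated in generic notation): with $S^*$ the growth-optimal (numeraire) portfolio, a claim $H$ payable at $T$ is valued as

$$\pi_t(H) = S^*_t\,\mathbb{E}\Big[\frac{H}{S^*_T}\,\Big|\,\mathcal{F}_t\Big],$$

where the expectation is taken under the real-world measure; no equivalent martingale measure is required, only that benchmarked price processes are supermartingales.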

Fri, 20 Nov 2009
14:15
DH 1st floor SR

On portfolio optimization with transaction costs - a "new" approach

Jan Kallsen
(Kiel)
Abstract

We reconsider Merton's problem under proportional transaction costs. Beginning with Davis and Norman (1990), such utility maximization problems are usually solved using stochastic control theory. Martingale methods, on the other hand, have so far only been used to derive general structural results. These typically apply the duality theory for frictionless markets to a fictitious shadow price process lying within the bid-ask bounds of the real price process. In this study we show that this dual approach can actually be used both for deriving a candidate solution and for verification. In particular, the shadow price process is determined explicitly.
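
For readers unfamiliar with the notion, a generic definition (not the talk's precise construction): with bid and ask processes $\underline S \le \overline S$, a shadow price is a process $\tilde S$ with

$$\underline S_t \le \tilde S_t \le \overline S_t \qquad \text{for all } t,$$

such that the optimal frictionless strategy for $\tilde S$ buys only when $\tilde S = \overline S$ and sells only when $\tilde S = \underline S$, and therefore achieves the same utility in the original market with transaction costs.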

Fri, 13 Nov 2009
14:15
DH 1st floor SR

Clustered Default

Jin-Chuan Duan
(National University of Singapore)
Abstract

Defaults in a credit portfolio of many obligors or in an economy populated with firms tend to occur in waves. This may simply reflect their sharing of common risk factors and/or manifest their systemic linkages via credit chains. One popular approach to characterizing defaults in a large pool of obligors is the Poisson intensity model coupled with stochastic covariates, or the Cox process for short. A constraining feature of such models is that defaults of different obligors are independent events after conditioning on the covariates, which makes them ill-suited for modeling clustered defaults. Although individual default intensities under such models can be high and correlated via the stochastic covariates, joint default rates will always be zero, because the joint default probabilities are in the order of the length of time squared or higher. In this paper, we develop a hierarchical intensity model with three layers of shocks -- common, group-specific and individual. When a common (or group-specific) shock occurs, all obligors (or group members) face individual default probabilities, determining whether they actually default. The joint default rates under this hierarchical structure can be high, and thus the model better captures clustered defaults. This hierarchical intensity model can be estimated using the maximum likelihood principle. A default signature plot is invented to complement the typical power curve analysis in default prediction. We implement the new model on the US corporate bankruptcy data and find it far superior to the standard intensity model both in terms of the likelihood ratio test and default signature plot.
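
A back-of-the-envelope calculation makes the point about joint default rates concrete (illustrative notation, not the paper's): suppose a common shock arrives with intensity $\lambda_c$ and, given a shock, obligors $i$ and $j$ default independently with probabilities $p_i$ and $p_j$. Over a short interval of length $\Delta$,

$$\mathbb{P}(i \text{ and } j \text{ both default}) \approx \lambda_c\,\Delta\,p_i\,p_j = O(\Delta),$$

whereas under a Cox model with conditionally independent defaults the corresponding probability is of order $\lambda_i\lambda_j\Delta^2$.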

Fri, 30 Oct 2009
14:15
DH 1st floor SR

Jump-Diffusion Risk-Sensitive Asset Management

Mark Davis
(Imperial)
Abstract

This paper considers a portfolio optimization problem in which asset prices are represented by SDEs driven by Brownian motion and a Poisson random measure, with drifts that are functions of an auxiliary diffusion 'factor' process. The criterion, following earlier work by Bielecki, Pliska, Nagai and others, is risk-sensitive optimization (equivalent to maximizing the expected growth rate subject to a constraint on variance). By using a change of measure technique introduced by Kuroda and Nagai we show that the problem reduces to solving a certain stochastic control problem in the factor process, which has no jumps. The main result of the paper is that the Hamilton-Jacobi-Bellman equation for this problem has a classical solution. The proof uses Bellman's "policy improvement" method together with results on linear parabolic PDEs due to Ladyzhenskaya et al. This is joint work with Sebastien Lleo.
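
For reference, the standard risk-sensitive criterion (generic notation): with $V^\pi_T$ the terminal wealth of portfolio $\pi$ and $\theta>0$ the risk-sensitivity parameter, one maximizes

$$J(\pi;\theta) = -\frac{2}{\theta}\,\log\mathbb{E}\Big[\exp\Big(-\frac{\theta}{2}\,\log V^\pi_T\Big)\Big] \;\approx\; \mathbb{E}[\log V^\pi_T] - \frac{\theta}{4}\,\mathrm{Var}\big(\log V^\pi_T\big),$$

which is the precise sense in which expected growth is traded off against its variance.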

Fri, 23 Oct 2009
14:15
DH 1st floor SR

Stochastic version of the rule "Buy and Hold"

Albert Shiryaev
(Steklov)
Abstract

For a logarithmic utility function, we extend our result with Xu and Zhou to the case of a geometric Brownian motion whose drift term depends on a hidden parameter.

Fri, 16 Oct 2009
14:15
DH 1st floor SR

Mean-Variance Hedging and Exponential Utility in a Bond Market with Jumps

Michael Kohlmann
(Konstanz)
Abstract

We construct a market of bonds with jumps driven by a general marked point process as well as by an $\mathbb{R}^n$-valued Wiener process, in which there exists at least one equivalent martingale measure $Q_0$. In this market we consider the mean-variance hedging of a contingent claim $H \in L^2(\mathcal{F}_{T_0})$ based on self-financing portfolios in the bonds with given maturities $T_1, \dots, T_n$, where $T_0 \le T$. We introduce the concept of the variance-optimal martingale measure (VOM) and describe it by a backward semimartingale equation (BSE). We derive an explicit solution for the optimal strategy and the optimal cost of the mean-variance hedging in terms of the solutions of two BSEs.

The setting of this problem is a bit unrealistic, as we restrict the available bonds to those with a pregiven finite number of maturities. We therefore extend the model to a bond market with jumps and a continuum of maturities, with strategies that are Radon-measure-valued processes. To describe the market we consider the cylindrical and normalized martingales introduced by Mikulevicius et al. In this market we then consider the exponential-utility problem and derive some results on dynamic indifference valuation.

The talk is based on recent joint work with Dewen Xiong.
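
In its generic form (notation ours, not the authors'), the mean-variance hedging problem for the claim $H$ is

$$\min_{c\in\mathbb{R},\ \vartheta}\ \mathbb{E}\Big[\big(H - c - G_T(\vartheta)\big)^2\Big],$$

where $G_T(\vartheta)$ is the terminal gain of a self-financing strategy $\vartheta$ in the traded bonds; the variance-optimal martingale measure is the dual object attached to this quadratic problem, and the two BSEs mentioned above deliver the optimal $c$ and $\vartheta$.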

Fri, 19 Jun 2009
14:15
DH 1st floor SR

Market Closure, Portfolio Selection, and Liquidity Premia

Hong Liu, with Min Dai and Peifan Li.
(Washington U St Louis)
Abstract

Constantinides (1986) finds that transaction cost has only a second order effect on liquidity premia. In this paper, we show that simply incorporating the well-established time-varying return dynamics across trading and nontrading periods generates a first order effect that is much greater than that found by the existing literature and comparable to empirical evidence. Surprisingly, the higher liquidity premium is not from higher trading frequency, but mainly from the substantially suboptimal (relative to the no transaction case) trading strategy chosen to control transaction costs. In addition, we show that adopting strategies prescribed by standard models that assume a continuously open market and constant return dynamics can result in significant utility loss. Furthermore, our model predicts that trading volume is greater at market close and market open than during the rest of the trading day.

Wed, 17 Jun 2009
12:00
Oxford-Man Institute

Local Variance Gamma - (EXTRA SEMINAR)

Peter Carr
(Bloomberg - Quantitative Financial Research)
Abstract

In some options markets (e.g. commodities), options are listed with only a single maturity for each underlying. In others (e.g. equities, currencies), options are listed with multiple maturities.

In this paper, we assume that the risk-neutral process for the underlying futures price is a pure jump Markov martingale and that European option prices are given at a continuum of strikes and at one or more maturities. We show how to construct a time-homogeneous process which meets a single smile and a piecewise time-homogeneous process which can meet multiple smiles. We also show that our construction leads to partial differential difference equations (PDDEs), which permit both explicit calibration and fast numerical valuation.