Winter School on Financial Mathematics 2003
Abstracts



Minicourses

Hélyette Geman: Stochastic Time Changes, Lévy Processes and Option Pricing

The classical Black-Scholes model assumes constant volatility and continuity of trajectories. From classical observations of the financial markets, however, it is clear that these assumptions are not satisfied. Fat tails, the volatility smile and skewness in option prices, and jumps are examples of deviations of market prices from the Black-Scholes assumptions.
A natural and parsimonious way of capturing the non-constancy of volatility is to introduce a stochastic clock under which time accelerates during periods of high volatility. From a financial perspective, the number of trades is shown to be the quantity driving the transaction time. From a mathematical perspective, it is shown that under no arbitrage, asset prices are time-changed Brownian motions. This representation allows us to make a further case for jump processes, to show the drawbacks of the Merton (1976) jump-diffusion model, and to generate pure-jump Lévy processes, which include the CGMY (Carr-Geman-Madan-Yor) model. Stochastic volatility will finally be introduced into the CGMY model (in the form of a time change) to allow for an excellent fit of the option volatility surface across strikes and maturities, using the characteristic function of the process and the fast Fourier transform of the option prices in the strike variable.
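As a concrete illustration of the last point, the sketch below prices European calls from the CGMY characteristic function with the Carr-Madan FFT in the strike variable. It is a minimal, self-contained Python example with arbitrary placeholder parameter values, not the calibration machinery of the lectures.

```python
import numpy as np
from scipy.special import gamma as gamma_fn

def cgmy_char_fn(u, T, S0, r, C, G, M, Y):
    """Risk-neutral characteristic function of log(S_T) under CGMY (0 < Y < 2, Y != 1)."""
    psi = lambda z: C * gamma_fn(-Y) * ((M - 1j * z) ** Y - M ** Y
                                        + (G + 1j * z) ** Y - G ** Y)
    omega = -psi(-1j).real  # drift correction so the discounted price is a martingale
    return np.exp(1j * u * (np.log(S0) + (r + omega) * T) + T * psi(u))

def carr_madan_calls(T, S0, r, C, G, M, Y, alpha=1.5, N=2 ** 12, eta=0.25):
    """European call prices on a grid of strikes via the Carr-Madan FFT."""
    v = eta * np.arange(N)                     # integration grid in the Fourier variable
    lam = 2 * np.pi / (N * eta)                # log-strike spacing implied by the FFT
    k = -0.5 * N * lam + lam * np.arange(N)    # log-strike grid
    phi = cgmy_char_fn(v - 1j * (alpha + 1), T, S0, r, C, G, M, Y)
    damped = np.exp(-r * T) * phi / (alpha ** 2 + alpha - v ** 2 + 1j * (2 * alpha + 1) * v)
    w = eta / 3.0 * (3.0 + (-1.0) ** np.arange(1, N + 1))   # Simpson weights
    w[0] = eta / 3.0
    calls = np.exp(-alpha * k) / np.pi * np.fft.fft(np.exp(-1j * v * k[0]) * damped * w).real
    return np.exp(k), calls                    # strikes and call prices

strikes, calls = carr_madan_calls(T=0.5, S0=100.0, r=0.03, C=1.0, G=5.0, M=5.0, Y=0.5)
```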
Lastly, the topic of incomplete markets will be discussed and the paradigm of acceptable risk introduced; electricity markets will illustrate this discussion.
References
[1] Hélyette Geman, Pure Jump Lévy Processes for Asset Price Modelling, Journal of Banking and Finance 26 (7), 1297-1316 (2002)
[2] Peter Carr, Hélyette Geman, Dilip B. Madan, Marc Yor, Stochastic Volatility for Lévy Processes

Paul Glasserman: Monte Carlo Methods in Risk Management

These lectures will review some recent developments in the application of Monte Carlo methods in finance.

Credit Risk: Here the challenge lies in accurate estimation of small probabilities of large losses (and associated risk measures). Importance sampling (IS) is a natural candidate for improving precision, but the application of IS is complicated by the types of dependence structures (e.g., normal copula) typically used in factor models of credit risk. We present two-part IS methods that change the distribution of the factors and increase default probabilities conditional on the factors in order to produce more scenarios with large losses. The methods are supported through asymptotics as both the portfolio size and loss threshold increase. We also discuss interactions between Monte Carlo and other computational methods.
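To illustrate the two ingredients (shifting the factor distribution and twisting the conditional default probabilities), the following sketch estimates P(L > x) in a one-factor normal copula model. It is a simplified toy version with a user-chosen factor shift mu, not the asymptotically optimal construction of the references.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def is_tail_prob(p, c, rho, x, mu, n_paths=50_000, seed=None):
    """Two-part importance sampling estimate of P(L > x) in a one-factor
    normal copula model with default probabilities p, exposures c and factor
    loadings rho (all arrays).  The factor mean shift mu is chosen by the
    user; requires x < c.sum() so the exponential twist has a solution."""
    rng = np.random.default_rng(seed)
    thresh = norm.ppf(p)                            # default thresholds
    est = np.zeros(n_paths)
    for i in range(n_paths):
        z = rng.normal(mu, 1.0)                     # shifted systematic factor
        lr_factor = np.exp(-mu * z + 0.5 * mu ** 2) # likelihood ratio of the shift
        pz = norm.cdf((thresh - rho * z) / np.sqrt(1.0 - rho ** 2))
        # exponential twist of the conditional default probabilities so that
        # the conditional expected loss equals the threshold x
        if pz @ c < x:
            mean_loss = lambda t: np.sum(c * pz * np.exp(t * c)
                                         / (1.0 - pz + pz * np.exp(t * c))) - x
            theta = brentq(mean_loss, 0.0, 50.0)
        else:
            theta = 0.0                             # large losses already likely: no twist
        qz = pz * np.exp(theta * c) / (1.0 - pz + pz * np.exp(theta * c))
        defaults = rng.random(len(p)) < qz
        loss = c[defaults].sum()
        psi = np.sum(np.log(1.0 - pz + pz * np.exp(theta * c)))
        est[i] = lr_factor * np.exp(-theta * loss + psi) * (loss > x)
    return est.mean(), est.std(ddof=1) / np.sqrt(n_paths)
```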

Market Risk: The problem of estimating the value at risk of a large portfolio over a relatively short horizon can also be addressed using importance sampling, in this case based on a delta-gamma (or other quadratic) approximation to portfolio value. Extensions to heavy-tailed distributions are of special importance but present particular challenges to traditional IS strategies. We present methods to address these problems and discuss their asymptotic optimality properties. We also discuss the combination of IS with other techniques.
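The following sketch illustrates the basic mechanism in the light-tailed (multivariate normal) case: diagonalise the delta-gamma quadratic form and exponentially twist the resulting independent normals so that the expected approximate loss equals the threshold. It is only an illustration of the idea, not the heavy-tailed extensions discussed in the lecture.

```python
import numpy as np
from scipy.optimize import brentq

def delta_gamma_is(delta, gamma, sigma, x, n_paths=50_000, seed=None):
    """Importance sampling estimate of P(Loss > x) for the delta-gamma
    approximation  Loss = -(delta' dS + 0.5 dS' gamma dS),  dS ~ N(0, sigma).
    Diagonalise the quadratic form and exponentially twist the resulting
    independent normals; requires x above the expected approximate loss."""
    rng = np.random.default_rng(seed)
    C = np.linalg.cholesky(sigma)                 # dS = C W with W standard normal
    lam, U = np.linalg.eigh(-0.5 * C.T @ gamma @ C)
    b = -U.T @ C.T @ delta                        # Loss = b'W + sum_i lam_i W_i^2

    def cgf(t):                                   # cumulant generating function of Loss
        return np.sum(0.5 * t**2 * b**2 / (1 - 2*t*lam) - 0.5 * np.log(1 - 2*t*lam))

    def cgf_prime(t):                             # its derivative
        return np.sum(t * b**2 / (1 - 2*t*lam) + t**2 * b**2 * lam / (1 - 2*t*lam)**2
                      + lam / (1 - 2*t*lam))

    hi = 0.999 * 0.5 / lam.max() if lam.max() > 0 else 50.0
    theta = brentq(lambda t: cgf_prime(t) - x, 0.0, hi)   # twist centring the loss at x
    W = rng.normal(theta * b / (1 - 2*theta*lam),         # twisted means
                   1.0 / np.sqrt(1 - 2*theta*lam),        # twisted standard deviations
                   size=(n_paths, len(b)))
    loss = W @ b + (W**2) @ lam
    weights = np.exp(-theta * loss + cgf(theta)) * (loss > x)
    return weights.mean(), weights.std(ddof=1) / np.sqrt(n_paths)
```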

American Options: The pricing of American options by simulation is made difficult by the embedded optimal stopping problem. We give an overview of methods developed in recent years to address this problem. These methods apply weighted backward induction to simulated paths, with weights defined through likelihood ratios, through calibration, or implicitly through regression. We also discuss recent results on the convergence of these methods.
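As an example of the regression variant, the sketch below implements a basic Longstaff-Schwartz-style backward induction for a Bermudan put under Black-Scholes dynamics with a simple polynomial basis; it produces the usual low-biased estimate and is meant only to fix ideas.

```python
import numpy as np

def bermudan_put_lsm(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                     n_steps=50, n_paths=100_000, seed=None):
    """Regression-based (Longstaff-Schwartz style) estimate of a Bermudan put
    price under Black-Scholes dynamics with a quadratic polynomial basis."""
    rng = np.random.default_rng(seed)
    dt, disc = T / n_steps, np.exp(-r * T / n_steps)
    z = rng.standard_normal((n_paths, n_steps))
    S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1))
    cashflow = np.maximum(K - S[:, -1], 0.0)             # exercise value at maturity
    for t in range(n_steps - 2, -1, -1):                 # backward induction
        cashflow *= disc                                 # discount one step
        itm = K - S[:, t] > 0.0                          # regress on in-the-money paths
        X = S[itm, t]
        basis = np.column_stack([np.ones_like(X), X, X**2])
        beta = np.linalg.lstsq(basis, cashflow[itm], rcond=None)[0]
        continuation = basis @ beta                      # estimated continuation value
        exercise = (K - X) > continuation
        cashflow[itm] = np.where(exercise, K - X, cashflow[itm])
    return disc * cashflow.mean()                        # discount from first exercise date
```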
References
[1] Paul Glasserman, Monte Carlo Methods in Financial Engineering, Applications of Mathematics, Vol. 53, Springer, 2003
[2] P. Glasserman, Tail Approximations for Portfolio Credit Risk
[3] P. Glasserman and Bin Yu, Number of Paths Versus Number of Basis Functions in American Option Pricing
[4] P. Glasserman and Bin Yu, Simulation for American Options: Regression Now or Regression Later?, in Monte Carlo and Quasi-Monte Carlo Methods 2002 (H. Niederreiter, ed.), Springer, Berlin, 213-226.
[5] P. Glasserman and Jingyi Li, Importance Sampling for a Mixed Poisson Model of Portfolio Credit Risk, to appear in the Proceedings of the Winter Simulation Conference 2003
[6] Paul Glasserman, Philip Heidelberger and Perwez Shahabuddin, Portfolio value-at-risk with heavy-tailed risk factors

Special invited lectures

Rüdiger Frey: On Dynamic Models for Portfolio Credit Risk and Credit Contagion

It is by now well known that the performance of models for portfolio credit risk is very sensitive to the modelling of dependence between defaults of different obligors. In this talk we will be concerned with dynamic models for portfolios of dependent defaults. After a survey of existing approaches, we concentrate on models for credit contagion, i.e. models where the default of one company has a direct impact on the default intensity of other firms. We introduce a Markovian model and discuss the various types of interaction. Finally we present limit results for large portfolios in a homogeneous model with mean-field interaction and analyze the impact of credit contagion on the portfolio loss distribution.
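As a toy illustration of the mean-field mechanism (a sketch with an assumed linear intensity specification, not the model of the paper below), consider a homogeneous portfolio in which each surviving firm defaults with an intensity that increases in the fraction of firms already in default:

```python
import numpy as np

def simulate_contagion(m=1000, horizon=5.0, lam0=0.02, contagion=0.5, seed=None):
    """Simulate the default counting process in a homogeneous portfolio where
    each surviving firm's default intensity is lam0 + contagion * (fraction of
    the portfolio already in default)."""
    rng = np.random.default_rng(seed)
    t, defaults, times = 0.0, 0, []
    while defaults < m:
        surviving = m - defaults
        total_rate = surviving * (lam0 + contagion * defaults / m)  # next-default rate
        t += rng.exponential(1.0 / total_rate)
        if t > horizon:
            break
        defaults += 1
        times.append(t)
    return defaults, times

# empirical portfolio loss distribution at the horizon
losses = [simulate_contagion(seed=i)[0] for i in range(2000)]
```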
References
[1] Slides of the lecture
[2] Rüdiger Frey and Jochen Backhaus, Interacting Defaults and Counterparty Risk: a Markovian Approach

Wolfgang Runggaldier: Estimation via stochastic filtering in financial market models

When specifying a financial market model, one also needs to specify the model coefficients. The latter may be only partially known, so in order to solve problems related to financial markets it becomes important to exploit all the information coming from the market itself to update one's knowledge of the not fully known quantities; this is where stochastic filtering becomes useful. The information from the market is given not only by the prices of the underlying primary assets, but also by the prices of liquidly traded derivatives. A major problem in this context is that derivative prices are given as expectations under a martingale measure, while the actual observations take place under the real-world probability measure. We shall discuss various ways to deal with this problem.
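A minimal illustration of the filtering step, in a deliberately simple setting rather than the models treated in the lecture: the drift of the log-price is unobserved, and a Kalman filter updates its conditional distribution from the observed returns.

```python
import numpy as np

def filter_drift(log_returns, dt, sigma, m0=0.0, p0=0.04):
    """Kalman filter for an unobserved constant drift mu observed through
    log-returns  y_k = mu*dt + sigma*sqrt(dt)*eps_k.  Returns the conditional
    mean and variance of mu after each observation."""
    m, p = m0, p0
    means, variances = [], []
    obs_var = sigma**2 * dt                    # observation noise variance
    for y in log_returns:
        gain = p * dt / (dt**2 * p + obs_var)  # Kalman gain (static state)
        m = m + gain * (y - m * dt)            # updated conditional mean of mu
        p = p - gain * dt * p                  # updated conditional variance of mu
        means.append(m)
        variances.append(p)
    return np.array(means), np.array(variances)
```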
References
[1] Slides of the lecture
[2] Wolfgang J. Runggaldier, Estimation via stochastic filtering in financial market models

Uwe Wystup: FX exotics and the relevance of computational methods in their pricing and risk management

Starting with an overview of the current FX derivatives industry, we take a look at a few examples where computational methods are crucial for running the daily business. The examples include installment contracts, accumulative forward contracts, and the efficient computation of option price sensitivities.
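As an example of the last item, the sketch below compares a finite-difference ("bump and revalue") delta with a pathwise delta for a vanilla FX call under Garman-Kohlhagen dynamics, using common random numbers; it is a generic Monte Carlo illustration, not the production pricing code discussed in the lecture.

```python
import numpy as np

def mc_call_deltas(S0, K, T, r_dom, r_for, sigma, n_paths=200_000, bump=0.01, seed=None):
    """Monte Carlo delta of a vanilla FX call: central finite difference
    (bump and revalue, with common random numbers) versus pathwise estimator."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)                       # common random numbers
    drift = (r_dom - r_for - 0.5 * sigma**2) * T
    terminal = lambda s: s * np.exp(drift + sigma * np.sqrt(T) * z)
    disc = np.exp(-r_dom * T)
    price = lambda s: disc * np.maximum(terminal(s) - K, 0.0).mean()
    fd_delta = (price(S0 + bump) - price(S0 - bump)) / (2.0 * bump)
    s_T = terminal(S0)
    pw_delta = (disc * (s_T > K) * s_T / S0).mean()        # pathwise derivative estimator
    return fd_delta, pw_delta
```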
References
[1] The market price of one-touch options in foreign exchange markets, Derivatives Week, March 2003
[2] Jürgen Hakala, Ghislain Perissé and Tino Senge, The pricing of second generation exotics, chapter 7 of Jürgen Hakala and Uwe Wystup, Foreign Exchange Risk
[3] Hatem Ben-Ameur, Michèle Breton and Pascal François, Pricing Installment Options with an Application to ASX Installment Warrants
[4] Uwe Wystup, How the Greeks would have hedged correlation risk of foreign exchange options, chapter 14 of Jürgen Hakala and Uwe Wystup, Foreign Exchange Risk
[5] Slides of the lecture

Short lectures

Remco Peters: Structural Breaks in the Time Change of the S&P 500 index

Recent results indicate that financial price processes are, when run in financial time, Brownian motions. The time change, which maps physical time points to financial time points, is given by the quadratic variation in the case of a continuous local martingale.
We expose the time change of the S&P 500 futures index using a large amount of intraday data. Analyzing this time change, we observe several structural breaks. Periods in which the speed of financial time is high alternate with periods in which it is low. Since volatility is intimately related to the time change, this indicates that volatility switches from one regime, in which volatility is stochastic but constant in its first moment, to another. We shall approximate the time change by a piecewise linear function and discuss the location of the breakpoints.
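A minimal sketch of the empirical construction (with a hypothetical data layout) approximates the financial clock by the cumulative realized quadratic variation of the intraday log-prices; a piecewise-linear fit to this clock then locates the structural breaks.

```python
import numpy as np

def financial_clock(log_prices, obs_per_day):
    """Approximate the time change by the cumulative realized quadratic
    variation of intraday log-prices (hypothetical data layout: one flat array
    with obs_per_day observations per trading day; overnight returns are
    included for simplicity)."""
    sq_returns = np.diff(log_prices) ** 2
    day_starts = np.arange(0, len(sq_returns), obs_per_day)
    daily_rv = np.add.reduceat(sq_returns, day_starts)   # realized variance per day
    return np.cumsum(daily_rv)                           # financial time at end of each day
```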
References
[1] Remco T. Peters, Shocks in the S&P500

Antoine van der Ploeg: A State Space Approach to the Estimation of Multi-Factor Affine Stochastic Volatility Option Pricing Models

We propose a class of stochastic volatility (SV) option pricing models that is more flexible than the conventional models in several ways. We assume the conditional variance of the stock returns to be driven by an affine function of an arbitrary number of latent factors, which follow mean-reverting Markov diffusions. This set-up, inspired by the literature on the term structure of interest rates, allows us to investigate empirically whether volatility is driven by more than one factor. We derive a call pricing formula for this class. Next, we propose a method to estimate the parameters of such models based on the Kalman filter and smoother, exploiting simultaneously the time-series and cross-sectional information in the options and the underlying. We argue that this method may be considered an attractive alternative to the efficient method of moments (EMM). We use data on the FTSE100 index to illustrate the method. We provide promising estimation results for a 1-factor model in which the factor follows an Ornstein-Uhlenbeck process; they indicate that the method works well. Diagnostic checks show evidence that more than one factor drives the volatility, indicate the existence of level-dependent volatility, and provide an incentive to consider realized volatility in our empirical analysis. Further research is clearly needed and is in progress.
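The estimation step rests on the Kalman filter and the prediction-error decomposition of the likelihood; the sketch below computes this log-likelihood for a generic linear Gaussian state space (not the authors' exact measurement equation, which is built from option data), after which the parameters can be estimated by numerical maximisation.

```python
import numpy as np

def kalman_loglik(y, Z, H, T_mat, Q, a0, P0):
    """Gaussian log-likelihood, via the prediction-error decomposition, of the
    linear state space  y_t = Z a_t + eps_t,  a_{t+1} = T a_t + eta_t,
    with Var(eps_t) = H and Var(eta_t) = Q.  Model parameters can be estimated
    by maximising this function numerically."""
    a, P = a0.copy(), P0.copy()
    loglik = 0.0
    for yt in y:                                   # y has shape (n_obs, dim_y)
        v = yt - Z @ a                             # one-step prediction error
        F = Z @ P @ Z.T + H                        # prediction-error covariance
        K = T_mat @ P @ Z.T @ np.linalg.inv(F)     # Kalman gain
        loglik += -0.5 * (len(yt) * np.log(2.0 * np.pi)
                          + np.linalg.slogdet(F)[1] + v @ np.linalg.solve(F, v))
        a = T_mat @ a + K @ v                      # predicted state mean
        P = T_mat @ P @ T_mat.T + Q - K @ F @ K.T  # predicted state covariance
    return loglik
```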
References
[1] Antoine P.C. van der Ploeg, H. Peter Boswijk and Frank de Jong, A State Space Approach to the Estimation of Multi-Factor Affine Stochastic Volatility Option Pricing Models

Raoul Pietersz: Projection Iteration Calibration of the Libor BGM Model

This paper tackles the problem of calibrating the Libor BGM model to cap volatility and interest rate correlation by applying the projection iteration algorithm of Dykstra (1983) and Han (1988). The latter algorithm efficiently finds the nearest point in the intersection of closed convex sets by successive projections onto the individual sets. We show that the BGM cap volatility calibration is of this form when minimizing the distance of the model's future cap volatility curves to today's curve. The low-rank approximation of a correlation matrix, however, cannot be cast in this form, because the set of n by n matrices of rank d < n is not convex. Nonetheless, applying the algorithm surprisingly leads to a deficient version of the Lagrange multiplier algorithm of Wu (2002). A slight modification then leads to a convergent algorithm that is easier to implement than Wu's. The methods are compared in terms of computational efficiency. (Joint work with Igor Grubisic, Universiteit Utrecht)
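For the non-convex rank-reduction part, a minimal sketch of the plain alternating-projections heuristic (not the modified, convergent algorithm of the paper) is given below: alternate between projecting onto the positive semidefinite matrices of rank at most d and onto the matrices with unit diagonal.

```python
import numpy as np

def nearest_lowrank_correlation(C, d, n_iter=200):
    """Heuristic alternating projections for a nearby rank-d correlation
    matrix: alternate between the (non-convex) set of PSD matrices of rank at
    most d and the set of symmetric matrices with unit diagonal.  A sketch of
    the plain scheme; convergence is not guaranteed without modification."""
    X = C.copy()
    for _ in range(n_iter):
        w, V = np.linalg.eigh(X)                     # spectral decomposition
        idx = np.argsort(w)[::-1][:d]                # keep the d largest eigenvalues
        w_d = np.clip(w[idx], 0.0, None)
        X = (V[:, idx] * w_d) @ V[:, idx].T          # projection onto rank <= d, PSD
        np.fill_diagonal(X, 1.0)                     # projection onto unit diagonal
    # final step: rescale the rank-d factors so the diagonal is exactly one
    B = V[:, idx] * np.sqrt(w_d)
    B /= np.linalg.norm(B, axis=1, keepdims=True)
    return B @ B.T
```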
References
[1] Igor Grubisic and Raoul Pietersz, Efficient rank reduction of correlation matrices

Alessandro Sbuelz: Equilibrium Asset Pricing with Time-Varying Pessimism

We will study the pricing effects of pessimism, as modelled by Knightian model-uncertainty aversion over a neighborhood of indistinguishable model specifications that are constrained in their relative entropy from a given reference model. We will fully characterise the equilibrium of a pessimistic, representative-agent exchange economy with intertemporal consumption, a stochastic opportunity set, and a relative entropy constraint that can depend on the state of the economy. We will find that Knightian pessimism yields outstanding First Order Risk Aversion (FORA) excess equity returns. On the other hand, equity dynamics are virtually unaffected. More precisely, risk-free rates and equity premia feel a direct impact of pessimism, whereas equity returns and worst-case equity premia feel only an indirect impact, which disappears completely with log utility. We will compute and calibrate explicit equilibrium examples of a pessimistic economy with an amount of pessimism associated with an 11% upper probability of confusing the worst-case model with the reference model. Relative entropy is the key to fixing such a realistic amount of pessimism in our calibrations. Even for log utility, we will find that such a small amount of pessimism generates some 55 basis points of additional unconditional equity premium by pushing risk-free rates down. Thus, we will conclude that Knightian pessimism provides an economically and observationally different description of excess equity returns. Its good theoretical and empirical properties could contribute to solving some of the long-standing macro-finance puzzles. (Joint work with Fabio Trojani, University of Southern Switzerland.)
References
[1] Alessandro Sbuelz and Fabio Trojani, Equilibrium Asset Pricing with Time-Varying Pessimism
[2] Slides of the lecture

