7th Winter school on Mathematical Finance
Abstracts



Minicourses

Ernst Eberlein: Lévy driven financial models

Empirical analysis of financial data reveals that standard diffusion models do not generate sufficiently accurate return distributions. To reduce model risk, more powerful classes of driving processes are appropriate. In this course exponential Lévy models, and more generally models driven by semimartingales, are considered. Plain vanilla as well as exotic options are priced in this new model class. As a further application in risk management, we show that estimates of the value at risk of a portfolio of securities are improved.
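As a point of reference (this standard formulation is not spelled out in the abstract), an exponential Lévy model replaces the geometric Brownian motion of the Black-Scholes world by a general Lévy process as driver of the log price:
\[
S_t = S_0 \, e^{L_t}, \qquad t \ge 0,
\]
where L is a Lévy process; the Black-Scholes model is recovered for L_t = (\mu - \sigma^2/2)t + \sigma W_t, i.e. when L has no jumps. Option prices are then computed as discounted expectations of the payoff under a risk-neutral (martingale) measure.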
In the second part we develop a Lévy term structure theory. Three basic approaches are introduced: the forward rate model, the forward process model, and the LIBOR or market model. Pricing formulae for interest rate derivatives, as well as efficient numerical algorithms to evaluate these formulae, are derived. The LIBOR model is extended to a multi-currency and to a credit setting. As an application, the pricing of cross-currency derivatives and of a variety of credit derivatives is discussed.
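Schematically, and in one common formulation (the abstract itself gives no formulas), the Lévy forward rate model postulates dynamics of the instantaneous forward rates of the form
\[
f(t,T) = f(0,T) + \int_0^t \alpha(s,T)\,ds - \int_0^t \sigma(s,T)\,dL_s,
\]
with the drift \alpha pinned down by a no-arbitrage condition involving the cumulant function of the driving Lévy process L, in analogy with the classical HJM drift condition.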
References
An extensive account of relevant reference papers can be found in this list.

Ragnar Norberg: Managing risk in life insurance and pensions

Stochastic processes in life history analysis, life insurance, and finance (jump processes and their associated random measures and martingales, Lévy processes); Traditional paradigms in life insurance (the principle of equivalence, notions of reserves); Management of financial risk and longevity risk (the with profit scheme, unit-linked insurance, securitization of mortality risk); The role of financial instruments in life insurance and pensions - can the markets come to our rescue?
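As a reminder of the first of the traditional paradigms listed above (the formula below is a schematic statement, not part of the abstract), the principle of equivalence requires that premiums be set such that, at inception,
\[
\mathbb{E}\Big[\int_0^T v(t)\, d\Pi(t)\Big] \;=\; \mathbb{E}\Big[\int_0^T v(t)\, dB(t)\Big],
\]
where v is the discount function and \Pi and B denote the cumulative premium and benefit payment streams; the reserve at a later time is then the conditional expected present value of future benefits less future premiums.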
Prerequisites: Stochastic processes, martingale techniques, ordinary and partial differential equations, basic arbitrage pricing theory.
References
Norberg, R. (2003): The Markov chain market. ASTIN Bulletin Vol. 33, pp 265-287. (NB! Typo on page 276, line 10 from below: F should be Λ).
Norberg, R. (1999): A theory of bonus in life insurance. Finance & Stochastics Vol. 3, pp 373-390. (If some participants should experience a problem with access, they are welcome to contact the author on e-mail and he will provide an electronic copy.)
Ragnar Norberg, Managing Risk In Life Insurance And Pensions (slides of the lecture)

Special invited lectures

Andreas Kyprianou: Scale functions and spectrally negative Lévy processes

In this talk I will introduce the notion of a scale function for a spectrally negative (downward jumping) Lévy process and note its importance in a variety of applications, including finance. Until recently, very few explicit examples of scale functions were known. I shall present several new families of completely explicit scale functions. Based on joint work with F. Hubalek and V. Rivero.
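For orientation (the definition is not recalled in the abstract), the q-scale function W^{(q)} of a spectrally negative Lévy process with Laplace exponent \psi is characterized by its Laplace transform,
\[
\int_0^\infty e^{-\beta x}\, W^{(q)}(x)\, dx = \frac{1}{\psi(\beta) - q}, \qquad \beta > \Phi(q),
\]
where \Phi(q) is the largest root of \psi(\beta) = q. Scale functions encode first passage quantities; for instance, with W = W^{(0)}, the probability that the process started at x in [0,a] exits the interval at the upper boundary is W(x)/W(a).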
References
F. Hubalek and A.E. Kyprianou, Old and new examples of scale functions for spectrally negative Lévy processes, Mathematics ArXiv 0801.0393
Andreas E. Kyprianou, Victor Rivero, Special, conjugate and complete scale functions for spectrally negative Lévy processes, Mathematics ArXiv 0712.3588
Andreas E. Kyprianou, Old and new examples of scale functions for spectrally negative Lévy processes (slides of the lecture)

Uwe Schmock: Risk aggregation, numerical stability and a variation of Panjer's recursion

We present an actuarial one-period model for dependent risks. It extends the well-known collective model for portfolio losses used in actuarial science. It also generalizes CreditRisk+, a credit risk model developed by Credit Suisse First Boston. The presented model is suitable for the aggregation of risks, which may be insurance, credit or operational risks. The guiding principles for the model extensions are analytic tractability and the possibility to calculate the portfolio loss distribution with an efficient and numerically stable algorithm.
Basically, the portfolio loss distribution is a compound Poisson distribution, where the claim size distribution is a mixture consisting of the individual idiosyncratic random losses and the losses caused by K risk factors. For every risk factor, the losses come in clusters of random size; the cluster losses are again mixtures of individual losses, whose sizes are random multiples of a basic loss unit. If the cluster size is distributed according to a logarithmic, Poisson or negative binomial distribution, then the loss due to the cluster can be calculated in a numerically stable and efficient way using Panjer's recursion. The final aggregation is also done with this recursion. Therefore, for K risk factors, K+1 Panjer recursions are sufficient. The method allows one to calculate quantiles, expected shortfall and risk contributions explicitly, without any Monte Carlo simulation.
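To illustrate the recursion at the core of the algorithm, here is a minimal sketch of the classical Panjer recursion for a compound Poisson distribution on the lattice of multiples of a basic loss unit; the numerically stable variant presented in the talk is not reproduced here, and the intensity and claim size distribution below are purely illustrative.

import math

def panjer_poisson(lam, f, s_max):
    # Compound Poisson aggregate loss distribution via Panjer's recursion.
    # lam   : Poisson intensity of the claim number
    # f     : f[j] = P(claim size = j) on the lattice {0, 1, 2, ...}
    # s_max : largest aggregate loss for which a probability is computed
    a, b = 0.0, lam                      # Poisson is the (a, b, 0) case with a = 0, b = lam
    g = [0.0] * (s_max + 1)
    g[0] = math.exp(lam * (f[0] - 1.0))  # P(S = 0): probability generating function of N at f[0]
    for s in range(1, s_max + 1):
        acc = 0.0
        for j in range(1, min(s, len(f) - 1) + 1):
            acc += (a + b * j / s) * f[j] * g[s - j]
        g[s] = acc / (1.0 - a * f[0])
    return g

# illustrative claim size distribution on {0, 1, 2, 3}
g = panjer_poisson(lam=2.0, f=[0.1, 0.5, 0.3, 0.1], s_max=50)
print(sum(g))  # close to 1 once s_max covers most of the probability mass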
For cluster size distributions like the extended negative binomial distribution or the extended logarithmic distribution, Panjer's recursion can be numerically unstable. We present a variation of the recursion which, in particular, solves this issue.
References
Uwe Schmock, Modelling Dependent Credit Risks with Extensions of CreditRisk+ and Application to Operational Risk (lecture notes)
Stefan Gerhold, Uwe Schmock and Richard Warnung, A Generalization of Panjer's Recursion and Numerically Stable Risk Aggregation
Uwe Schmock, Risk Aggregation, Numerical Stability and a Variation of Panjer's Recursion (slides of the lecture)

Michèle Vanmaele: Comonotonicity applied in finance

In finance one very often has to deal with problems involving multivariate random variables, e.g. basket options, whose price depends on several underlying securities. Asian options and Asian basket options are other examples of options whose price depends on a weighted sum of non-independent asset prices. One can construct upper and lower bounds for the prices of such European call and put options based on the theory of stochastic orders and of comonotonic risks. Comonotonicity essentially reduces a multivariate problem to a univariate one, leaving the marginal distributions intact. One can model the dynamics of the underlying asset prices, e.g. with a Black-Scholes model; but it is also possible to develop model-free bounds expressed in terms of option prices on the individual underlying assets observed in the market. Moreover, the comonotonic upper bound can be interpreted as a superreplicating strategy.
Instead of deriving bounds one can look at approximations. Monte Carlo (MC) simulation, for instance, is a technique that provides approximate solutions to a broad range of mathematical problems. A drawback of the method is its high computational cost, especially in a high-dimensional setting. Therefore variance reduction techniques were developed to increase the precision and reduce the computing time. The so-called Comonotonic Monte Carlo simulation uses the comonotonic approximation as a control variate to obtain more accurate estimates and hence a less time-consuming simulation.
We will introduce the notion of comonotonicity and discuss the different approaches based on the theory of comonotonicity. The methods will be applied to examples in finance.
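As a reference point for the comonotonic upper bound mentioned above (the abstract gives no formulas), the comonotonic counterpart of a sum of risks replaces the unknown dependence structure by perfect positive dependence while keeping the marginals:
\[
S = \sum_{i=1}^n X_i \;\le_{cx}\; S^c = \sum_{i=1}^n F_{X_i}^{-1}(U),
\]
with a single uniform random variable U and \le_{cx} denoting the convex order. Consequently stop-loss premiums, and hence prices of European basket or Asian call payoffs, are bounded above by their comonotonic counterparts, which in the model-free setting can be expressed in terms of observed prices of vanilla options on the individual assets.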
References
Chen, X., Deelstra, G., Dhaene, J., Vanmaele, M.: Static super-replicating strategies for a class of exotic options
Deelstra, G., Vanmaele, M., Vyncke, D.: Minimizing the risk of a financial product using a put option (working paper)
Michèle Vanmaele, Comonotonicity Applied in Finance (slides of the lecture)

Short lectures

Svetlana Borovkova: Modeling commodity forward curves

Commodity markets have recently experienced dramatic growth in terms of volumes and variety of traded contracts, number of operating exchanges and market participants. The most important expansion has been in the trading of commodity derivatives, such as futures and options. Most market activity in commodities takes place in the trading of forward and futures contracts. In the case of oil, for instance, volumes in the futures markets are nine times larger than those in the spot market, and this ratio is steadily increasing with the arrival of new financial players.
Traditional modeling techniques from equity and fixed income markets are not directly applicable to modeling commodity forward and futures prices, since specific properties of commodities need to be taken into account. The shape of commodity forward curves reflects market fundamentals and anticipated price trends. The oil forward curve can be in one of two fundamental states, backwardation or contango, depending on the overall conditions of the world oil market. However, since 2006, some deviations from these two fundamental states have been observed. For seasonal commodities, such as natural gas, electricity and agricultural commodities, the shape of the forward curve is largely determined by the anticipation of seasonal demand and/or supply.
In this talk we give an overview of forward curve modeling techniques for commodities, and describe a recently introduced seasonal forward curve model by Borovkova and Geman (2007). In this two-factor model the average forward price is used as the first fundamental factor. This quantity is devoid of seasonality and conveys a robust representation of the current forward curve level. The second factor is a quantity analogous to the stochastic convenience yield, which accounts for the random changes in the forward curve shape. The well-known cost-of-carry relationship is significantly improved by introducing a deterministic seasonal premium within the convenience yield. We show how the model can be estimated and apply it to a number of energy markets.
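For reference, the classical cost-of-carry relation mentioned above links the forward price of a storable commodity to its spot price through the interest rate, storage cost and convenience yield; in its simplest schematic form (constant rates),
\[
F(t,T) = S_t \, e^{(r + c - y)(T - t)},
\]
where r is the interest rate, c the storage cost and y the convenience yield. The two-factor model of the talk, roughly speaking, replaces the spot price by the average forward level and adds a deterministic seasonal premium to the yield term.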
References
Svetlana Borovkova and Helyette Geman, Seasonal and stochastic effects in commodity forward curves, Review of Derivatives Research (2006) 9:167 - 186
Svetlana Borovkova, Modelling Energy Forward Curves (slides of the lecture)

An Chen: Approximate solutions for indifference pricing under general utility functions

With the aid of Taylor-based approximations, this paper presents results for pricing insurance contracts using indifference pricing under general utility functions. We discuss the connection between the resulting "theoretical" indifference prices and the pricing rule of thumb that practitioners use: Best Estimate plus a "Market Value Margin". Furthermore, we compare our approximations to known analytical results for exponential and power utility. (Joint work with Antoon Pelsser and Michel Vellekoop.)
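For orientation (schematic, not the paper's exact formulation), the seller's utility indifference price of a claim H is the amount p that makes the insurer indifferent, in terms of maximal expected utility, between selling the contract and not selling it:
\[
\sup_{\pi} \mathbb{E}\big[U\big(X_T^{\pi, x + p} - H\big)\big] \;=\; \sup_{\pi} \mathbb{E}\big[U\big(X_T^{\pi, x}\big)\big],
\]
where X_T^{\pi, x} denotes terminal wealth generated from initial capital x by trading strategy \pi. For exponential utility the resulting price does not depend on x, which is one reason why closed-form benchmarks are available in that case.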
References
An Chen, Antoon Pelsser and Michel Vellekoop, Approximate Solutions For Indifference Pricing Under General Utility Functions

Marcel Visser: Measuring volatility

Volatility is a key object in understanding financial price processes. It is well known that volatility changes over time and that there are periods of low and periods of high volatility. The fact that the volatility process is latent complicates all exercises of model building, estimation, and evaluation. Intraday, high-frequency data offer the opportunity to obtain more precise measurements of volatility. In the continuous-time diffusion setting the quadratic variation shows up as a natural volatility measure. We will address the issue of using high-frequency data to estimate volatility in a large class of discrete time models, such as Garch and stochastic volatility models. It turns out that, for discrete time models, there exist volatility measures that are more accurate than the quadratic variation.
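The simplest high-frequency measure alluded to here is the realized variance: with intraday log returns r_{t,1}, \dots, r_{t,n} on day t,
\[
RV_t = \sum_{i=1}^{n} r_{t,i}^2,
\]
which, in the continuous-time semimartingale setting, converges to the quadratic variation of the log price over the day as the sampling frequency increases. The claim of the talk is that, for discrete time models, other intraday statistics can serve as more accurate volatility proxies than this one.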
References
Robin G. de Vilder and Marcel P. Visser (2007), Volatility Proxies for Discrete Time Models
Marcel Visser, Measuring volatility (slides of the lecture)

Andreas Würth: Equivalence of the minimax martingale measure

This paper shows that, for financial markets with continuous filtrations, the minimax local martingale measure defined by Frittelli, if it exists, is equivalent to the objective measure for nondecreasing but not strictly increasing utility functions, provided the dual utility function satisfies some boundedness assumptions on its relative risk aversion and there exists an equivalent local martingale measure with a sufficient integrability property. The result essentially extends an earlier result of Delbaen and Schachermayer, who proved this for the case where the dual utility function is quadratic; specific examples of this situation are the q-optimal measures for q > 1. The generalization is carried out using Young functions on Orlicz spaces and by proving a conditional version of the Hölder inequality in this general setup.
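As a rough orientation to the terminology (schematic, and not a statement of the paper's precise setting), the minimax martingale measure can be characterized through the dual problem of utility maximization: with V the convex conjugate of the utility function, it attains
\[
\min_{Q \in \mathcal{M}} \mathbb{E}\Big[ V\Big( \lambda \, \frac{dQ}{dP} \Big) \Big]
\]
over the set \mathcal{M} of equivalent local martingale measures, for a suitable constant \lambda > 0. Taking V(y) proportional to y^q with q > 1 leads to the q-optimal measures mentioned above; the quadratic case q = 2 corresponds to the variance-optimal measure studied by Delbaen and Schachermayer.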
References
Andreas Würth, Equivalence of the minimax martingale measure (paper)
Andreas Würth, Equivalence of the minimax martingale measure (slides of the lecture)

Film

Agnes Handwerk and Harrie Willems: Wolfgang Doeblin, A Mathematician Rediscovered (film)

Wolfgang Doeblin, one of the great probabilists of the 20th century, was already widely known in the 1950s for his fundamental contributions to the theory of Markov chains. His coupling method became a key tool in later developments at the interface of probability and statistical mechanics. But the full measure of his mathematical stature became apparent only in 2000 when the sealed envelope containing his construction of diffusion processes in terms of a time change of Brownian motion was finally opened, 60 years after it was sent to the Academy of Sciences in Paris.
This film documents scientific and human aspects of this amazing discovery and throws new light on the startling circumstances of Doeblin's death at the age of 25.
References:
Agnes Handwerk and Harrie Willems (2007), Wolfgang Doeblin, A Mathematician Rediscovered (DVD), Springer, VideoMATH
Harrie Willems (2002), Verzegelde formules, Nieuw Archief voor Wiskunde, 5 (3)


