Minicourses
Nizar Touzi: Hedging under constraints by face lifting and optimal stopping
We first consider the classical problem of super-replication under portfolio constraints as formulated by Cvitanic and Karatzas (1995). In the Markov framework, the value function can be characterized as the unique viscosity solution of the associated dynamic programming equation without any need for the so-called dual formulation of the problem. In the context of the Black-Scholes model, we only use the viscosity super-solution part of this result in order to derive an explicit solution of the problem. Namely, the value function is given by the (unconstrained) Black-Scholes price of a conveniently face-lifted payoff. This result was first established by Broadie, Cvitanic and Soner (1998).
We next formulate a super-replication problem under Gamma constraints. This leads to a non-standard stochastic control problem which appeals to many different tools from stochastic analysis and stochastic control theory. The final result is that the value function of the super-replication problem is given by the value of an optimal stopping problem (American option) with reward defined by a convenient face-lift of the payoff.
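Both results reduce super-replication to a classical computation: a Black-Scholes price of a face-lifted payoff in the first case, an American-style optimal stopping value in the second. For the latter, the Snell envelope can be computed by backward induction; here is a minimal sketch in Python, with an ordinary put payoff standing in for the face-lifted reward and all parameter values invented (the CRR binomial scheme is an illustrative choice, not part of the lectures):

```python
import math

def american_put_crr(s0, strike, r, sigma, maturity, steps):
    """Snell envelope by backward induction on a CRR binomial tree:
    value = max(immediate exercise, discounted continuation value)."""
    dt = maturity / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    q = (math.exp(r * dt) - d) / (u - d)        # risk-neutral up-probability
    disc = math.exp(-r * dt)
    # terminal payoffs at the leaves
    values = [max(strike - s0 * u**j * d**(steps - j), 0.0)
              for j in range(steps + 1)]
    # roll back: at each node, compare stopping with continuing
    for n in range(steps - 1, -1, -1):
        values = [max(max(strike - s0 * u**j * d**(n - j), 0.0),
                      disc * (q * values[j + 1] + (1 - q) * values[j]))
                  for j in range(n + 1)]
    return values[0]

# at-the-money American put, toy parameters
print(round(american_put_crr(100, 100, 0.05, 0.2, 1.0, 200), 2))
```

In the Gamma-constraint result, the reward fed into this backward induction would be the face-lifted payoff rather than the raw one.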
References
[1] Hedging under constraints by face lifting and optimal stopping (slides of the lectures).
Hanspeter Schmidli: On the Interplay between Insurance and Finance
In recent years the ideas of mathematical finance have turned out to be useful for
solving insurance risk problems. We will consider two possible applications.
The classical approach to life insurance has two problems. Interest rate
guarantees for thirty to forty years cannot be hedged, and the guaranteed
interest rate is not attractive for investors. The problem was solved by
making the payout of a life insurance contract dependent on the value of a
reference portfolio. Pricing of such contracts then involves pricing in
financial markets (payout) and actuarial pricing (mortality). We will discuss
pricing and hedging of these so-called unit-linked life insurance contracts.
In non-life insurance, catastrophes like Hurricane Andrew or the Northridge
Earthquake showed that a worst case catastrophe could ruin the whole
insurance world. It was observed that the financial markets could easily bear
the risk. One therefore tries to securitize catastrophe insurance contracts
via the financial markets. The first products to appear on the market were CAT
futures and CAT options. Here an option is written with catastrophe losses or
a catastrophe index (like the PCS index) as underlying. Another type of
securitization products are CAT bonds. Here bonds are issued where the coupon
will become void whenever a certain well-specified event occurs. In addition,
the principal itself can be at risk. The pricing problem with these CAT
products is that markets are incomplete because the underlying is not a
traded asset. Another problem is the lack of realistic models. We will
discuss how to model the underlying, and how prices could be obtained. We
also discuss how an insurer can hedge an insurance portfolio using these
securitization products.
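As a concrete illustration of the CAT bond mechanics, here is a hedged sketch that is not taken from the lecture notes: the Poisson catastrophe arrivals, Pareto claim severities, trigger level, coupon, and principal haircut are all invented, and the expectation is taken under the physical measure, which deliberately sidesteps the incompleteness issue just described.

```python
import math
import random

def poisson(lam):
    """Knuth's algorithm for a Poisson(lam) draw."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def cat_bond_price(face=100.0, coupon=0.08, years=3, r=0.04,
                   lam=0.5, alpha=1.5, trigger=8.0,
                   n_paths=20000, seed=1):
    """Discounted expected payoff of a stylised CAT bond (toy parameters):
    coupons stop, and the principal is halved, once aggregate catastrophe
    losses breach the trigger level."""
    random.seed(seed)
    total = 0.0
    for _ in range(n_paths):
        agg, pv, triggered = 0.0, 0.0, False
        for t in range(1, years + 1):
            # yearly catastrophe losses: Poisson count, Pareto severities
            agg += sum(random.paretovariate(alpha) for _ in range(poisson(lam)))
            if agg > trigger:
                triggered = True
            if not triggered:          # coupon void from the trigger year on
                pv += coupon * face * math.exp(-r * t)
        principal = 0.5 * face if triggered else face
        pv += principal * math.exp(-r * years)
        total += pv
    return total / n_paths

print(round(cat_bond_price(), 2))
```

In a realistic setting the choice of pricing measure, the model for the loss index, and the basis risk between the index and the insurer's own portfolio all matter; the sketch only shows the cash-flow structure.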
References
[1] Hanspeter Schmidli, On the Interplay
between Insurance and Finance, lecture notes for the 4th Winter School on
Financial Mathematics
[2] Hanspeter Schmidli, Modelling
PCS options via individual indices, CAF Working Paper nr. 157
[3] Claus Vorm Christensen, Securitization of
insurance risk, PhD thesis, University of Aarhus
Special invited lectures
Damir Filipovic:
Risk-based solvency testing for insurers
We discuss recent developments in risk-based insurance regulation. The
European Solvency II framework proposes three pillars, the first of which
is about risk-bearing capital requirements. A particular focus of this
talk is on the Swiss Solvency Test for insurers.
References
[1] Risk-Based Solvency Testing for Insurers (slides of the lecture).
Marco Frittelli: On utility maximization in incomplete markets
Expected utility maximization in incomplete continuous-time stochastic
markets is a very well known problem that received renewed impetus in
the mid-Eighties, when the so-called duality approach to the
problem was first developed.
During the past twenty years the theory has constantly improved, but one
case has been left aside: precisely the situation examined in this talk,
where the semi-martingale X, describing the price evolution of a finite
number of assets, may be unbounded. This is a non-trivial extension, from a
mathematical as well as a financial point of view.
In fact, in highly risky markets (i.e. with unbounded trading losses:
think of X as a compound Poisson process on a finite horizon, with
unbounded jumps) the traditional approach to the problem leads to a trivial
maximization: the optimal choice for the agent would be to invest the
initial endowment entirely in the risk-free asset.
However, it could happen that some of the investors are willing to take a
greater risk: mathematically speaking, they accept trading strategies that
may lead to unbounded losses. This risk-taking attitude gives them the
concrete possibility of increasing their expected utility from terminal
wealth.
We also show that the super-martingale property of the optimal portfolio
process holds true even in the non-locally bounded case.
As it is widely known, the utility maximization problem is linked to
derivative pricing through the so-called indifference pricing technique.
Such a technique is far from being a theoretical speculation, since it is
currently used by financial institutions to price new and/or illiquid
derivatives. The results presented here allow one to tackle this problem in
the general case of a not necessarily locally bounded semi-martingale price
process.
References
[1] On Utility Maximization in Incomplete Markets (slides of the lecture).
[2] Bibliography.
Farshid Jamshidian: Numeraire-invariant option pricing & american, bermudan, and
trigger stream rollover
Part I proposes a numeraire-invariant
option pricing framework. It defines an option, its price
process, and such notions as option indistinguishability and
equivalence, domination, payoff process, trigger option, and
semipositive
option. It develops some of their basic properties, including
price transitivity law, indistinguishability results, convergence results,
and, in relation to nonnegative arbitrage, characterizations of
semipositivity and consequences thereof. These are applied in Part
II to study the Snell envelope and american options. The
measurability and right-continuity of the former are established
in general. The american option is then defined, and its pricing
formula (for all times) is presented. Applying a concept of a
domineering numeraire for superclaims derived from (the
additive) Doob-Meyer decomposition, minimax duality formulae are
given which resemble, though differ from, those obtained by Rogers and by
Haugh and Kogan. Multiplicative Doob-Meyer decomposition is discussed
last. A part III is also envisaged.
References
[1] Farshid Jamshidian, Numeraire-invariant option pricing
& american, bermudan, and trigger stream rollover, preprint.
[2] Farshid Jamshidian, Numeraire-invariant option pricing (slides of the lecture).
Short lectures
Bart Oldenkamp:
The practice of financial theory
Making progress in financial theory often requires making assumptions that don't hold in practice. Nevertheless, theoretical results can offer relevant insights for practitioners. The challenge for an asset manager is to assess which theoretical results are indeed useful in practice and how they can be used in the design of investment policies and in day-to-day money management. In this talk, I will discuss cases taken from the practice of ABN AMRO Structured Asset Management.
Sophie Ladoucette: Reinsurance of large claims
The large claims reinsurance treaty ECOMOR (excédent du coût moyen relatif),
introduced by Thépaut (1950), has never been very popular and has been largely
neglected by most reinsurers because of its technical complexity. We propose new
mathematical results related to distributional problems of this reinsurance form which
can reopen the discussion on the usefulness of including the largest claims in
the decision making procedure. As such, we also examine more closely potential applications
of extreme value theory to reinsurance.
The ECOMOR treaty Rr(t) is defined via the upper order statistics of a random sample.
In some sense it rephrases the largest claims treaty, another reinsurance treaty of
extreme value type. But it can also be considered as an excess-of-loss treaty with
a random retention determined by the (r + 1)th largest claim related to a specific
portfolio. Specifically, the reinsurer covers the claim amount:
R_r(t) = Σ_{1≤i≤r} X*_{N(t)-i+1} − r X*_{N(t)-r}

for a fixed number r ≥ 1. The quantity R_r(t) is then a function of the r+1
upper order statistics X*_{N(t)-r} ≤ ... ≤ X*_{N(t)} in a randomly indexed
sample X_1, ..., X_{N(t)} of i.i.d. claims which occur up to time epoch t ≥ 0.
These claims determine the accumulated claim amount in the random sum
S_{N(t)} := Σ_{1≤i≤N(t)} X_i. Throughout, we assume the claim number process
{N(t); t ≥ 0} to be a counting process independent of the claim size
process {X_i, i ∈ N*}.
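The definition translates directly into code. A small sketch with invented sample values, which also checks the excess-of-loss reading with the random retention given by the (r+1)-th largest claim:

```python
def ecomor(claims, r):
    """ECOMOR reinsured amount R_r: the sum of the r largest claims minus
    r times the (r+1)-th largest. Equivalently, an excess-of-loss cover
    sum_i max(X_i - M, 0) with random retention M = (r+1)-th largest claim."""
    xs = sorted(claims, reverse=True)
    if len(xs) <= r:
        return float(sum(xs))          # fewer than r+1 claims observed
    retention = xs[r]
    return sum(xs[:r]) - r * retention

claims = [10.0, 7.0, 5.0, 2.0, 1.0]
print(ecomor(claims, 2))                          # (10 + 7) - 2*5 = 7.0
print(sum(max(x - 5.0, 0.0) for x in claims))     # same cover: 7.0
```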
In a first part, we are interested in the asymptotic relation between the tail of the
distribution F of a claim size X and that of the quantity Rr(t). We get accurate
asymptotic equivalences and asymptotic bounds. Also, in the sub-exponential case, we
give a result showing the interplay between the accumulated claim amount SN(t) and the
reinsured quantity Rr(t). In a second part, we turn to the ratio of the quantities Rr(t)
and the accumulated claim amount SN(t). We give conditions that imply a dominant
influence of the quantities Rr(t) on this sum. Finally, we touch on the question of
convergence in distribution for some quantities Rr(t). We get precise first and second
order large deviation results for the case where the claim size distribution F belongs
to an extremal class, possibly with remainder. The outcomes are illustrated with a
number of simulations. (joint work with J.L. Teugels)
References
[1] Ladoucette S.A., Teugels J.L. (2004): Reinsurance of large claims, EURANDOM Report 2004-025 (abstract, text), Technical University of Eindhoven, The Netherlands.
David Schrager: Affine Stochastic Mortality
We propose a new model for stochastic mortality. The model is based on the literature on affine term
structure models. It satisfies three important requirements for application in practice: analytical tractability,
clear interpretation of the factors and compatibility with financial option pricing models. We test the model
using data on Dutch mortality rates. Furthermore we discuss the specification of a market price of
mortality risk and apply the model to the pricing of a Guaranteed Annuity Option and the calculation of
required Economic Capital for mortality risk.
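To illustrate the kind of tractability the abstract refers to (with invented parameter values, not those estimated from the Dutch data): in a one-factor affine model with, say, a Vasicek-type mortality intensity dμ_t = a(b − μ_t) dt + σ dW_t, the survival probability is exponentially affine in the current intensity, exactly like a zero-coupon bond price in the corresponding short-rate model.

```python
import math

def survival_probability(mu, tau, a, b, sigma):
    """Vasicek-type affine survival probability exp(A(tau) - B(tau) * mu):
    the same closed form as the Vasicek zero-coupon bond price, with the
    mortality intensity mu playing the role of the short rate."""
    B = (1.0 - math.exp(-a * tau)) / a
    A = (b - sigma**2 / (2 * a**2)) * (B - tau) - sigma**2 * B**2 / (4 * a)
    return math.exp(A - B * mu)

# e.g. current intensity 0.01, 10-year horizon (toy parameters)
p = survival_probability(mu=0.01, tau=10.0, a=0.1, b=0.05, sigma=0.005)
print(round(p, 4))   # roughly 0.78 with these toy parameters
```

A square-root (CIR-type) specification keeps the same exponential-affine form while ruling out negative intensities.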
References
[1] David F. Schrager, Affine Stochastic Mortality, preprint.
Alex Zilber: FX barriers with smile dynamics
Our mandate in this work has been to isolate the features of smile consistent models
that are most relevant to the pricing of barrier options. We consider the two
classical approaches of stochastic and (parametric) local volatility. Although neither
has been particularly successful in practice, their differing qualitative features
serve our exposition. By constructing approximate static hedges we are able to
closely mimic their prices. The only information we require from the models,
other than the initial vanilla market to which they are calibrated, is their conditional
forward smile along the barrier. This strongly suggests that realistic
forward smile dynamics are of paramount importance when assessing a model to
be used in pricing barrier options. (joint work with Glyn Baker and Reimer Beneder)
References
[1] Glyn Baker, Reimer Beneder and Alex Zilber, FX Barriers with Smile Dynamics, preprint.