The Python Quants



DX Analytics

A Pythonic Library for Simulation-Based Derivatives Analytics

Dr. Yves J. Hilpisch

The Python Quants GmbH

www.pythonquants.com

yves@pythonquants.com

@dyjh

PyData London – 23 February 2014

Overview

This talk focuses on

  • brief history of (relevant) finance research
  • analytical vs. numerical option pricing
  • Python for derivatives analytics
  • overview of DX Analytics
  • selected use cases for DX Analytics

A brief bio:

  • Managing Director of The Python Quants GmbH
  • Founder of Visixion GmbH – The Python Quants
  • Lecturer Mathematical Finance at Saarland University
  • Book "Derivatives Analytics with Python"
  • Book "Python for Finance" (in July 2014, O'Reilly)
  • Ph.D. in Mathematical Finance
  • Graduate in Business Administration
  • Martial Arts Enthusiast

Visit www.hilpisch.com.

You can find these slides here:

http://hilpisch.com/YH_DX_Analytics_London.html

A Brief History of Option Pricing Research

Important Milestones (I), from a rather personal perspective.

  • Bachelier (1900) – Option Pricing with Arithmetic Brownian Motion
  • Einstein (1905) – Rigorous Theory of Brownian Motion
  • Samuelson (1950s) – Systematic Approaches to Option Pricing
  • Black Scholes Merton (1973) – Geometric Brownian Motion, Analytical Option Pricing Formula
  • Merton (1976) – Jump Diffusion, Option Pricing with Jumps
  • Boyle (1977) – Monte Carlo for European options
  • Cox Ross Rubinstein (1979) – Binomial Option Pricing

Important Milestones (II), from a rather personal perspective.

  • Harrison Kreps Pliska (1979/1981) – General Risk-Neutral Valuation, Fundamental Theorem of Asset Pricing
  • Cox Ingersoll Ross (1985) – Intertemporal General Equilibrium, Short Rates with Square-Root Diffusion
  • Heston (1993) – Stochastic Volatility, Fourier-based Option Pricing
  • Bates (1996) – Heston (1993) plus Merton (1976)
  • Bakshi Cao Chen (1997) – Bates (1996) plus Cox Ingersoll Ross (1985)
  • Carr Madan (1999) – Using Fast Fourier Transforms for Option Pricing
  • Longstaff Schwartz (2001) – Monte Carlo for American Options
  • Gamba (2003) – Monte Carlo for (Complex) Portfolios of Interdependent Options

Analytical vs. Numerical Option Pricing

The Famous Black-Scholes-Merton Formula

Black Scholes Merton (1973) consider a financial model with a geometric Brownian motion representing the value of the only traded, risky asset:

\(dS_t = r S_t dt + \sigma S_t dZ_t\)

with \(S\) the stock price, \(r\) the risk-free rate, \(\sigma\) the volatility and \(Z\) a Brownian motion.

The other traded asset is a risk-less bond yielding the risk-free interest rate:

\(dB_t = r B_t dt\)

In this setting, Black Scholes Merton (1973) derive a closed-form solution for the price of a European call option as follows:

\[ \begin{eqnarray*} C(S_t, K, T, r, \sigma) &=& S_{t} \cdot \mathbf{N}(d_{1})-e^{-r(T-t)} \cdot K \cdot \mathbf{N}(d_{2}) \\ \mathbf{N}(d)&=&\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{d}e^{-\frac{1}{2}x^{2}}dx \\ d_{1}&=&\frac{\log \frac{S_{t}}{K}+(r+\frac{\sigma^{2}}{2})(T-t)}{\sigma \sqrt{T-t}} \\ d_{2}&=&\frac{\log \frac{S_{t}}{K}+(r-\frac{\sigma^{2}}{2})(T-t)}{\sigma \sqrt{T-t}} \end{eqnarray*} \]

Here, \(\mathbf{N}\) stands for the cdf of the standard normal distribution.

In [1]:
from math import log, sqrt, exp
from scipy import stats

class call_option_analytical(object):
    ''' Class for European call option valuation in BSM model.
    Analytical formula. '''
    def __init__(self, S0, K, T, r, sigma):
        self.S0 = float(S0)
        self.K = K
        self.T = T
        self.r = r
        self.sigma = sigma
    def present_value(self):
        d1 = ((log(self.S0 / self.K)
               + (self.r + 0.5 * self.sigma ** 2) * self.T)
              / (self.sigma * sqrt(self.T)))
        d2 = ((log(self.S0 / self.K)
               + (self.r - 0.5 * self.sigma ** 2) * self.T)
              / (self.sigma * sqrt(self.T)))
        value = (self.S0 * stats.norm.cdf(d1, 0.0, 1.0)
                 - self.K * exp(-self.r * self.T) * stats.norm.cdf(d2, 0.0, 1.0))
        return value

Let us generate a benchmark example for a European call option valuation.

In [2]:
S0 = 100
K = 100
T = 1.0
r = 0.05
sigma = 0.2
In [3]:
call = call_option_analytical(S0, K, T, r, sigma)
In [4]:
call.present_value()
Out[4]:
10.450583572185565

One of the undeniable advantages of the analytical pricing formula is its speed of numerical evaluation.

In [5]:
%%time
import numpy as np
values = []
for K in np.linspace(80, 120, 5000):
    call.K = K
    values.append(call.present_value())
K = 100.
CPU times: user 1.21 s, sys: 2.97 ms, total: 1.21 s
Wall time: 1.21 s

The Monte Carlo Simulation Alternative

The Monte Carlo estimator for the European call option's value with strike \(K\) is given by

\(C_0 \approx e^{-rT} \frac{1}{I} \sum_{i=1}^{I} \max[S_T(i)-K, 0]\)

where we can simulate the values of the risky asset at maturity using the log Euler discretization scheme

\(\log S_t = \log S_{t - \Delta t} + (r - 0.5 \sigma^2 ) \Delta t + \sigma \sqrt{\Delta t} z_t\)

with \(z_t\) a standard normally distributed random variable.

In [6]:
import numpy as np

class call_option_monte_carlo(object):
    ''' Class for European call option valuation in BSM model.
    Monte Carlo simulation. '''
    def __init__(self, S0, K, T, r, sigma, M, I):
        self.S0 = float(S0)
        self.K = K; self.T = T
        self.r = r; self.sigma = sigma
        self.M = M; self.I = I
    def generate_paths(self):
        dt = self.T / self.M
        np.random.seed(0)
        ran = np.random.standard_normal((self.M + 1, self.I))
        ran = (ran - ran.mean()) / ran.std()  # moment matching
        paths = self.S0 * np.exp(np.cumsum((self.r - 0.5 * self.sigma ** 2) * dt
                + self.sigma * np.sqrt(dt) * ran, axis=0))
        paths[0] = self.S0
        return paths
    def present_value(self):
        paths = self.generate_paths()
        present_value = (np.exp(-self.r * self.T)
                         * np.sum(np.maximum(paths[-1] - self.K, 0)) / self.I)
        return present_value

A first trial with the Monte Carlo based valuation approach.

In [7]:
call_mcs = call_option_monte_carlo(S0, K, T, r, sigma, M=50, I=10000)
In [8]:
%%time
print call_mcs.present_value()
10.4610933796
CPU times: user 41.2 ms, sys: 9.44 ms, total: 50.6 ms
Wall time: 49.7 ms

Already quite good.

Adding more paths improves the convergence of the MCS estimator only in the limit; for a single run, more paths do not necessarily move the estimate closer to the true value.

In [9]:
%%time
call_mcs.I = 50000  # more paths
print call_mcs.present_value()  
10.6068704869
CPU times: user 261 ms, sys: 39.9 ms, total: 301 ms
Wall time: 300 ms
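The slow convergence is no surprise: the standard error of the estimator only falls like \(1/\sqrt{I}\). A quick check of this, reusing the call_mcs object and its simulated payoffs (a rough sketch, ignoring the effect of the moment matching on the error estimate):

    payoffs = np.exp(-call_mcs.r * call_mcs.T) * np.maximum(
                        call_mcs.generate_paths()[-1] - call_mcs.K, 0)
    for I in (10000, 50000):
        standard_error = payoffs[:I].std() / np.sqrt(I)  # falls like 1 / sqrt(I)
        print("I = %6d, standard error = %.4f" % (I, standard_error))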

However, the number/length of the time intervals is a second parameter with which to improve convergence and precision, albeit at the cost of longer computing times.

In [10]:
%%time
call_mcs.M = 500  # more steps
print call_mcs.present_value()  # more time steps
10.4487236591
CPU times: user 2.32 s, sys: 715 ms, total: 3.04 s
Wall time: 3.11 s

Some General Considerations

Potential problems with analytical formulae for pricing/valuation:

  • black box
  • limited/specific use cases
  • no consistent value aggregation
  • no consistent risk aggregation/netting
  • no full distribution (only point estimate)
  • no incorporation of correlation

Sometimes significant advantages of analytical formulae for pricing/valuation:

  • often easy to understand and apply (without knowing about the theory behind it)
  • in general high speed for numerical evaluations
  • low technical requirements (computing power, memory)
  • implementable on "every" machine
  • easy testing/maintenance of implemented formulae

Alleged drawbacks of alternative numerical methods (such as simulation):

  • compute intensive
  • memory intensive
  • accuracy issues (e.g. convergence)
  • conceptual problems (e.g. American exercise)
  • implementation issues (e.g. algorithms, performance, memory)
  • testing/maintenance issues (e.g. test coverage)

On the other hand, many benefits from numerical methods (such as simulation):

  • highly flexible and powerful (any model, any derivative, any portfolio)
  • full distributional view (values, risk, portfolio)
  • consistent value aggregation (path-wise)
  • consistent risk measures (full distribution)
  • consistent estimation of netting effects (path-wise)
  • consistent incorporation of (time-varying) correlations
  • full graphical analysis (no black boxes or corners)

Our class for the Monte Carlo valuation allows for valuable insights "behind the scenes".

In [11]:
import matplotlib.pyplot as plt
%matplotlib inline
plt.figure(figsize=(8, 5))
plt.hist(call_mcs.generate_paths()[-1], bins=30)
plt.grid(True); plt.xlabel('value at maturity'); plt.ylabel('frequency')
Out[11]:
<matplotlib.text.Text at 0x1177141d0>

Advances in Technology

Not too long ago ...

  • desktops, notebooks (maybe connected via Internet)
  • dedicated, individual servers (if at all)
  • single or few cores, little memory
  • low level programming languages (C, Fortran)
  • mainly sequential execution of code
  • (prohibitively) high costs for high performance

Today, we are a bit more lucky ...

  • desktops, notebooks as workstations
  • Cloud-based computing (with custom nodes)
  • many cores, large memory, powerful GPUs
  • high level programming languages (Python)
  • on demand scalability and parallelization
  • low (variable) costs for even large machines

DX Analytics

General ideas and approaches:

  • general risk-neutral valuation ("Global Valuation")
  • Monte Carlo simulation for valuation, Greeks
  • Fourier-based formulae for calibration
  • arbitrary models for stochastic processes
  • single & multi risk derivatives
  • European and American exercise
  • completely implemented in Python
  • hardware not a limiting factor

Global Valuation at its core:

  • non-redundant modeling (e.g. of risk factors)
  • consistent simulation (e.g. given correlations)
  • consistent value and risk aggregation
  • single positions or consistent portfolio view

Python specifics:

  • heavy use of classes and object orientation
  • NumPy ndarray class for numerical data
  • Python dictionaries (mainly) for meta data

Some Simple Examples

In [12]:
from DX_Lib_Valuation import *

Discounting

One of the simplest classes is the one for constant short rate discounting.

In [13]:
r = constant_short_rate('short_rate', 0.05)
In [14]:
time_list = pd.date_range(start='2014/6/1', end='2014/12/31', freq='M').to_pydatetime()
time_list
Out[14]:
array([datetime.datetime(2014, 6, 30, 0, 0),
       datetime.datetime(2014, 7, 31, 0, 0),
       datetime.datetime(2014, 8, 31, 0, 0),
       datetime.datetime(2014, 9, 30, 0, 0),
       datetime.datetime(2014, 10, 31, 0, 0),
       datetime.datetime(2014, 11, 30, 0, 0),
       datetime.datetime(2014, 12, 31, 0, 0)], dtype=object)

The class generates discount factors for a given list of datetime objects.

In [15]:
r.get_discount_factors(time_list)  # unit zero-coupon bond values
Out[15]:
array([[datetime.datetime(2014, 6, 30, 0, 0), 0.97510952647028781],
       [datetime.datetime(2014, 7, 31, 0, 0), 0.97925920727161508],
       [datetime.datetime(2014, 8, 31, 0, 0), 0.98329184073585585],
       [datetime.datetime(2014, 9, 30, 0, 0), 0.98747634223321412],
       [datetime.datetime(2014, 10, 31, 0, 0), 0.99154281422884716],
       [datetime.datetime(2014, 11, 30, 0, 0), 0.99576242860877573],
       [datetime.datetime(2014, 12, 31, 0, 0), 1.0]], dtype=object)

Alternatively, you can provide relative points in time as fractions of years.

In [16]:
time_list_delta = get_year_deltas(time_list)
In [17]:
r.get_discount_factors(time_list_delta, dtobjects=False)
Out[17]:
array([[ 0.        ,  0.97510953],
       [ 0.08493151,  0.97925921],
       [ 0.16986301,  0.98329184],
       [ 0.25205479,  0.98747634],
       [ 0.3369863 ,  0.99154281],
       [ 0.41917808,  0.99576243],
       [ 0.50410959,  1.        ]])
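These discount factors are nothing more than continuously compounded unit zero-coupon bond values relative to the final date, i.e. \(D(t) = e^{-r (T - t)}\); a quick NumPy cross-check against the year fractions from above:

    import numpy as np
    # discounting back from the final date (0.50410959 years) at r = 5%
    np.exp(-0.05 * (time_list_delta[-1] - np.array(time_list_delta)))
    # should reproduce the second column above: 0.97510953, ..., 1.0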

Market Environment

For almost all (model & valuation) classes, we need to define a market environment.

In [18]:
me_jd = market_environment('jd', dt.datetime(2014, 7, 1))
In [19]:
type(me_jd)
Out[19]:
DX_Lib_Frame.market_environment

A market environment mainly collects necessary constants and curves.

In [20]:
me_jd.add_constant('initial_value', 100.)
me_jd.add_constant('volatility', 0.2)
me_jd.add_constant('lambda', 0.7)  # jump probability
me_jd.add_constant('mu', -0.8)  # average jump size
me_jd.add_constant('delta', 0.2)  # volatility of jump
me_jd.add_constant('currency', 'USD')  # currency
me_jd.add_constant('frequency', 'B')  # pandas date_range frequency string
me_jd.add_constant('paths', 10000)  # number of paths
me_jd.add_constant('final_date', dt.datetime(2014, 12, 31))
In [21]:
me_jd.add_curve('discount_curve', r)  # instance of discount class
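Conceptually, a market environment is little more than a named container wrapped around a few dictionaries (cf. the Python specifics above). A minimal sketch of the idea, deliberately not the DX Analytics implementation itself:

    class simple_market_environment(object):
        ''' Illustrative container for constants and curves. '''
        def __init__(self, name, pricing_date):
            self.name = name
            self.pricing_date = pricing_date
            self.constants = {}
            self.curves = {}
        def add_constant(self, key, constant):
            self.constants[key] = constant
        def get_constant(self, key):
            return self.constants[key]
        def add_curve(self, key, curve):
            self.curves[key] = curve
        def get_curve(self, key):
            return self.curves[key]
        def add_environment(self, env):
            # add all constants and curves of another environment
            self.constants.update(env.constants)
            self.curves.update(env.curves)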

Model

Equipped with an instance of a market environment, you can initialize a model class, e.g. for a jump diffusion.

In [22]:
jd = jump_diffusion('jd_obj', me_jd)
In [23]:
paths = jd.get_instrument_values(fixed_seed=True)
In [24]:
paths[-1]  # all values at final date
Out[24]:
array([ 117.52748895,  105.12635975,   88.36302764, ...,   99.32911215,
         92.31849354,   60.24410991])
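The simulated paths follow a Merton (1976)-type jump diffusion. For illustration, a standalone NumPy sketch of one common Euler discretization of such a process (simplified relative to what the library does internally):

    import numpy as np

    def merton_jd_paths(S0, r, sigma, lamb, mu, delta, T, M, I):
        ''' Euler scheme for a Merton (1976) jump diffusion (illustrative). '''
        dt = T / M
        rj = lamb * (np.exp(mu + 0.5 * delta ** 2) - 1)  # drift correction for jumps
        paths = np.zeros((M + 1, I))
        paths[0] = S0
        for t in range(1, M + 1):
            z = np.random.standard_normal(I)
            nj = np.random.poisson(lamb * dt, I)          # number of jumps per interval
            jump = (np.exp(mu + delta * np.random.standard_normal(I)) - 1) * nj
            paths[t] = paths[t - 1] * (np.exp((r - rj - 0.5 * sigma ** 2) * dt
                                              + sigma * np.sqrt(dt) * z) + jump)
        return paths

    # e.g. roughly the me_jd parametrization over half a year of business days
    # merton_jd_paths(100., 0.05, 0.2, lamb=0.7, mu=-0.8, delta=0.2, T=0.5, M=125, I=10000)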

Since you have access to all the data, detailed plots are only a couple of lines of code away.

In [25]:
import matplotlib.pyplot as plt
%matplotlib inline
plt.figure(figsize=(8, 5))
plt.plot(jd.time_grid, paths[:, 10:20])
plt.grid(True); plt.xlabel('date'); plt.ylabel('index level')
Out[25]:
<matplotlib.text.Text at 0x10d71a450>

Options

For valuation purposes, a number of classes are available, e.g. for European options with "arbitrary" payoffs. Again, a market environment is needed.

In [26]:
me_call = market_environment('call', dt.datetime(2014, 7, 1))
In [27]:
me_call.add_constant('maturity', dt.datetime(2014, 12, 31))
me_call.add_constant('currency', 'USD')
In [28]:
call = valuation_mcs_european('call', me_call, jd, payoff_func='np.maximum(maturity_value - 100., 0)')
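The payoff is handed over as a Python string that gets evaluated against the simulated values at maturity; conceptually, the valuation boils down to something like the following sketch (an illustration of the mechanism only, with the discount factor hard-coded for roughly half a year at 5%):

    import numpy as np

    payoff_func = 'np.maximum(maturity_value - 100., 0)'
    maturity_value = paths[-1]              # simulated jump diffusion values at maturity
    payoff = eval(payoff_func)              # evaluate the payoff string path-wise
    discount_factor = np.exp(-0.05 * 0.5)   # approx. half a year at the 5% short rate
    present_value = discount_factor * payoff.mean()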

The valuation object provides a number of methods, e.g. for present value and Greeks estimation.

In [29]:
call.present_value()
Out[29]:
4.808783
In [30]:
call.delta()
Out[30]:
0.41847999999999885
In [31]:
call.vega()
Out[31]:
19.11670000000001
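The Greeks are estimated numerically, i.e. by re-running the valuation with a bumped input and forming a difference quotient. Schematically, a Delta estimate could be obtained as follows (a forward-difference sketch; the library's exact scheme and step sizes are not spelled out here):

    def numerical_delta(val_obj, initial_value, interval=0.5):
        ''' Forward-difference Delta for a valuation object exposing
        update() and present_value() (illustrative helper, not a library method).
        In practice the same random numbers (fixed seed) should be reused
        for both valuations to reduce the noise in the estimate. '''
        value_left = val_obj.present_value()
        val_obj.update(initial_value=initial_value + interval)
        value_right = val_obj.present_value()
        val_obj.update(initial_value=initial_value)  # reset to the original level
        return (value_right - value_left) / interval

    # e.g. numerical_delta(call, initial_value=100.)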

Let us compile a couple more statistics for the call option.

In [32]:
%%time
s_list = np.arange(80., 120.1, 5.)
pv = []; de = []; ve = []
for s in s_list:
    call.update(initial_value=s)
    pv.append(call.present_value())
    de.append(call.delta())
    ve.append(call.vega())
call.update(initial_value=100.)
CPU times: user 7.63 s, sys: 526 ms, total: 8.15 s
Wall time: 8.17 s

The DX Analytics library provides a couple of standardized plotting functions.

In [33]:
plot_option_stats(s_list, pv, de, ve)

Multi-Risk, Multi-Derivatives Portfolio Case Study

We now want to implement a more complex case with multiple, correlated risk factors and multiple derivatives.

The Market

The discounting class is central for risk-neutral pricing.

In [34]:
# constant short rate
r = constant_short_rate('r', 0.06)

Our market consists of three assets.

In [35]:
# market environments
me_gbm = market_environment('gbm', dt.datetime(2014, 1, 1))
me_jd = market_environment('jd', dt.datetime(2014, 1, 1))
me_sv = market_environment('sv', dt.datetime(2014, 1, 1))

Market environment for geometric Brownian motion.

In [36]:
# geometric Brownian motion
me_gbm.add_constant('initial_value', 36.)
me_gbm.add_constant('volatility', 0.2) 
me_gbm.add_constant('currency', 'EUR')
me_gbm.add_constant('model', 'gbm')

Market environment for jump diffusion.

In [37]:
# jump diffusion
me_jd.add_constant('initial_value', 36.)
me_jd.add_constant('volatility', 0.2)
me_jd.add_constant('lambda', 0.5)
    # probability for jump p.a.
me_jd.add_constant('mu', -0.75)
    # expected jump size [%]
me_jd.add_constant('delta', 0.1)
    # volatility of jump
me_jd.add_constant('currency', 'EUR')
me_jd.add_constant('model', 'jd')

Market environment for stochastic volatility model.

In [38]:
# stochastic volatility model
me_sv.add_constant('initial_value', 36.)
me_sv.add_constant('volatility', 0.2)
me_sv.add_constant('vol_vol', 0.1)
me_sv.add_constant('kappa', 2.5)
me_sv.add_constant('theta', 0.4)
me_sv.add_constant('rho', -0.5)
me_sv.add_constant('lambda', 0.0)
    # probability for jump p.a.
me_sv.add_constant('mu', 0.0)
    # expected jump size [%]
me_sv.add_constant('delta', 0.0)
    # volatility of jump
me_sv.add_constant('currency', 'EUR')
me_sv.add_constant('model', 'svjd')

You can share common constants and curves by adding whole market environment objects to another one.

In [39]:
# valuation environment
val_env = market_environment('val_env', dt.datetime(2014, 1, 1))
val_env.add_constant('paths', 5000)
val_env.add_constant('frequency', 'W')
val_env.add_curve('discount_curve', r)
val_env.add_constant('starting_date', dt.datetime(2014, 1, 1))
val_env.add_constant('final_date', dt.datetime(2014, 12, 31))
In [40]:
# add valuation environment to market environments
me_gbm.add_environment(val_env)
me_jd.add_environment(val_env)
me_sv.add_environment(val_env)

Our market is now complete. What is still missing are the correlations between the risk factors.

In [41]:
underlyings = {'gbm' : me_gbm, 'jd' : me_jd, 'sv' : me_sv}
correlations = [['gbm', 'jd', 0.5], ['jd', 'sv', -0.75]]
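Internally, correlated paths are typically generated by applying the Cholesky factor of the correlation matrix to independent standard normal draws. A minimal NumPy sketch of that idea (an assumption about the mechanism, not DX Analytics code; the unspecified gbm–sv pair is set to zero):

    import numpy as np

    # correlation matrix for the risk factors gbm, jd, sv
    corr = np.array([[1.0,  0.5,   0.0],
                     [0.5,  1.0,  -0.75],
                     [0.0, -0.75,  1.0]])
    chol = np.linalg.cholesky(corr)

    ran = np.random.standard_normal((3, 5000))  # independent draws, one row per factor
    corr_ran = np.dot(chol, ran)                # correlated draws
    np.corrcoef(corr_ran)                       # should be close to corr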

The Derivatives

American Put Option

First, the market model for the underlying.

In [42]:
gbm = geometric_brownian_motion('gbm_obj', me_gbm)
In [43]:
me_put = market_environment('put', dt.datetime(2014, 1, 1))
me_put.add_constant('maturity', dt.datetime(2014, 12, 31))
me_put.add_constant('strike', 40.)
me_put.add_constant('currency', 'EUR')
me_put.add_environment(val_env)

Then the model for the option.

In [44]:
am_put = valuation_mcs_american('am_put', mar_env=me_put, underlying=gbm,
                       payoff_func='np.maximum(strike - instrument_values, 0)')
In [45]:
am_put.present_value(fixed_seed=False, bf=5)
Out[45]:
4.495371
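American exercise in the Monte Carlo world is handled by regression-based backward induction in the spirit of Longstaff-Schwartz (2001). A compact sketch of the core algorithm for an American put (illustrative only; that the bf parameter above controls the number of basis functions is an assumption):

    import numpy as np

    def lsm_american_put(paths, K, r, T, bf=5):
        ''' Least-squares Monte Carlo (Longstaff-Schwartz) for an American put.
        paths: ndarray of shape (M + 1, I) with simulated index levels. '''
        M = paths.shape[0] - 1
        dt = T / M
        df = np.exp(-r * dt)                      # one-period discount factor
        h = np.maximum(K - paths, 0)              # inner (exercise) values
        V = h[-1]                                 # option values at maturity
        for t in range(M - 1, 0, -1):
            # regress discounted continuation values on polynomial basis functions
            reg = np.polyfit(paths[t], V * df, bf)
            C = np.polyval(reg, paths[t])         # estimated continuation values
            V = np.where(h[t] > C, h[t], V * df)  # exercise where it pays off
        return df * V.mean()

    # e.g. with the gbm paths from above:
    # lsm_american_put(gbm.get_instrument_values(), K=40., r=0.06, T=1.0)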

European Maximum Call on 2 Assets

We need a second underlying for the Maximum Call option.

In [46]:
jd = jump_diffusion('jd_obj', me_jd)
In [47]:
me_max_call = market_environment('max_call', dt.datetime(2014, 1, 1))
me_max_call.add_constant('maturity', dt.datetime(2014, 9, 15))
me_max_call.add_constant('currency', 'EUR')
me_max_call.add_environment(val_env)

Payoff definitions in this case must specify the underlying.

In [48]:
payoff_call = "np.maximum(np.maximum(maturity_value['gbm'], maturity_value['jd']) - 34., 0)"

We use a subset of the whole market for the valuation.

In [49]:
assets = {'gbm' : me_gbm, 'jd' : me_jd}
asset_corr = [correlations[0]]
In [50]:
max_call = valuation_mcs_multi_european('max_call', me_max_call, assets, asset_corr,
        payoff_func=payoff_call)
In [51]:
max_call.present_value(fixed_seed=False)
Out[51]:
5.69469

Of course, you now have two different Delta values.

In [52]:
max_call.delta('gbm')
Out[52]:
0.5525599999999997
In [53]:
max_call.delta('jd')
Out[53]:
0.32917000000000307

Let us generate a somewhat broader overview of the option statistics. We start with the present value of the option given different initial values for the two underlyings.

In [54]:
asset_1 = np.arange(28., 46.1, 2.)
asset_2 = asset_1
a_1, a_2 = np.meshgrid(asset_1, asset_2)
value = np.zeros_like(a_1)
In [55]:
%%time
for i in range(np.shape(value)[0]):
    for j in range(np.shape(value)[1]):
        max_call.update('gbm', initial_value=a_1[i, j])
        max_call.update('jd', initial_value=a_2[i, j])
        value[i, j] = max_call.present_value()
CPU times: user 4.2 s, sys: 185 ms, total: 4.38 s
Wall time: 4.4 s

And the graphical representation of the option present values.

In [56]:
plot_greeks_3d([a_1, a_2, value], ['gbm', 'jd', 'present value'])

Now the surfaces for the Greeks. Delta values first.

In [57]:
delta_1 = np.zeros_like(a_1)
delta_2 = np.zeros_like(a_1)
In [58]:
%%time
for i in range(np.shape(delta_1)[0]):
    for j in range(np.shape(delta_1)[1]):
        max_call.update('gbm', initial_value=a_1[i, j])
        max_call.update('jd', initial_value=a_2[i, j])
        delta_1[i, j] = max_call.delta('gbm')
        delta_2[i, j] = max_call.delta('jd')
CPU times: user 11.4 s, sys: 517 ms, total: 11.9 s
Wall time: 12 s

The surface for the gbm Delta.

In [59]:
plot_greeks_3d([a_1, a_2, delta_1], ['gbm', 'jd', 'delta gbm'])

The surface for the jd Delta.

In [60]:
plot_greeks_3d([a_1, a_2, delta_2], ['gbm', 'jd', 'delta jd'])

Finally, the values for the two Vega dimensions.

In [61]:
vega_1 = np.zeros_like(a_1)
vega_2 = np.zeros_like(a_1)
In [62]:
%%time
for i in range(np.shape(vega_1)[0]):
    for j in range(np.shape(vega_1)[1]):
        max_call.update('gbm', initial_value=a_1[i, j])
        max_call.update('jd', initial_value=a_2[i, j])
        vega_1[i, j] = max_call.vega('gbm')
        vega_2[i, j] = max_call.vega('jd')
CPU times: user 10.7 s, sys: 510 ms, total: 11.2 s
Wall time: 11.2 s

The surface for the gbm Vega.

In [63]:
plot_greeks_3d([a_1, a_2, vega_1], ['gbm', 'jd', 'vega gbm'])

The surface for the jd Vega.

In [64]:
plot_greeks_3d([a_1, a_2, vega_2], ['gbm', 'jd', 'vega jd'])

American Minimum Put on 2 Assets

Finally, the third underlying is needed.

In [65]:
sv = jump_diffusion('sv_obj', me_sv)
In [66]:
me_min_put = market_environment('min_put', dt.datetime(2014, 1, 1))
me_min_put.add_constant('maturity', dt.datetime(2014, 6, 17))
me_min_put.add_constant('currency', 'EUR')
me_min_put.add_environment(val_env)

This time, the derivative is a Minimum put option with American exercise.

In [67]:
payoff_put = "np.maximum(32. - np.minimum(instrument_values['jd'], instrument_values['sv']), 0)"
In [68]:
assets = {'jd' : me_jd, 'sv' : me_sv}
asset_corr = [correlations[1]]
asset_corr
Out[68]:
[['jd', 'sv', -0.75]]
In [69]:
min_put = valuation_mcs_multi_american('min_put', val_env=me_min_put, assets=assets,
                       correlations=asset_corr, payoff_func=payoff_put)
In [70]:
min_put.present_value(fixed_seed=True)
Out[70]:
4.620787

Again, two different values for the Greeks.

In [71]:
min_put.vega('jd')
Out[71]:
4.802200000000045
In [72]:
min_put.vega('sv')
Out[72]:
2.227300000000021

Derivatives Positions

The derivative class specifies a whole derivatives position.

In [73]:
am_put_pos = derivative('am_put_pos', 2, ['gbm'], me_put, 'American',
                      'np.maximum(instrument_values - 36., 0)')
In [74]:
max_call_pos = derivative('max_call_pos', 3, ['gbm', 'jd'], me_max_call,
                          'Multi European', payoff_call)
In [75]:
min_put_pos = derivative('min_put_pos', 5, ['sv', 'jd'], me_min_put,
                         'Multi American', payoff_put)

Portfolio

Everything is now in place to compose a portfolio from the derivatives positions defined before.

In [76]:
positions = {'am_put_pos' : am_put_pos, 'max_call_pos' : max_call_pos,
             'min_put_pos' : min_put_pos}
In [77]:
port = portfolio('portfolio', positions, val_env, underlyings, correlations)

DX Analytics allows for an integrated, consistent valuation of the portfolio, taking into account e.g. the correlations between the underlyings.

In [78]:
%%time
port.get_statistics()
CPU times: user 3.5 s, sys: 132 ms, total: 3.64 s
Wall time: 3.61 s

Out[78]:
   position      name          quantity     value currency  pos_value  pos_delta                         pos_vega
0  max_call_pos  max_call_pos         3  5.677053      EUR  17.031159  {'jd': 0.9873, 'gbm': 1.68567}    {'jd': 18.3546, 'gbm': 25.8762}
1  am_put_pos    am_put_pos           2  3.770114      EUR   7.540228  1.30004                           26.019
2  min_put_pos   min_put_pos          5  4.617262      EUR  23.086310  {'jd': -0.88965, 'sv': -0.84235}  {'jd': 25.251, 'sv': 9.617}

3 rows × 8 columns
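As a quick sanity check, the position values are simply quantity times single-instrument value, e.g. 3 × 5.677053 = 17.031159 for the maximum call position; for positions with several underlyings, pos_delta and pos_vega are reported per risk factor.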

Let's have a look at two paths of two correlated underlyings.

In [79]:
path_no = 20
paths1 = port.underlying_objects['gbm'].get_instrument_values()[:, path_no]
paths2 = port.underlying_objects['jd'].get_instrument_values()[:, path_no]

plt.figure(figsize=(9, 5))
plt.plot(port.time_grid, paths1, 'r', label='gbm')
plt.plot(port.time_grid, paths2, 'b', label='jd')
plt.gcf().autofmt_xdate()
plt.legend(loc=0); plt.grid(True)
# positive correlation case

The architecture and approach lend themselves pretty well to risk management, scenario analysis and stress testing. For example, you could consider an alternative set of correlations.

In [80]:
correlations = [['gbm', 'jd', -0.7], ['jd', 'sv', 0.7]]
In [81]:
port = portfolio('portfolio', positions, val_env, underlyings, correlations)
In [82]:
port.get_statistics()
Out[82]:
   position      name          quantity     value currency  pos_value  pos_delta                         pos_vega
0  max_call_pos  max_call_pos         3  6.833435      EUR  20.500305  {'jd': 1.19421, 'gbm': 1.84566}   {'jd': 24.5532, 'gbm': 31.0476}
1  am_put_pos    am_put_pos           2  3.780468      EUR   7.560936  1.27656                           27.9452
2  min_put_pos   min_put_pos          5  4.182950      EUR  20.914750  {'jd': -0.53865, 'sv': -1.0651}   {'jd': -0.392, 'sv': 4.795}

3 rows × 8 columns

Let's have a look at the same two paths of two correlated underlyings.

In [83]:
path_no = 20
paths1 = port.underlying_objects['gbm'].get_instrument_values()[:, path_no]
paths2 = port.underlying_objects['jd'].get_instrument_values()[:, path_no]

plt.figure(figsize=(9, 5))
plt.plot(port.time_grid, paths1, 'r', label='gbm')
plt.plot(port.time_grid, paths2, 'b', label='jd')
plt.gcf().autofmt_xdate()
plt.legend(loc=0); plt.grid(True)
# negative correlation case

Market-based Pricing of VSTOXX Derivatives

Setting Everything Up

The necessary imports.

In [84]:
import numpy as np
import pandas as pd
import datetime as dt
import matplotlib.pyplot as plt
%matplotlib inline
from DX_Lib_Calibration import *

To illustrate the calibration of the Square Root Jump Diffusion model to VSTOXX call option data, we use real data from 28 December 2012.

In [85]:
term_structure = np.load('vstoxx_term_structure_28122012.npy')
option_data = np.load('vstoxx_option_data_28122012.npy')
In [86]:
len(option_data)  # number of call options
Out[86]:
39
In [87]:
option_data[:5]  # maturity, strike, quote
Out[87]:
array([[datetime.datetime(2013, 1, 16, 0, 0), 14.0, 7.45],
       [datetime.datetime(2013, 1, 16, 0, 0), 18.0, 3.6],
       [datetime.datetime(2013, 1, 16, 0, 0), 22.0, 1.4],
       [datetime.datetime(2013, 1, 16, 0, 0), 25.0, 0.75],
       [datetime.datetime(2013, 1, 16, 0, 0), 28.0, 0.45]], dtype=object)

The initialization for the calibration procedure.

In [88]:
r = constant_short_rate('r', 0.02)

srjd_me = market_environment('srjd_env', dt.datetime(2012, 12, 28))
srjd_me.add_curve('discount_curve', r)
srjd_me.add_curve('term_structure', term_structure)
srjd_me.add_curve('option_data', option_data)
srjd_me.add_constant('initial_value', 21.3534)
    # VSTOXX level on 28.12.2012
srjd_me.add_constant('kappa', 5.)
srjd_me.add_constant('theta', 20.)
srjd_me.add_constant('volatility', 3.)
srjd_me.add_constant('lambda', 0.4)
srjd_me.add_constant('mu', 0.4)
srjd_me.add_constant('delta', .1)
srjd_me.add_constant('final_date', dt.datetime(2014, 6, 30))
srjd_me.add_constant('maturity', dt.datetime(2014, 6, 30))
srjd_me.add_constant('model', 'srjd')
srjd_me.add_constant('frequency', 'B')
srjd_me.add_constant('paths', 1000)
srjd_me.add_constant('strike', 21.)
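The srjd model combines a mean-reverting square-root diffusion (Cox Ingersoll Ross 1985) for the VSTOXX level with Merton-style jumps. For the pure square-root part, a NumPy sketch of the common full truncation Euler discretization (illustrative, not necessarily the library's scheme):

    import numpy as np

    def srd_paths(V0, kappa, theta, volatility, T, M, I):
        ''' Full truncation Euler scheme for a square-root diffusion (illustrative). '''
        dt = T / M
        vh = np.zeros((M + 1, I))   # auxiliary process, may go negative
        v = np.zeros_like(vh)       # truncated, non-negative process
        vh[0] = v[0] = V0
        for t in range(1, M + 1):
            z = np.random.standard_normal(I)
            vh[t] = (vh[t - 1]
                     + kappa * (theta - np.maximum(vh[t - 1], 0)) * dt
                     + volatility * np.sqrt(np.maximum(vh[t - 1], 0))
                       * np.sqrt(dt) * z)
            v[t] = np.maximum(vh[t], 0)
        return v

    # e.g. with the initial srjd_me parameter guesses:
    # srd_paths(21.3534, kappa=5., theta=20., volatility=3., T=1.5, M=375, I=1000)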

The instantiation of the calibration class.

In [89]:
cali = srjd_calibration('cali', srjd_me)

Term Structure Calibration

First, we calibrate to the VSTOXX term structure based on the futures data.

In [90]:
# dummy parametrization
kappa, theta, volatility, lamb, mu, delt  = 2.0, 25., 3., 0.4, 0.3, 0.2
parameters = kappa, theta, volatility, lamb, mu, delt
In [91]:
cali.term_calibration(parameters=parameters)
Warning: Maximum number of function evaluations has been exceeded.
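Calibration here means minimizing the (mean) squared differences between model values and market quotes over the model parameters; the warning above is the standard scipy.optimize.fmin message, so a Nelder-Mead search is presumably at work. A sketch of the idea for the futures term structure, using the closed-form expectation of the pure square-root diffusion (the jump component is left out; that term_structure holds maturity dates in column 0 and futures quotes in column 1 is inferred from the plotting code below):

    import numpy as np
    import scipy.optimize as spo

    V0 = 21.3534                                       # VSTOXX level on 28.12.2012
    t = np.array([(d - dt.datetime(2012, 12, 28)).days / 365.
                  for d in term_structure[:, 0]])      # maturities as year fractions
    f_market = term_structure[:, 1].astype(float)      # futures quotes

    def srd_forwards(p):
        # E[V_T] = theta + (V0 - theta) * exp(-kappa * T) for the square-root diffusion
        kappa, theta = p
        return theta + (V0 - theta) * np.exp(-kappa * t)

    def mean_squared_error(p):
        return np.mean((srd_forwards(p) - f_market) ** 2)

    opt = spo.fmin(mean_squared_error, (2.0, 25.), maxfun=750)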

Let us have a look at the results.

In [92]:
plt.figure(figsize=(8, 4))
plt.subplot(211)
plt.plot(cali.forward_rates[:, 0], cali.forward_rates[:, 1], 'ro', label='Model')
plt.plot(term_structure[:, 0], term_structure[:, 1], label='Market')
plt.grid(True); plt.legend(loc=0)
plt.subplot(212)
plt.bar(cali.forward_rates[:, 0],
    term_structure[:, 1] - cali.forward_rates[:, 1], width=3, label='Differences')
plt.grid(True); plt.legend(loc=0)
Out[92]:
<matplotlib.legend.Legend at 0x10d9e7790>