In the second article of this technical series, we put the pieces together to begin estimating the value of a mortality swap, an esoteric insurance derivative.

The previous article in this series discussed the basic concepts of a mortality swap, and laid some basic groundwork and ideas for our approach to valuing such a contract.

To recap, the seller of the swap guarantees payment of an income stream to the buyer of the swap, and the buyer passes through all the payments received from the annuities. Thus the buyer of the swap has a fixed and certain payment, and the seller takes on all the mortality risk associated with the life-contingent annuities. To place a value on this trade, we will use Monte Carlo methods to simulate all the contingent payments, and then compare the two sets of cashflows to determine the risk involved.

# Basic Assumptions

Before we get into the weeds of simulation, life tables and all the other associated detail, I want to state some basic assumptions to give us a solid sense of our starting point. Some of these assumptions can be relaxed later, and a few are motivated mainly to make the initial implementation simpler - once comfortable with the approach, we have lots of scope to add finer detail.

My basic starting assumptions:

1. No consideration of credit risk
2. The swap will be in force for a fixed lifetime, agreed upon at the start
3. All annuities are paid on an annual basis and at the same time
4. All annuities have a lifetime at least as long as the lifetime of the swap
5. The cost of capital is fixed over the lifetime of the swap
6. The swap only guarantees the Actuarial Present Value (APV) of the annuity payments
7. All annuitants have undergone an independent underwriting evaluation and have been assigned a Mortality Underwriting Rating (MUR) as a result
8. A specific life table is agreed upon to calculate the APV of the payments in the swap
9. The calculated premium does not account for any costs associated with setting up the swap, such as underwriting assessment etc.

The final assumption is important as different life tables will give different values for the APV payments being guaranteed, so the contract will require a specific life table.
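A toy calculation makes this concrete. The two mortality vectors below are invented purely for illustration, but they show how the APV of the same 20-year payment stream shifts when the assumed life table changes:

```r
## Illustration only: two made-up qx vectors give different APVs for the
## same 20-year annuity, which is why the contract must fix the table.
apv.annuity <- function(qx, payment, r) {
    surv <- cumprod(1 - qx);                # P(alive at each payment date)
    sum(payment * surv * exp(-r * seq_along(qx)));
}

qx.a <- rep(0.004, 20);                     # lighter mortality
qx.b <- rep(0.006, 20);                     # heavier mortality

apv.a <- apv.annuity(qx.a, 100000, 0.04);
apv.b <- apv.annuity(qx.b, 100000, 0.04);
## apv.a > apv.b: lighter mortality means more expected payments
```

The gap between the two APVs is exactly the kind of disagreement the contract language needs to rule out.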

Relaxing most of the above assumptions is reasonably straightforward in our approach, and is definitely an avenue for future work in this area. For example, switching payments to a monthly or quarterly basis should be simple, as would be allowing different annuities to have variable and one-off payments.

Such details make implementing a pricing approach harder without changing it fundamentally, so for now we assume them away: payments follow a regular and equal schedule, aggregated into a single payment at one point in the year.

# The Portfolio of Annuities

Another hugely important input to this process is the portfolio of annuities itself. Rather than use a real-life portfolio and open up any potential issues with disclosure, I will instead generate some random data and use that.

In the portfolios I have seen, selection and underwriting have already been completed, so we can create an equivalent dataset ourselves using some simple distributions and sanity-check it later. We will generate a portfolio of 200 annuities for annuitants aged between 30 and 50, with yearly payments between 10,000 and 100,000 USD per annum.

Also, we are not making any gender-based assumptions about mortality in this approach, so we do not need to consider the gender of the annuitant, though this would definitely be a way to extend the model if desired.
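One way to generate such a portfolio is sketched below. The column names mirror the sample that follows, but the uniform distributions (and the exact MUR grid) are my own assumptions; the article itself works with a data.table called mortport.dt:

```r
## Hedged sketch: generate a random portfolio matching the stated ranges.
## The uniform draws are an assumption, not calibrated to any real book.
set.seed(42);

n.annuity <- 200;
mortport.df <- data.frame(
    customerid = sprintf("C%08d", sample.int(99999999L, n.annuity)),
    age        = sample(30:50, n.annuity, replace = TRUE),
    MUR        = sample(seq(100, 500, by = 25), n.annuity, replace = TRUE),
    amount     = sample(seq(10000, 100000, by = 10000), n.annuity,
                        replace = TRUE),
    stringsAsFactors = FALSE
);
```

Sampling the ids without replacement keeps the customerid labels unique, which matters once we aggregate cashflows per annuitant.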

Here is a sample of the mortality swap portfolio we will be using:

##     customerid  age  MUR    amount
##   1: C97326460   39  350  $100,000
##   2: C99971880   30  175   $80,000
##   3: C65134119   34  225   $50,000
##   4: C35313144   33  200   $90,000
##   5: C17550793   45  200   $30,000
##  ---
## 196: C37440555   40  200   $50,000
## 197: C60347677   42  200   $60,000
## 198: C93383952   45  200   $70,000
## 199: C08447895   35  150   $60,000
## 200: C64003360   41  325   $90,000

## NOTE: customerid is just a label, the "ids" are also randomly generated.


It is important to get a sense of the total dollar amounts we are dealing with, so we first do some simple exploratory work:

### We use Hadley Wickham's very useful 'devtools' package
### as it makes writing packages much easier, and we can
### just use local installs of the package.

require(devtools);
dev_mode(TRUE);
require(mcmortswap);

require(data.table);
require(ggplot2);
require(scales);   ### for the dollar() axis formatter used below

d> mortport.dt[, sum(amount)]
[1] 11560000

d> sum(rep(mortport.dt[, sum(amount)], 20) * exp(-0.03 * 1:20))
[1] 171263087


So, ignoring mortality, this amounts to just over 11.5MM USD in payments per annum for 20 years. The total present value of the portfolio (assuming the full amounts are paid each year), with a cost of capital of 3% per annum, is just over 171 MM USD.
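As a quick cross-check of the 171 MM figure, the discounted sum of a level payment is a geometric series with a closed form, so we can verify the brute-force sum independently:

```r
## Closed-form check: with level total payment C and continuous rate r,
## PV = C * v * (1 - v^20) / (1 - v) where v = exp(-r). This should
## agree with the brute-force sum used in the console output above.
C <- 11560000;
r <- 0.03;
v <- exp(-r);

pv.closed <- C * v * (1 - v^20) / (1 - v);
pv.sum    <- sum(C * exp(-r * 1:20));
## the two agree to floating-point precision
```

Checks like this are cheap insurance against off-by-one errors in the discounting index.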

Naturally, once we calculate the APV of the portfolio, we expect this will be lower. It is also worth quickly considering the cost of capital being used. At the time of writing, long-term US Treasury yields are around 3%, so it is probably more realistic to assume a higher cost of capital; 4% seems reasonable, so we will try that.

d> sum(rep(mortport.dt[, sum(amount)], 20) * exp(-0.04 * 1:20))
[1] 155982269


So the cost of capital has a significant effect on the total value. While not directly relevant to what we are doing, a quick check of this relationship might be useful:

iter.rates <- seq(0, 0.2, by = 0.01);
pv <- sapply(iter.rates, function(r)
    sum(rep(mortport.dt[, sum(amount)], 20) * exp(-r * 1:20)));

qplot(iter.rates, pv, geom = 'line',
      xlab = 'Cost of Capital', ylab = 'PV of Portfolio',
      main = 'Plot of Portfolio PV vs Cost of Capital') +
    scale_y_continuous(labels = dollar) +
    expand_limits(y = 0);


We should also get a sense of the present value of the APV of the annuities, discounting by both the interest rate and mortality. For now, I will skip the full explanation of what the code is doing, since the implementation is quite detailed. I have created an R package, mcmortswap, which does all the heavy lifting, and I will explain it in more detail in a future article. This is the package I mentioned in my talk at R in Insurance 2015; the slides are available on SlideShare.

nonmort.lst <- calculate.mortality.swap.price(mortport.dt,
    lifetable.dt, sim.lifetable = lifetable.dt, years.in.force = 20,
    interest.rate = 0.04, hedge.apv.cashflows = TRUE, n.sim = 1,
    return.all.data = TRUE);

guaranteed.annuity <- sum(sapply(nonmort.lst$valuations,
                                 function(x) sum(x$fixed.discounted)));

print(guaranteed.annuity)
[1] 153514802


According to these calculations, the present value of the APV payments is only a little bit smaller than the present value of the full annuity portfolio, which is interesting. This may prove useful later as a diagnostic and sanity-check of our valuation output.

# Estimating the Effect of MUR on Curtate Probabilities

So how exactly do we use the MUR of each annuitant to determine the values of $q(x)$ to be used?

My initial thought was to just use the MUR as a multiplier for the curtate probabilities¹, but after a little thought I realised this was not quite correct: in most cases, the underwriting value is a multiplier for the premium paid by the person, and this is not necessarily the same thing as what we want.

So for a given MUR, what we want is the multiplier of the curtate probabilities that gives us the same premium. That way, we can draw an equivalence between MUR and the curtate multiplier.
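To make the equivalence concrete before turning to the package code, here is a self-contained sketch using a made-up flat mortality. The function price.term.sketch is my stand-in for a single-premium term assurance pricer; the article's price.term.assurance from mcmortswap may have a different signature:

```r
## Hedged stand-in for a term assurance pricer: APV of a death benefit A
## paid at the end of the year of death, over the term covered by qx.
price.term.sketch <- function(qx, A, r) {
    n    <- length(qx);
    surv <- cumprod(c(1, 1 - qx[-n]));     # P(alive at start of year t)
    sum(A * surv * qx * exp(-r * (1:n)));
}

qx  <- rep(0.002, 20);                     # illustrative flat mortality
MUR <- 3;                                  # i.e. an underwriting rating of 300

## find the curtate multiplier whose premium matches MUR times the base one
mult <- optimize(function(m) {
    abs(MUR * price.term.sketch(qx, 100000, 0.04) -
            price.term.sketch(m * qx, 100000, 0.04));
}, c(0, 20))$minimum;
## mult comes out slightly above MUR, because scaling qx also reduces the
## survival probabilities that weight the later death benefits
```

The final comment is the key intuition: multiplying qx by m does not multiply the premium by m exactly, which is why the optimisation is needed.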

We can do this in a few lines in R, though it does need a little explanation:

### Start with a 20-year policy for a 30-year old
A  <- 100000;
qx <- lifetable.dt[age >= 30]$qx[1:20];
r  <- 0.04;

MUR.values <- seq(0, 10, by = 0.25);

### This function calculates the difference in premium for a given
### MUR and curtate multiplier
calculate.multiple.diff <- function(MUR, mult) {
    MUR * price.term.assurance(qx, A = A, r = r, P = 0) -
        price.term.assurance(mult * qx, A = A, r = r, P = 0);
}

### For each value of MUR from 0 to 10 (0 to 1000) we find the value of mult
### that minimises this difference.
MUR.mult.30 <- sapply(MUR.values, function(MUR) {
    optimize(function(mult) abs(calculate.multiple.diff(MUR, mult)),
             c(0, 20))$minimum;
});

MUR.mult.30.plot <- qplot(MUR.values, MUR.mult.30, geom = 'line',
                          xlab = "MUR", ylab = "Multiplier", ylim = c(0, 12)) +
    ggtitle("30-yr Old MUR vs Multiplier Plot (red line is y = x)") +
    geom_line(aes(x = MUR.values, y = MUR.values), colour = 'red');

ggsave(MUR.mult.30.plot, file = 'MUR_mult_30yr.png', height = 10, width = 14);


The plot shows us that the curtate multiplier is always slightly larger than the MUR value (the curve always sits slightly above the $y = x$ line), but the two values are close, especially for MUR values from 0 to 500 - roughly our range of MUR values - so for the moment we will just assume they are the same.

Just to be safe, we do an equivalent calculation for a 45-year old:

### Now check for a 45-year old
A  <- 100000;
qx <- lifetable.dt[age >= 45]$qx[1:20];
r  <- 0.04;

MUR.values <- seq(0, 10, by = 0.25);

### This function calculates the difference in premium for a given
### MUR and curtate multiplier
calculate.multiple.diff <- function(MUR, mult) {
    MUR * price.term.assurance(qx, A = A, r = r, P = 0) -
        price.term.assurance(mult * qx, A = A, r = r, P = 0);
}

### For each value of MUR from 0 to 10 (0 to 1000) we find the value
### of mult that minimises this difference.
MUR.mult.45 <- sapply(MUR.values, function(MUR) {
    optimize(function(mult) abs(calculate.multiple.diff(MUR, mult)),
             c(0, 20))$minimum;
});

MUR.mult.45.plot <- qplot(MUR.values, MUR.mult.45, geom = 'line',
                          xlab = "MUR", ylab = "Multiplier", ylim = c(0, 15)) +
    ggtitle("45-yr Old MUR vs Multiplier Plot (red line is y = x)") +
    geom_line(aes(x = MUR.values, y = MUR.values), colour = 'red');

ggsave(MUR.mult.45.plot, file = 'MUR_mult_45yr.png', height = 10, width = 14);


If required, a more precise multiplier can be used by calculating this equivalent for every annuity in the portfolio. We use a policy of the same length as the time-in-force for the swap, and get an individual curtate multiplier for each person.

One alternative is to have the underwriting process also calculate this multiplier as part of the initial work, though I am unfamiliar with the feasibility of this. Another is to assign a different expected life table to each annuitant, so that each has his or her own curtate mortalities; such an approach may prove expensive to set up, however.

Either way, this approach should give us a reasonable approximation for our purposes.

# Implementing the Monte Carlo Simulations

Almost there!

Everything is in place now to calculate the individual curtate probabilities used for each annuity, simulate the cashflows and perform all the calculations we need.

For each year, we will simulate whether any of the contingent lives die, and then simulate the cashflow according to those annuities that are still paying. We then move on to the next year and repeat the process.²

We will then compare these life-contingent cashflows to the guaranteed cashflows, with the difference between the two being a simulated estimate of the distribution of the cost of the guarantee. We will then be able to use this data to calculate a cost for the guarantee.
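The loop described above can be sketched as follows, using a made-up flat mortality and level payments; as footnote 2 notes, the real implementation organises this calculation differently for efficiency:

```r
## Minimal sketch of the year-by-year simulation: draw deaths, pay the
## survivors, and discount the shortfall against the guaranteed leg.
## All inputs here are illustrative assumptions, not portfolio data.
set.seed(1);

n.lives <- 200;
amounts <- rep(50000, n.lives);            # illustrative level payments
qx      <- rep(0.005, n.lives);            # each life's annual death prob
years   <- 20;
r       <- 0.04;

alive         <- rep(TRUE, n.lives);
contingent.cf <- numeric(years);
for (t in seq_len(years)) {
    deaths <- runif(n.lives) < qx;         # simulate deaths this year
    alive  <- alive & !deaths;
    contingent.cf[t] <- sum(amounts[alive]);  # only survivors are paid
}

fixed.cf  <- rep(sum(amounts), years);     # the guaranteed leg
shortfall <- sum((fixed.cf - contingent.cf) * exp(-r * seq_len(years)));
## one simulated draw of the guarantee cost; repeating this many times
## gives the distribution we need for pricing
```

Each pass through the loop is one Monte Carlo iteration; the distribution of shortfall across many iterations is what we will price from.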

# Summary

In this article we discussed the issue of using an annuity portfolio in the absence of hard data, and showed that it is reasonably simple to create a useable portfolio for the purposes of pricing a mortality swap.

We then dealt with the issue of determining the curtate probabilities of death for each annuitant and outlined how all this will fit together.

The third part of this series will show some code that does all this, introducing the R package mcmortswap and will discuss some methods for determining a price to charge for assuming the mortality risk in this portfolio.

1. More accurately, MUR / 100, since MUR is expressed as an integer: a value of 500 means 5 times the premium. MUR is often quoted as a loading that is added to the premium instead, so a loading of 100 would mean a doubling of the premium. Different standards seem to exist, so it is important to clarify exactly what is meant.

2. Note that the actual code calculation will be done in a slightly different way for coding efficiency, but the aggregation of the simulation data gives us the same effect. We will discuss this more in Part 3.