Do we need standard tables any more?

Actuaries have long been used to working with standard tables. In the UK these are created by the Continuous Mortality Investigation Bureau (CMIB), and the use of certain tables is often prescribed in legislation. As actuaries increasingly move to using statistical models for mortality, it is perhaps natural that they should first consider incorporating standard tables into these models. But are standard tables necessary, or even useful, in such a context?

Although we normally prefer to model the force of mortality, here we will use a model for the rate of mortality, qx, since this is how many actuaries still approach mortality. Our model is actually a generalised linear model (GLM) where the rate of mortality is:

exp(α + βx) × qx

where qx comes from the standard table and α and β are to be estimated. The table below shows some alternative models for a small annuity portfolio, together with the AIC as a measure of the goodness of model fit: the lower the AIC, the better the model. The standard table in question is PNA00, and the data is for the calendar year 2000, so both table and experience data are contemporary.

AIC    Parameters   Model
3264   81           Standard table unadjusted (α=0, β=0)
3266   82           Standard table with gender-specific multipliers (α=0, β differs between males and females)
3259   83           Standard table with age- and gender-specific multipliers (α and β vary by gender and age)
3097   3            Logistic regression (GLM for qx using Perks Law)

As we can see from the first three rows, the best fit involving a standard table comes from having age- and gender-varying parameters. However, the number of parameters is unwieldy: each qx taken from the standard table counts as a parameter, since it can obviously be varied by changing the standard table. We count only the rates actually used, not the whole table.
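To make the comparison concrete, here is a minimal sketch of how an adjusted-table model of this kind could be fitted by maximum likelihood and scored by AIC. The ages, exposures, deaths and stand-in "standard table" rates below are purely illustrative assumptions, not the annuity portfolio or PNA00 rates used above.

```python
# Minimal sketch (toy data only): fit q_x = q_std(x) * exp(alpha + beta*x)
# by maximum likelihood and compute the AIC, counting the standard-table
# rates actually used as parameters, as in the table above.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(1)
ages = np.arange(60, 100)                          # ages actually used
q_std = 0.001 * np.exp(0.1 * (ages - 60))          # stand-in for standard-table rates
lives = np.full(ages.shape, 500)                   # toy exposures
deaths = rng.binomial(lives, np.clip(1.1 * q_std, 0, 1))  # toy observed deaths

def binom_loglik(q):
    """Binomial log-likelihood of the observed deaths given fitted rates q."""
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return np.sum(gammaln(lives + 1) - gammaln(deaths + 1) - gammaln(lives - deaths + 1)
                  + deaths * np.log(q) + (lives - deaths) * np.log(1 - q))

# Unadjusted standard table (alpha = 0, beta = 0): no free parameters beyond the rates used.
aic_unadjusted = 2 * len(ages) - 2 * binom_loglik(q_std)

# Adjusted standard table: estimate alpha and beta by maximising the log-likelihood.
negll = lambda p: -binom_loglik(q_std * np.exp(p[0] + p[1] * ages))
res = minimize(negll, x0=[0.0, 0.0], method="Nelder-Mead")
aic_adjusted = 2 * (len(ages) + 2) - 2 * (-res.fun)

print("unadjusted AIC:", round(aic_unadjusted, 1))
print("alpha, beta:", res.x, " adjusted AIC:", round(aic_adjusted, 1))
```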

As an alternative, we can fit a very simple GLM without reference to a standard table at all. Here we follow Richards and Jones (2004) by using logistic regression, i.e. where qx is assumed to follow the functional form:

qx = exp(α+βx) / (1 + exp(α+βx))

The results of this model are shown in the fourth row of the table. It fits much better, and is far simpler, having only three parameters. Most insurers nowadays have more than enough information to create their own mortality tables, which fit better and require fewer assumptions than standard tables.
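For comparison, here is a sketch of the logistic-regression alternative, fitted as a binomial GLM with a logit link on the same toy data as the previous snippet (statsmodels is assumed to be available; the three-parameter model in the table presumably also includes a gender term, whereas this single-gender sketch has just α and β).

```python
# Minimal sketch (toy data only): logistic regression for q_x with no standard
# table, i.e. q_x = exp(alpha + beta*x) / (1 + exp(alpha + beta*x)).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
ages = np.arange(60, 100)
lives = np.full(ages.shape, 500)
deaths = rng.binomial(lives, 0.0011 * np.exp(0.1 * (ages - 60)))

# Binomial GLM with the default logit link; endog is (deaths, survivors),
# exog is an intercept (alpha) plus age (beta).
endog = np.column_stack([deaths, lives - deaths])
exog = sm.add_constant(ages.astype(float))
fit = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()

print("alpha, beta:", fit.params)  # a gender term would add the third parameter
print("AIC:", fit.aic)             # comparable with the previous snippet's AICs (same toy data)
```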

 

Comments

James Rouse
(Sep 15, 2008)
Why PNA00? Wouldn't normal practice be to find a table with the best curve shape and then apply multipliers?
Stephen Richards
(Sep 16, 2008)

I picked PNA00 because it was a contemporary table, but I could just as easily have picked PCA00 or some other modified table, such as PMA92mc2000 (say).  And, yes, you could try a number of different tables to find the least bad fit.

However, normal practice for fitting such statistical models in life offices does not involve using standard tables at all.  I write 'normal' because I know of no life offices that build statistical models this way.  Even the rather small portfolio used as an illustration here had enough data to build a better model without the standard table.
