### A Model for Reporting Delays

#### (Jan 28, 2021)

In his recent blog Stephen described some empirical evidence in support of his practice of discarding the most recent six months' data, to reduce the effect of delays in reporting deaths.  This blog demonstrates that the practice can also be justified theoretically in the survival modelling framework, although the choice of six months as the cut-off remains an empirical matter.

In our recent book, *Modelling Mortality with Actuarial Applications*, we introduce multiple-state models, and that is the framework we need here. Consider the model in Figure 1 for the death of a person observed from age $$x$$, denoted by ($$x$$), with a delay in the death being reported.

Figure 1. A model of the mortality of ($$x$$) with…

Tags: survival models, censoring

### Less is More: when weakness is a strength

#### (Jun 1, 2018)

A mathematical model that obtains extensive and useful results from the fewest and weakest assumptions possible is a compelling example of the art.  A survival model is a case in point.  The only material assumption we make is the existence of a hazard rate, $$\mu_{x+t}$$, a function of age $$x+t$$ such that the probability of death in a short time $$dt$$ after age $$x+t$$, denoted by $${}_{dt}q_{x+t}$$, is:

$${}_{dt}q_{x+t} = \mu_{x+t}\,dt + o(dt)\qquad (1)$$

(see Stephen's earlier blog on this topic).  It would be hard to think of a weaker mathematical description of mortality as an age-related process.  But from it much follows:

• If we observe a life age $$x_i$$ for a time $$t_i$$, and define $$d_i = 1$$ if the…
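Equation (1) can be checked numerically in the constant-hazard case, where $${}_{dt}q_{x+t} = 1 - \exp(-\mu\,dt)$$ exactly: the remainder after subtracting $$\mu\,dt$$ shrinks faster than $$dt$$. A minimal Python sketch (the value $$\mu = 0.05$$ is purely illustrative):

```python
import math

mu = 0.05  # illustrative constant hazard

for dt in (0.1, 0.01, 0.001):
    exact = 1 - math.exp(-mu * dt)   # dt_q under a constant hazard
    approx = mu * dt                 # leading term from equation (1)
    # the o(dt) remainder: (approx - exact)/dt should shrink with dt
    print(dt, exact, approx, (approx - exact) / dt)
```

The scaled remainder falls by roughly a factor of ten each time $$dt$$ does, which is exactly what the $$o(dt)$$ notation promises.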

### Introducing the Product Integral

#### (Feb 26, 2018)

Of all the actuary's standard formulae derived from the life table, none is more important in survival modelling than:

$${}_tp_x = \exp\left(-\int_0^t\mu_{x+s}\,ds\right).\qquad(1)$$

Stephen covered the derivation of this in a previous blog, but I want to look more closely at the right-hand side of equation (1).  In particular, we can find an entirely different representation of $${}_tp_x$$ as a product integral, which leads to many insights in survival models.

Recall how the integral in equation (1) is constructed.  Choose a partition of the interval $$[0,t]$$, that is some sequence $$\Delta_1,\Delta_2,\ldots,\Delta_n$$ of non-overlapping sub-intervals that exactly cover the interval.  Define…
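To preview where this leads: over a fine partition, the product of one-step survival probabilities $$\prod_j (1 - \mu\Delta_j)$$ converges to the exponential in equation (1). A hedged Python sketch under an illustrative Gompertz hazard (the parameter values are invented, not from any fitted model):

```python
import math

# illustrative Gompertz hazard: mu(x+s) = exp(alpha + beta*(x+s))
alpha, beta, x, t = -12.0, 0.11, 70.0, 10.0
mu = lambda s: math.exp(alpha + beta * (x + s))

n = 10_000                 # number of sub-intervals in the partition
dt = t / n

# equation (1): exponential of (minus) the Riemann sum of the hazard
integral = sum(mu((j + 0.5) * dt) for j in range(n)) * dt
tpx_exp = math.exp(-integral)

# product-integral form: survive each short sub-interval in turn
tpx_prod = 1.0
for j in range(n):
    tpx_prod *= 1 - mu((j + 0.5) * dt) * dt

print(tpx_exp, tpx_prod)   # the two representations agree closely
```

Refining the partition makes the two numbers agree to any desired accuracy, which is the essence of the product-integral representation.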

### Further reducing uncertainty

#### (Jun 6, 2016)

In a previous posting I looked at how using a well-founded statistical model can improve the accuracy of estimated mortality rates.  We saw how the relative uncertainty for the estimate of $$\log \mu_{75.5}$$ could be reduced from 20.5% to 3.9% by using a simple two-parameter Gompertz model:

$$\log \mu_x = \alpha + \beta x\qquad (1)$$

to "borrow" information at adjacent ages.  In the previous example we used just one year's data; an obvious improvement is to use the experience over multiple years to increase the volume of data.  Survival models for the force of mortality, $$\mu_x$$, extend easily to multi-year data, although we still occasionally see invalid applications of GLMs for $$q_x$$.
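As a toy illustration of the "borrowing" idea, the Gompertz model (1) can be fitted to crude log rates pooled over several years by ordinary least squares. The deaths and exposures below are made up for illustration, and a production fit would use a proper Poisson or survival-model likelihood rather than OLS:

```python
import math

# hypothetical deaths and central exposures by age, pooled over several years
ages     = [70, 71, 72, 73, 74, 75, 76, 77, 78, 79]
deaths   = [18, 22, 25, 31, 36, 41, 49, 55, 63, 72]
exposure = [1500, 1450, 1400, 1320, 1250, 1180, 1100, 1020, 940, 860]

# crude log hazards: log(D/E) at each age
y = [math.log(d / e) for d, e in zip(deaths, exposure)]

# ordinary least squares for log mu_x = alpha + beta * x
n = len(ages)
xbar = sum(ages) / n
ybar = sum(y) / n
beta = sum((a - xbar) * (v - ybar) for a, v in zip(ages, y)) / \
       sum((a - xbar) ** 2 for a in ages)
alpha = ybar - beta * xbar
print(alpha, beta)
```

Every age contributes to the estimates of $$\alpha$$ and $$\beta$$, so the fitted $$\log \mu_x$$ at any single age is far less noisy than the crude rate at that age alone.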

### Mind the gap!

#### (Nov 20, 2013)

Recognising and quantifying mortality differentials is what experience analysis is all about. Whether you calculate traditional A/E ratios, graduate raw rates by formula (Forfar et al. 1988), or fit a statistical model (Richards 2012), the aim is always to find risk factors influencing the level of mortality.

Many such differentials are well-known and anticipated: females v. males, smokers v. non-smokers, healthy v. ill, rich v. poor. Each of these pairings has differentials large enough to merit their own mortality tables. And even where courts and politicians have regulated against straightforward pricing of observable differentials, such risk factors should still be acknowledged in reserving…

### Reducing uncertainty

#### (Nov 2, 2013)

The motto of the old UK Institute of Actuaries was *certum ex incertis*, i.e. certainty from uncertainty. I never particularly liked this motto - it implied that certainty can be obtained from uncertainty, whereas uncertainty is all too often overlooked. Fortunately, the merged Institute and Faculty of Actuaries picked a more sensible motto - *e peritia ratio*, i.e. reason from experience.

However, it is possible for uncertainty to be reduced, and one of the ways of doing this is with a properly constructed statistical model.  For example, consider the mortality experience in a single year for a pension scheme where 32 deaths are observed in the age interval [75, 76) with 1,092.85 life-years of exposure. The…
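For the raw data, the usual Poisson-style approximations give a crude hazard of $$D/E$$ with a standard error of $$\log \hat{\mu}$$ of roughly $$1/\sqrt{D}$$. A minimal sketch using the figures quoted above (this reproduces only the crude estimate, not the blog's full calculation):

```python
import math

deaths, exposure = 32, 1092.85        # figures quoted above

mu_hat = deaths / exposure            # crude hazard estimate D/E
se_log_mu = 1 / math.sqrt(deaths)     # approximate s.e. of log(mu_hat)

print(f"mu_hat = {mu_hat:.5f}, s.e.(log mu_hat) = {se_log_mu:.4f}")
```

With only 32 deaths the standard error of the log hazard is sizeable, which is precisely the uncertainty a well-constructed model can reduce by sharing information across ages.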

### Enhancement

#### (Jun 1, 2013)

An oft-overlooked aspect of statistical models is that parameter estimates are dependent on each other.  Ignoring such dependencies can have important consequences, and in extreme cases can even undermine assumptions for a forecasting model.  However, in the case of a regression model the correlations between regressor variables can sometimes have unexpectedly positive results.  To illustrate this, consider a sequence of fits of a survival model for a Makeham-Perks mortality law (Richards, 2008) defined as follows:

$$\mu_x = \frac{\exp(\epsilon) + \exp(\alpha + \beta x)}{1 + \exp(\alpha + \beta x)}$$

where the parameter α is allowed to vary by gender, health status at retirement, or both.  The results…
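Evaluating the Makeham-Perks hazard directly is straightforward; the parameter values below are purely illustrative and not those from the blog's fits:

```python
import math

def makeham_perks(x, alpha, beta, eps):
    """Makeham-Perks hazard: (exp(eps) + exp(alpha + beta*x)) / (1 + exp(alpha + beta*x))."""
    g = math.exp(alpha + beta * x)
    return (math.exp(eps) + g) / (1 + g)

# illustrative parameters: the exp(eps) term dominates at younger ages,
# while the logistic form makes the hazard level off towards 1 at extreme ages
print(makeham_perks(70, alpha=-12.0, beta=0.12, eps=-6.0))
```

The logistic denominator is what distinguishes this from a plain Makeham law: instead of growing without bound, the hazard flattens at the oldest ages, consistent with late-life mortality deceleration.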

### The ins and outs of bulk annuities

#### (Apr 14, 2013)

The UK has a well-developed and highly competitive market in bulk annuities. These typically arise when a defined-benefit pension scheme wants to insure its liabilities. The most obvious scenario is when a pension scheme is being wound up and benefits have to be secured with an insurance company.  Since the pension scheme is ceasing to exist, individual policies are purchased for each member. The now-former pension-scheme member owns his or her own annuity policy, which cannot be surrendered or transferred once annuity payments start.  All risks such as longevity and investment are transferred to the insurer in a transaction known as a buy-out.

However, there is another option called a buy-in - the scheme…

### Groups v. individuals

#### (Sep 28, 2012)

We have previously shown how survival models based around the force of mortality, $$\mu_x$$, have the ability to use more of your data.  We have also seen that attempting to use fractional years of exposure in a $$q_x$$ model can lead to potential mistakes. However, the Poisson distribution also uses $$\mu_x$$, so why don't we use a Poisson model for the grouped count of deaths in each cell?  After all, a model using grouped counts sounds like it might fit faster.  In this article we will show why survival models constructed at the level of the individual are still preferable.

The first step when using the Poisson model is to decide on the width of the age interval.  This is necessary because the Poisson model for grouped counts…
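To make the grouped-count setup concrete: within one age cell the death count $$D$$ is treated as Poisson with mean $$E\mu$$, and the cell's maximum-likelihood estimate is simply $$D/E$$. A minimal sketch for a single hypothetical cell:

```python
import math

def poisson_loglik(mu, D, E):
    """Poisson log-likelihood for D deaths with exposure E, up to a constant."""
    return D * math.log(E * mu) - E * mu

D, E = 32, 1092.85      # one hypothetical age cell
mle = D / E             # closed-form MLE for the cell's hazard

# the log-likelihood is maximised at D/E
print(mle, poisson_loglik(mle, D, E))
```

The convenience of this closed form is bought by assuming the hazard is constant across the whole cell, and the choice of cell width is exactly where the trouble described above begins.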

### An early bath for the bathtub model

#### (Jun 6, 2012)

My last posting looked at why actuaries fit survival models differently from statisticians, even though the conceptual framework for survival models is common to both disciplines.  In this posting we look at why actuaries and demographers use different models from engineers.

Survival analysis plays an important role in reliability engineering, and survival models are used to model the time to failure of a component or device. Mathematically this is the same as modelling the time to death of an individual, although engineers use different terminology from actuaries and demographers. However, although the mathematics is identical, the shape of the hazard function is typically very different: mechanical…
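The "bathtub" shape familiar from reliability engineering can be sketched as the sum of a decaying early-failure term, a constant term, and a rising wear-out term. All the constants below are invented for illustration only:

```python
import math

def bathtub_hazard(t, a=0.5, rate=5.0, b=0.001, c=1e-6, k=3.0):
    """Hypothetical bathtub hazard: early failures + constant + wear-out."""
    return a * math.exp(-rate * t) + b + c * t ** k

# high at t=0 (early failures), low in mid-life, rising again with wear-out
for t in (0, 10, 100):
    print(t, bathtub_hazard(t))
```

The long, flat middle section is the signature of the engineering setting; human adult mortality, by contrast, rises roughly exponentially with age, which is why actuaries reach for Gompertz-type laws instead.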