Further reducing uncertainty

(Jun 6, 2016)

In a previous post I looked at how using a well-founded statistical model can improve the accuracy of estimated mortality rates.  We saw how the relative uncertainty for the estimate of \(\log \mu_{75.5}\) could be reduced from 20.5% to 3.9% by using a simple two-parameter Gompertz model:

\(\log \mu_x = \alpha + \beta x\qquad (1)\)

to "borrow" information at adjacent ages.  In the previous example we used just one year's data, whereas an obvious improvement would be to use the experience over multiple years to increase the volume of data.  Survival models for the force of mortality, \(\mu_x\), extend easily to multi-year data, although we still occasionally see invalid applications of GLMs for \(q_x\).
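As a sketch of how model (1) borrows information across ages, here is a minimal Poisson maximum-likelihood fit of the Gompertz model to deaths and central exposures by single year of age. All numbers below are synthetic and purely illustrative; the original posts' data are not reproduced, and the standard-error calculation is the usual Fisher-information approximation, not necessarily the exact method used in the posts:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical single year of experience: deaths and central exposures by
# single year of age (synthetic data for illustration only).
ages = np.arange(70.5, 80.5)                      # mid-points of [70,71), ..., [79,80)
rng = np.random.default_rng(42)
exposure = rng.uniform(900.0, 1200.0, ages.size)  # life-years in each age interval
true_alpha, true_beta = -10.0, 0.1
deaths = rng.poisson(np.exp(true_alpha + true_beta * ages) * exposure)

# Negative Poisson log-likelihood for the Gompertz model log mu_x = alpha + beta*x,
# dropping terms that do not depend on the parameters.
def neg_log_lik(params):
    alpha, beta = params
    mu = np.exp(alpha + beta * ages)
    return -(deaths * np.log(mu * exposure) - mu * exposure).sum()

fit = minimize(neg_log_lik, x0=np.array([-10.0, 0.1]), method="Nelder-Mead")
alpha_hat, beta_hat = fit.x

# Approximate standard error of log mu_75.5 from the Fisher information,
# showing how pooling ages shrinks the uncertainty at any one age.
X = np.column_stack([np.ones_like(ages), ages])
W = exposure * np.exp(X @ fit.x)                  # expected deaths under the fit
cov = np.linalg.inv(X.T @ (X * W[:, None]))
x75 = np.array([1.0, 75.5])
se_log_mu_75 = float(np.sqrt(x75 @ cov @ x75))
```

The resulting standard error of \(\log \hat\mu_{75.5}\) behaves roughly like one over the square root of the *total* deaths across all ages, rather than the deaths at age 75.5 alone, which is the mechanism behind the kind of reduction in uncertainty described above.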


Tags: estimation error, mis-estimation risk, survival models

Reducing uncertainty

(Nov 2, 2013)

The motto of the old UK Institute of Actuaries was certum ex incertis, i.e. certainty from uncertainty. I never particularly liked this motto - it implied that certainty can be obtained from uncertainty, whereas uncertainty is all too often overlooked. Fortunately, the merged Institute and Faculty of Actuaries picked a more sensible motto - e peritia ratio, i.e. reason from experience.

However, it is possible for uncertainty to be reduced, and one of the ways of doing this is with a properly constructed statistical model.  For example, consider the mortality experience in a single year for a pension scheme where 32 deaths are observed in the age interval [75, 76) with 1,092.85 life-years of exposure. The…
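The crude single-age estimate and its approximate uncertainty can be computed directly from the figures quoted. This is a minimal sketch assuming a Poisson model for the death count, under which the standard error of \(\log \hat\mu\) is roughly \(1/\sqrt{D}\); the relative-uncertainty figure quoted in these posts may be defined slightly differently:

```python
import math

deaths = 32          # deaths observed in the age interval [75, 76)
exposure = 1092.85   # central exposure in life-years

mu_hat = deaths / exposure           # crude estimate of the force of mortality
se_log_mu = 1.0 / math.sqrt(deaths)  # approx. s.e. of log(mu_hat): 1/sqrt(D)

print(f"mu_hat = {mu_hat:.5f}")                  # about 0.02928
print(f"s.e. of log(mu_hat) = {se_log_mu:.4f}")  # about 0.1768
```

With only 32 deaths the standard error on the log scale is around 18%, which is why pooling information across adjacent ages with a parametric model pays off so handsomely.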


Tags: estimation error, survival models
