Articles written by Stephen Richards
Last week I presented at the Longevity 18 conference. My topic was robustifying stochastic mortality models when the calibrating data contain outliers, such as those caused by the COVID-19 pandemic. A copy of the presentation can be downloaded here; it is based on a paper to be presented at an IFoA sessional meeting in November 20
All governments like to divert attention from their mistakes. However, this is tricky in an open democracy with a free press if those mistakes lead to tens of thousands of deaths. In contrast, it is straightforward for an authoritarian regime to manipulate the death counts. Nevertheless, it is hard to hide all the indirect consequences of excess deaths. This allows resourceful researchers to uncover what even dictatorships would rather keep hidden. In this blog we look at examples in China and Russia.
In Kleinow & Richards (2016, Table 5) we noted a seeming conundrum: the best-fitting ARIMA model for the time index in a Lee-Carter model also produced much higher value-at-risk (VaR) capital requirements for longevity trend risk. How could this be?
The R programming language has steadily increased in importance for actuaries. One marker of this importance is that knowledge of R is now required to pass the UK actuarial exams. R has many benefits, but one thing native R lacked was an easy way to create apps with a user interface for others to use. Fortunately, this has changed with the release of libraries like Shiny, which we will demonstrate here in the context of an interactive mortality tracker.
The COVID-19 pandemic caused mortality shocks in many countries, and these shocks severely affect the standard forecasting models used by actuaries. I previously showed how to robustify time-series models with a univariate index (Lee-Carter, APC) and those with a multivariate index (Cairns-Blake-Dowd, Tang-Li
In earlier blogs I discussed two techniques for handling outliers in mortality forecasting models:
Pricing block transactions is a high-stakes business. An insurer writing a bulk annuity has one chance to assess the price to charge for taking on pension liabilities. There is a lot to consider, but at least there is data to work with: for the economic assumptions like interest rates and inflation, the insurer has market prices. For the mortality basis, the insurer usually gets several years of mortality-experience data from the pension plan.
One interesting aspect of maximum-likelihood estimation is the common behaviour of estimators, regardless of the nature of the data and model. Recall that the maximum-likelihood estimate, \(\hat\theta\), is the value of a parameter \(\theta\) that maximises the likelihood function, \(L(\theta)\), or the log-likelihood function, \(\ell(\theta)=\log L(\theta)\). By way of example, consider the following three single-parameter distributions:
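As a minimal illustration of maximum-likelihood estimation in practice (the data and the choice of a Poisson distribution are invented for this sketch, not taken from the article), the following Python snippet maximises \(\ell(\theta)\) numerically and checks the result against the closed-form Poisson MLE, which is the sample mean:

```python
import math

# Invented sample of counts; assume each observation is Poisson(theta).
data = [2, 4, 3, 5, 1, 4, 3]

def log_lik(theta):
    """Poisson log-likelihood l(theta) = sum of x*log(theta) - theta - log(x!)."""
    return sum(x * math.log(theta) - theta - math.lgamma(x + 1) for x in data)

def argmax(f, lo, hi, tol=1e-8):
    """Golden-section search for the maximiser of a unimodal function on [lo, hi]."""
    g = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    while b - a > tol:
        c, d = b - g * (b - a), a + g * (b - a)
        if f(c) < f(d):
            a = c
        else:
            b = d
    return (a + b) / 2

theta_hat = argmax(log_lik, 0.1, 10.0)
# For the Poisson, the analytical MLE is the sample mean, so the two should agree.
```

The numerical maximiser recovers the sample mean to within the search tolerance, which is the kind of check worth making whenever a closed-form estimate exists.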
In mortality forecasting work we often deal with downward trends. It is often tempting to jump to the assumption of a linear trend, in part because this makes for easier mathematics. However, real-world phenomena are rarely purely linear, and the late Iain Currie advocated linear adjustment as a means of judging linear-seeming patterns. This involves calculating a line between the first and last points, and deducting the line value at each data point t
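A minimal sketch of the linear-adjustment idea described above (function name and data are invented for illustration): subtract the straight line joining the first and last points, so that any remaining pattern reveals the departure from linearity.

```python
def linear_adjust(t, y):
    """Subtract the straight line through the first and last points of (t, y).

    The endpoints of the result are zero by construction; non-zero interior
    values indicate curvature in what might otherwise look like a linear trend.
    """
    slope = (y[-1] - y[0]) / (t[-1] - t[0])
    return [yi - (y[0] + slope * (ti - t[0])) for ti, yi in zip(t, y)]

# Invented example: a downward trend with a mild kink in 2015.
t = [2000, 2005, 2010, 2015, 2020]
y = [10.0, 8.5, 7.0, 5.2, 4.0]
residuals = linear_adjust(t, y)
```

Plotting the residuals rather than the raw series makes even small departures from linearity much easier to judge by eye.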