# Information Matrix

### Functions of a random variable

Assume we have a random variable, $$X$$, with expected value $$\eta$$ and variance $$\sigma^2$$. Often we find ourselves wanting to know the expected value and variance of a function of that random variable, $$f(X)$$. Fortunately there are some workable approximations involving only $$\eta$$, $$\sigma^2$$ and the derivatives of $$f$$. In both cases we make use of a Taylor-series expansion of $$f(X)$$ around $$\eta$$:

$$f(X)=\sum_{n=0}^\infty \frac{f^{(n)}(\eta)}{n!}(X-\eta)^n$$
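Truncating this expansion after the quadratic term gives the usual second-order approximation $$\mathrm{E}[f(X)]\approx f(\eta)+\tfrac{1}{2}f''(\eta)\sigma^2$$ and the first-order approximation $$\mathrm{Var}[f(X)]\approx f'(\eta)^2\sigma^2$$. Below is a minimal sketch in Python, using the hypothetical choice $$f(x)=e^x$$ and an assumed normal distribution for $$X$$ so the approximation can be checked by simulation; none of the numerical values come from the post itself.

```python
import math
import random

# Hypothetical parameters for X: mean eta and standard deviation sigma.
eta, sigma = 0.5, 0.1

# For f(x) = exp(x), all derivatives are also exp(x).
f = fp = fpp = math.exp

# Second-order approximation to E[f(X)] from the truncated Taylor series.
mean_approx = f(eta) + 0.5 * fpp(eta) * sigma**2

# First-order approximation to Var[f(X)].
var_approx = fp(eta)**2 * sigma**2

# Monte-Carlo check, assuming X is normally distributed (an extra
# assumption not needed by the approximations themselves).
random.seed(1)
sample = [f(random.gauss(eta, sigma)) for _ in range(200_000)]
mean_mc = sum(sample) / len(sample)
```

For a lognormal check, the exact mean is $$e^{\eta+\sigma^2/2}$$, which the second-order approximation matches closely when $$\sigma$$ is small.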

Written by: Stephen Richards

### Mortality by the book

Our book, *Modelling Mortality with Actuarial Applications*, will appear in Spring 2018. I wrote the second of the three parts, where I describe the modelling and forecasting of aggregate mortality data, such as that provided by the Office for National Statistics, the Human Mortality Database or indeed by any insurer whose own data is suitable.
Written by: Iain Currie

### Working with constraints

Regular readers of this blog will be aware of the importance of stochastic mortality models in insurance work.
Written by: Stephen Richards

### Out of line

Regular readers of this blog will be in no doubt of the advantages of survival models over models for the annual mortality rate, $$q_x$$. However, what if an analyst wants to stick to the historical actuarial tradition of modelling annualised mortality rates?
Written by: Stephen Richards

### Groups v. individuals

We have previously shown how survival models based around the force of mortality, $$\mu_x$$, have the ability to use more of your data. We have also seen that attempting to use fractional years of exposure in a $$q_x$$ model can lead to potential mistakes. However, the Poisson distribution also uses $$\mu_x$$, so why don't we use a Poisson model for the grouped count of deaths in each cell?
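One attraction of the Poisson set-up is that, with a constant hazard within a cell, the maximum-likelihood estimate from the grouped death count is simply deaths divided by central exposure, and the individual records aggregate losslessly into that pair of totals. A minimal sketch with made-up data (the indicators and exposures below are purely illustrative):

```python
# Hypothetical individual records in one age-year cell:
# a death indicator and the person-years of central exposure observed.
deaths = [0, 1, 0, 0, 1]
exposure = [1.0, 0.4, 1.0, 0.7, 0.9]

# Grouping loses nothing for estimating a constant hazard in the cell:
# only the totals D and E enter the Poisson likelihood.
D = sum(deaths)      # grouped death count
E = sum(exposure)    # total central exposure in person-years

# Poisson MLE for the force of mortality in the cell.
mu_hat = D / E
```

The same estimate would arise from summing the individual-level likelihood contributions, which is why grouped Poisson counts and individual exposures agree here.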
Written by: Stephen Richards

### Out for the count

In an earlier post we described a problem when fitting GLMs for $$q_x$$ over multiple years. The key mistake is to divide up the period over which the individual was observed in a model for individual mortality.
Written by: Stephen Richards

### Logistical nightmares

A common Generalised Linear Model (GLM) for mortality modelling is logistic regression, also sometimes described as a Bernoulli GLM with a logistic link function. This models mortality at the level of the individual, with the rate of mortality measured over a single year.
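In this set-up the one-year death probability $$q_x$$ is mapped onto the whole real line by the logit transform, and a linear predictor in age is mapped back to a valid probability by its inverse. A minimal sketch, where the intercept and slope are hypothetical illustrative values rather than fitted parameters:

```python
import math

def logit(q):
    """Logistic link: maps a probability q in (0, 1) to the real line."""
    return math.log(q / (1.0 - q))

def inv_logit(z):
    """Inverse link: maps a linear predictor back to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical linear predictor logit(q_x) = a + b * x for illustration only.
a, b = -10.0, 0.09

# Modelled one-year death probability at age 70 under these assumed values.
q70 = inv_logit(a + b * 70)
```

Whatever the linear predictor produces, the inverse link guarantees the fitted $$q_x$$ lies strictly between 0 and 1, which is the practical appeal of the logistic link.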
Written by: Stephen Richards

### Great Expectations

When fitting statistical models, a number of features are commonly assumed by users. Chief amongst these assumptions is that the expected number of events according to the model will equal the actual number in the data. This strikes most people as a thoroughly reasonable expectation. Reasonable, but often wrong.

Written by: Stephen Richards

### Do we need standard tables any more?

Actuaries have long been accustomed to using standard tables. In the UK these are created by the Continuous Mortality Investigation Bureau (CMIB), and the use of certain tables is often prescribed in legislation. As actuaries increasingly move to using statistical models for mortality, it is perhaps natural that they should first consider incorporating standard tables into these models. But are standard tables necessary, or even useful, in such a context?

Written by: Stephen Richards

### Survival models v. GLMs?

At some point you may be challenged to decide whether to use survival models or the older generalised linear models (GLMs). You could be forgiven for thinking that the two were mutually exclusive, especially since some commercial commentators have tried to frame the debate that way.

Written by: Stephen Richards