Choosing between models

In any model-fitting exercise you will be faced with choices. What shape of mortality curve to use? Which risk factors to include? How many size bands for benefit amount? In each case there is a balance to be struck between improving the model fit and making the model more complicated.

Our preferred method of measuring model fit is the log-likelihood function, but this on its own does not take account of model complexity. For example, it is usually possible to make a model fit better - i.e. increase the log-likelihood value - by adding extra parameters and risk factors. But is this extra complexity justified? Are those extra parameters and risk factors earning their keep in the model?

There are a number of different test statistics which can be used to strike this balance: the Bayesian Information Criterion (BIC) is one, but our preference is Akaike's Information Criterion (AIC), proposed by Akaike in 1974. The definition of the AIC is:

AIC = -2 * log-likelihood + 2 * number of parameters

Straight away we can see that the AIC includes the usual measure of goodness of fit, namely the log-likelihood. However, it also includes the number of parameters, so it can balance improved model fit against complexity. For a given data set, then, the preferred model is the one with the lowest value of the AIC.  
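This comparison is simple to script. The sketch below is illustrative only: the model names, log-likelihoods and parameter counts are invented, not taken from any real fit.

```python
def aic(log_likelihood, n_parameters):
    """Akaike's Information Criterion: lower values are preferred."""
    return -2.0 * log_likelihood + 2.0 * n_parameters

# Hypothetical example: the richer model fits better (higher
# log-likelihood) but carries three extra parameters.
models = {
    "simple": aic(log_likelihood=-5210.4, n_parameters=3),  # 10426.8
    "richer": aic(log_likelihood=-5208.9, n_parameters=6),  # 10429.8
}

# Lowest AIC wins: here the 1.5 improvement in log-likelihood does not
# earn its keep against the three extra parameters.
preferred = min(models, key=models.get)
print(preferred)  # simple
```

Note how the penalty works: each extra parameter adds 2 to the AIC, so it must buy an improvement of at least 1 in the log-likelihood to be worthwhile.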

Stephen Richards
Stephen Richards is the Managing Director of Longevitas
AIC and BIC in Longevitas
Both the AIC and BIC are displayed in the Model Overview section of each model report.