Information Matrix
The importance of checklists
The World Health Organization (WHO) makes available a one-page checklist for use by surgical teams. The WHO claims that this checklist has brought about a "significant reduction in both morbidity and mortality" and is "now used by a majority of surgical providers around the world". For example, the checklist is used by surgical teams in NHS England.
Kaplan-Meier for actuaries
In Richards & Macdonald (2024) we advocate that actuaries use the Kaplan-Meier estimate of the survival curve. This is not just because it is an excellent visual communication tool, but also because it is a particularly useful data-quality check.
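To make the estimator concrete, here is a minimal sketch of the Kaplan-Meier calculation in Python. The data are invented for illustration (survival times in years with a 0/1 event flag, 0 meaning censored); this is not code from the paper.

```python
# Minimal Kaplan-Meier sketch for right-censored data.
def kaplan_meier(times, events):
    """Return (distinct event times, survival estimates)."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    ts = [times[k] for k in order]
    ds = [events[k] for k in order]
    at_risk = n
    surv = 1.0
    out_t, out_s = [], []
    i = 0
    while i < n:
        t = ts[i]
        deaths = removed = 0
        # Gather all observations tied at time t.
        while i < n and ts[i] == t:
            deaths += ds[i]
            removed += 1
            i += 1
        if deaths:
            # Multiply by the conditional survival probability at t.
            surv *= 1 - deaths / at_risk
            out_t.append(t)
            out_s.append(surv)
        at_risk -= removed
    return out_t, out_s

# Illustrative data: 0 = censored, 1 = death observed.
t = [1.0, 2.0, 2.0, 3.0, 4.0, 5.0]
d = [1,   1,   0,   1,   0,   1]
print(kaplan_meier(t, d))
```

Each factor \(1 - d_i/n_i\) is the estimated probability of surviving past an observed death time, which is why sudden drops in the curve are such an effective data-quality check.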
Actively Beneficial?
How should we describe a lifestyle change that doubles our likelihood of suffering a major traffic accident? Oddly, evidence from Scotland suggests the answer is "worth making". Let me explain.
When is your Poisson model not a Poisson model?
The short answer for mortality work is that your Poisson model is never truly Poisson. The longer answer is that the true distribution has a similar likelihood, so you will get the same answer by treating it as Poisson. Your model is pseudo-Poisson, but not actually Poisson.
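A small numerical sketch shows why the answers coincide in the simplest constant-hazard case. With \(D\) deaths over total time-at-risk \(E\), the survival-model log-likelihood \(D\log\mu - \mu E\) and the Poisson log-likelihood with mean \(\mu E\) differ only by terms not involving \(\mu\), so both peak at \(\hat\mu = D/E\). The numbers below are illustrative, not from the source.

```python
import math

# Constant-hazard sketch: D deaths over total time-at-risk E (illustrative).
D, E = 7, 1000.0

def ll_survival(mu):
    # True survival-model log-likelihood (up to a constant in mu).
    return D * math.log(mu) - mu * E

def ll_poisson(mu):
    # Poisson log-likelihood with mean mu*E.
    return D * math.log(mu * E) - mu * E - math.lgamma(D + 1)

# Crude grid search: both likelihoods peak at the same mu.
grid = [0.001 + 0.0001 * i for i in range(200)]
mu_surv = max(grid, key=ll_survival)
mu_pois = max(grid, key=ll_poisson)
print(mu_surv, mu_pois, D / E)  # all three agree
```

The two log-likelihoods are identical as functions of \(\mu\) up to an additive constant, which is the sense in which the model is pseudo-Poisson.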
The fundamental 'atom' of mortality modelling
In a recent blog, I looked at the most fundamental unit of observation in a mortality study, namely an individual life. But is there such a thing as a fundamental unit of modelling mortality? In Macdonald & Richards (2024) we argue that there is, namely an infinitesimal Bernoulli trial based on the mortality hazard.
Don't fear the integral!
Actuaries denote with \({}_tp_x\) the probability that a life alive aged exactly \(x\) years will survive a further \(t\) years or more. The most basic result in survival analysis is the following relationship with the instantaneous mortality hazard, \(\mu_x\):
\[{}_tp_x = e^{-H_x(t)}\qquad(1)\]
where \(H_x(t)\) is the integrated hazard:
\[H_x(t) = \int_0^t\mu_{x+s}ds\qquad(2).\]
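Equations (1) and (2) can be checked numerically. The sketch below assumes a Gompertz hazard \(\mu_x = Ae^{Bx}\) with illustrative parameters (not from the source), computes the integrated hazard by the trapezium rule, and compares it with the Gompertz closed form.

```python
import math

# Assumed Gompertz hazard mu_x = A*exp(B*x); parameters are illustrative.
A, B = 3e-5, 0.11
x, t = 70.0, 10.0

def mu(age):
    return A * math.exp(B * age)

# Equation (2): integrated hazard H_x(t) by the composite trapezium rule.
n = 100_000
h = t / n
H = 0.5 * (mu(x) + mu(x + t)) * h + sum(mu(x + i * h) for i in range(1, n)) * h

# Closed form for Gompertz: H_x(t) = (A/B)*(exp(B*(x+t)) - exp(B*x)).
H_exact = (A / B) * (math.exp(B * (x + t)) - math.exp(B * x))

tpx = math.exp(-H)  # equation (1)
print(tpx, math.exp(-H_exact))  # the two survival probabilities agree
```

The point of equation (1) is that the integral is nothing to fear: any reasonable quadrature rule recovers \({}_tp_x\) to high accuracy.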
Seriatim data
In Macdonald & Richards (2024), Angus and I continue our long-standing advocacy for using individual records for mortality analysis, rather than grouped counts of lives. One argument in our paper is that the individual life is the most irreducible unit of observation in mortality analysis. After all, any group can be disaggregated into individuals, but further subdivision would just be dismemberment.
The product integral in practice
In a (much) earlier blog, Angus introduced the product-integral representation of the survival function:
\[{}_tp_x = \prod_0^t(1-\mu_{x+s}ds).\qquad(1)\]
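The product integral can be approximated directly on a fine grid. The sketch below assumes a constant hazard \(\mu = 0.05\) (an illustrative value) and shows that \(\prod(1-\mu\,ds)\) converges to the exponential form \(e^{-\mu t}\) as the step size shrinks.

```python
import math

# Product-integral sketch with an assumed constant hazard (illustrative).
mu, t, n = 0.05, 10.0, 100_000
ds = t / n

# Multiply the survival probabilities over each tiny interval of length ds.
prod = 1.0
for _ in range(n):
    prod *= 1.0 - mu * ds

print(prod, math.exp(-mu * t))  # the two agree to several decimal places
```

Each factor \(1-\mu\,ds\) is the probability of surviving one infinitesimal interval, so the product integral is the continuous limit of chaining together infinitely many tiny Bernoulli trials.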
The interrupted observation
A common approach to teaching students about mortality is to view survival as a Bernoulli trial over one year. On this view, whether a life now aged \(x\) dies in the coming year is a Bernoulli trial with probability of death \(q_x\). With enough observations, one can estimate \(\hat q_x\), which is the basis of the life tables historically used by actuaries.
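The Bernoulli view is easy to simulate. The sketch below assumes an illustrative true \(q_x\) and estimates it as deaths divided by lives observed; the parameter values are invented for the example.

```python
import random

# Bernoulli-trial sketch: simulate N lives aged x over one year with an
# assumed true q_x, then estimate q_x as deaths / lives. Values illustrative.
random.seed(42)
true_qx = 0.02
N = 100_000

# Each life dies in the year with probability true_qx.
deaths = sum(random.random() < true_qx for _ in range(N))
qx_hat = deaths / N
print(qx_hat)  # close to the assumed 0.02
```

The estimate \(\hat q_x\) is just the observed death rate, which is exactly how crude rates for a life table arise from grouped Bernoulli observations.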
Smoothing
The late Iain Currie was a long-time advocate of smoothing certain parameters in mortality models. In an earlier blog he showed how smoothing parameters in the Lee-Carter model could improve the quality of the forecast. As Iain himself wrote, "this idea is not new" and traced its origins to Delwarde, Denuit & Eilers (2007).