Information Matrix

When is your Poisson model not a Poisson model?

The short answer for mortality work is that your Poisson model is never truly Poisson. The longer answer is that the true distribution has a similar likelihood, so you will get the same answer by treating it as if it were Poisson. Your model is pseudo-Poisson, but not actually Poisson.
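
A minimal sketch of why the answers coincide, assuming a constant hazard \(\mu\) over the interval, \(D\) observed deaths and total time lived \(E^c\) (notation introduced here for illustration): the Poisson and survival-model likelihoods share the same kernel,

\[L_{\text{Poisson}}(\mu) \propto \mu^D e^{-\mu E^c} = L_{\text{survival}}(\mu),\]

so both are maximised by the same estimate \(\hat\mu = D/E^c\).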

Written by: Stephen Richards. Tags: Poisson distribution, survival models

The fundamental 'atom' of mortality modelling

In a recent blog, I looked at the most fundamental unit of observation in a mortality study, namely an individual life.  But is there such a thing as a fundamental unit of modelling mortality?  In Macdonald & Richards (2024) we argue that there is, namely an infinitesimal Bernoulli trial based on the mortality hazard.
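
One way to picture how these infinitesimal trials compound, sketched here under a piecewise-constant hazard assumption rather than as the paper's own derivation: partition \([0, t]\) into short intervals of length \(\Delta t\) and multiply the survival probabilities of the successive Bernoulli trials,

\[{}_tp_x \approx \prod_i \left(1 - \mu_{x+i\Delta t}\,\Delta t\right) \longrightarrow \exp\left(-\int_0^t \mu_{x+s}\,ds\right) \quad \text{as } \Delta t \to 0.\]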

Written by: Stephen Richards. Tags: survival models, product integral

The interrupted observation

A common approach to teaching students about mortality is to view survival as a Bernoulli trial over one year. This view proposes that, if a life alive now is aged \(x\), whether the life dies in the coming year is a Bernoulli trial with the probability of death equal to \(q_x\).  With enough observations, one can estimate \(\hat q_x\), which is the basis of the life tables historically used by actuaries.
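
As a minimal numerical sketch (the counts below are hypothetical, not from any study), the binomial estimate and its approximate standard error are:

import numpy as np

# Hypothetical example: n lives aged x observed for one year, of whom d die.
n, d = 10_000, 152

q_hat = d / n                          # binomial estimate of q_x
se = np.sqrt(q_hat * (1 - q_hat) / n)  # approximate standard error

print(f"q_x estimate: {q_hat:.4f} (s.e. {se:.4f})")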

Written by: Stephen Richards. Tags: survival models, right-censoring

Build versus buy

In an earlier blog I quoted extensively from "The Mythical Man-Month", a book by the distinguished software engineer Fred Brooks.  My blog was admittedly self-interested(!) when it cited arguments made by Brooks (and others) for when it makes sense to buy software instead of writing it yourself.  However, in place of "buying…

Written by: Stephen Richards. Tags: software, ARIMA, survival models, left-truncation

A Model for Reporting Delays

In his recent blog Stephen described some empirical evidence in support of his practice of discarding the most recent six months' data, to reduce the effect of delays in reporting deaths. This blog demonstrates that the practice can also be justified theoretically in the survival modelling framework, although the choice of six months as the cut-off remains an empirical matter.

Written by: Angus Macdonald. Tags: survival models, censoring

Less is More: when weakness is a strength

A mathematical model that obtains extensive and useful results from the fewest and weakest assumptions possible is a compelling example of the art. A survival model is a case in point. The only material assumption we make is the existence of a hazard rate, \(\mu_{x+t}\), a function of age \(x+t\) such that the probability of death in a short time \(dt\) after age \(x+t\), denoted by \({}_{dt}q_{x+t}\), is:

\[{}_{dt}q_{x+t} = \mu_{x+t}dt + o(dt)\qquad (1)\]
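
As a quick numerical check of the \(o(dt)\) term, here is a sketch using an illustrative Gompertz hazard \(\mu_x = e^{\alpha + \beta x}\) (the parameter values are hypothetical): the exact probability of death and the first-order approximation \(\mu_{x+t}\,dt\) agree ever more closely as \(dt\) shrinks.

import numpy as np

# Illustrative Gompertz hazard; alpha and beta are hypothetical values.
alpha, beta = -10.0, 0.1
mu = lambda age: np.exp(alpha + beta * age)

age = 75.5
for dt in (1.0, 0.1, 0.01, 0.001):
    # Exact probability of death in (age, age + dt] from the integrated hazard.
    integral = (mu(age) / beta) * (np.exp(beta * dt) - 1.0)
    q_exact = 1.0 - np.exp(-integral)
    q_approx = mu(age) * dt  # first-order term from equation (1)
    print(f"dt={dt:7.3f}  exact={q_exact:.6e}  mu*dt={q_approx:.6e}")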

Written by: Angus Macdonald. Tags: survival models, Poisson distribution

Stopping the clock on the Poisson process

"The true nature of the Poisson distribution will become apparent only in connection with the theory of stochastic processes\(\ldots\)"

Feller (1950)

Written by: Angus Macdonald. Tags: Poisson distribution, survival models

Introducing the Product Integral

Of all the actuary's standard formulae derived from the life table, none is more important in survival modelling than:

\[{}_tp_x = \exp\left(-\int_0^t\mu_{x+s}\,ds\right).\qquad(1)\]
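
A short numerical sketch of how the product integral behind (1) can be approximated, using an illustrative Gompertz hazard (the parameters, age and term are hypothetical): the product of many one-step Bernoulli survival probabilities converges to the exponential of the integrated hazard.

import numpy as np

# Illustrative Gompertz hazard; alpha and beta are hypothetical values.
alpha, beta = -10.0, 0.1
mu = lambda age: np.exp(alpha + beta * age)

x, t = 60.0, 10.0

# Closed-form integrated hazard over [x, x + t] for the Gompertz law.
integral = (mu(x) / beta) * (np.exp(beta * t) - 1.0)
p_exact = np.exp(-integral)

# Product-integral approximation: many short survival probabilities multiplied together.
for n in (10, 100, 10_000):
    ds = t / n
    ages = x + ds * np.arange(n)
    p_prod = np.prod(1.0 - mu(ages) * ds)
    print(f"n={n:6d}  product={p_prod:.6f}  exp(-integral)={p_exact:.6f}")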

Written by: Angus Macdonald. Tags: survival models, survival probability, force of mortality, product integral

Further reducing uncertainty

In a previous posting, I looked at how using a well-founded statistical model can improve the accuracy of estimated mortality rates. We saw how the relative uncertainty in the estimate of \(\log \mu_{75.5}\) could be reduced from 20.5% to 3.9% by using a simple two-parameter Gompertz model:

\(\log \mu_x = \alpha + \beta x\qquad (1)\)
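
As an illustration of how such a model might be fitted in practice (a sketch only, on synthetic data with hypothetical parameters, not the calculation behind the figures above), equation (1) can be estimated as a Poisson GLM with a log-exposure offset:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic age-grouped data: exposures E_x and deaths D_x simulated from a
# Gompertz hazard with hypothetical alpha and beta.
ages = np.arange(60, 91).astype(float)
exposure = np.full(ages.shape, 1_000.0)
alpha_true, beta_true = -10.0, 0.1
deaths = rng.poisson(exposure * np.exp(alpha_true + beta_true * ages))

# Poisson GLM with log link and log-exposure offset fits log mu_x = alpha + beta * x.
X = sm.add_constant(ages)
fit = sm.GLM(deaths, X, family=sm.families.Poisson(), offset=np.log(exposure)).fit()

alpha_hat, beta_hat = fit.params
print(f"alpha={alpha_hat:.3f}, beta={beta_hat:.4f}, "
      f"log mu_75.5={alpha_hat + beta_hat * 75.5:.3f}")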

Written by: Stephen Richards. Tags: estimation error, mis-estimation risk, survival models

Mind the gap!

Recognising and quantifying mortality differentials is what experience analysis is all about. Whether you calculate traditional A/E ratios, graduate raw rates by formula (Forfar et al. 1988), or fit a statistical model (Richards 2012), the aim is always to find risk factors influencing the level of mortality.
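
For example, a traditional A/E comparison divides actual deaths by those expected under a reference table; a minimal sketch with hypothetical figures:

import numpy as np

# Hypothetical inputs for one risk group: central exposures, actual deaths and
# reference mortality rates at ages 70-72.
exposure = np.array([1200.0, 1100.0, 950.0])    # years lived at each age
actual_deaths = np.array([18, 21, 24])
reference_mu = np.array([0.015, 0.017, 0.019])  # reference hazard at each age

expected_deaths = np.sum(exposure * reference_mu)
ae_ratio = actual_deaths.sum() / expected_deaths
print(f"A/E ratio: {ae_ratio:.2%}")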

Written by: Kai Kaufhold. Tags: mortality convergence, survival models