Information Matrix
Hitting the target, but missing the point
Targeting methods are popular in some areas of mortality forecasting. One well-known current example is the CMI's projection model.
VaR-iation by age
During the public discussions of our paper on value-at-risk for longevity trend risk, one commentator asked for a fuller presentation of VaR capital requirements by age. In the paper, as with our introductory overview, we used age 70 as a representative average age of an annuity portfolio.
Insurance or right?
The Economist recently carried an article about the perceived unfairness of increasing the retirement age. The argument is that poorer people have higher mortality rates, which means they get less value from a given pension than richer people: the poor are less likely to survive long enough to receive the pension, and if they do they will draw it for a shorter period of time.
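The arithmetic behind this can be made explicit with a standard actuarial identity (offered here as an illustration, not something from the article itself). The value to a life aged $x$ of a pension of 1 per annum from retirement age $x_r$ is

$${}_{x_r-x|}\ddot{a}_x \;=\; v^{x_r-x}\,{}_{x_r-x}p_x\,\ddot{a}_{x_r}, \qquad \ddot{a}_{x_r} \;=\; \sum_{t=0}^{\infty} v^t\,{}_tp_{x_r},$$

where ${}_tp_x$ is the probability of surviving $t$ years from age $x$ and $v$ is the annual discount factor. Higher mortality cuts the value twice: it lowers ${}_{x_r-x}p_x$, the chance of reaching retirement at all, and it lowers every ${}_tp_{x_r}$, the chance of drawing the pension for $t$ further years.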
VaR for longevity trend risk
Last month Stephen, Iain and Gavin presented their paper on putting longevity trend risk into a one-year, value-at-risk (VaR) framework. The presentations were made to audiences of actuaries in Edinburgh and London, and the video of the London debate is now available online.
Graduation
Graduation is the process whereby smooth mortality rates are created from crude mortality rates. Smoothness is an important part of graduation, but another is the extrapolation of mortality rates to ages at which data may be unreliable or even non-existent.
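Graduation can be performed in many ways; purely as a sketch under simple assumptions (simulated data and a Gompertz law fitted by least squares, not the method discussed in the post), smoothing and extrapolation can be illustrated in a few lines of Python:

```python
import numpy as np

# Hedged sketch: graduate crude rates by fitting a Gompertz law,
# log(mu_x) = a + b*x, then use the fitted law to extrapolate to
# ages where data are sparse or non-existent. All inputs are
# simulated for illustration only.
rng = np.random.default_rng(1)
ages = np.arange(60, 91)
exposure = np.full(ages.shape, 1000.0)      # central exposed-to-risk
mu_true = np.exp(-10.0 + 0.1 * ages)        # assumed underlying hazard
deaths = rng.poisson(exposure * mu_true)    # simulated death counts
crude = deaths / exposure                   # crude central rates

# Least-squares fit of log(crude) ~ a + b*age, using only cells
# with at least one death so the logarithm is defined.
mask = deaths > 0
b, a = np.polyfit(ages[mask], np.log(crude[mask]), 1)

# Graduated rates are smooth by construction and can be evaluated
# beyond the range of the data, here up to age 110.
fit_ages = np.arange(60, 111)
graduated = np.exp(a + b * fit_ages)
print(np.round(graduated[[0, 25, 50]], 4))  # ages 60, 85, 110
```

Real graduations test the fit against the data and often use more flexible smoothers than a two-parameter law, but the principle of replacing noisy crude rates with a fitted curve is the same.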
Correlation complications
A basic result in probability theory is that the variance of the sum of two random variables is not necessarily the same as the sum of their variances.
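For any two random variables $X$ and $Y$ with finite variances, the exact relationship is

$$\operatorname{Var}(X+Y) \;=\; \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\,\operatorname{Cov}(X,Y),$$

so the variances only add when $\operatorname{Cov}(X,Y)=0$, as for independent risks. Positive correlation makes the sum riskier than the separate variances suggest; negative correlation makes it less so.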
Discounting longevity trend risk
Establishing the capital requirement for longevity trend risk is a thorny problem for insurers with substantial pension or annuity payments.
Parallel (Va)R
One of our services, the Projections Toolkit, is a collaboration with Heriot-Watt University. Implementing stochastic projections can be a tricky business, so it is good to have the right people on the job.
Groups v. individuals
We have previously shown how survival models based around the force of mortality, μx, can use more of your data. We have also seen that attempting to use fractional years of exposure in a qx model can lead to mistakes. However, the Poisson distribution also uses μx, so why don't we use a Poisson model for the grouped count of deaths in each cell?
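A standard likelihood argument shows how close the two approaches are. If a cell contains $D$ deaths and central exposure $E^c$, and the force of mortality $\mu$ is assumed constant within the cell, then the Poisson log-likelihood is, up to a constant not involving $\mu$,

$$\ell(\mu) \;=\; D\log\mu - E^c\mu,$$

which is exactly the log-likelihood obtained from the individual survival times under the same constant hazard. Both therefore yield the same estimate $\hat\mu = D/E^c$ for that cell.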
Following the thread
Gavin recently explored the topic of threads and parallel processing. But what does this mean from a business perspective?