Following the thread
Gavin recently explored the topic of threads and parallel processing. But what does this mean from a business perspective? Well, parallel processing can result in considerable speed increases for certain actuarial and statistical calculations. If done well, spreading the workload over four threads (say) can reduce the execution time to almost a quarter of its single-threaded equivalent. Many complicated actuarial calculations lend themselves well to multi-threading, and thus to considerable reductions in run-times. A good example is simulation, which plays a major role in Solvency II work. To illustrate, Table 1 shows the execution time for 10,000 run-off simulations of a large annuity book.
Table 1. Execution times for 10,000 run-off simulations of a portfolio of 157,491 male and female annuitants. Source: Longevitas Ltd, using a dedicated eight-core server.
| Number of threads | Time taken | Performance factor relative to one thread |
|---|---|---|
| 1 | 31 hrs 2 min | 1.0x |
| 4 | 7 hrs 55 min | 3.9x |
| 7 | 4 hrs 34 min | 6.8x |
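
The reason simulation parallelises so well is that each run-off path is independent of the others. As a rough illustration of the pattern (not the actual Longevitas implementation), the Python sketch below farms independent simulations out to a pool of workers; simulate_runoff is a placeholder, and worker processes are used instead of raw threads because that is how CPU-bound Python work is typically parallelised.

```python
from concurrent.futures import ProcessPoolExecutor
import random

def simulate_runoff(seed):
    """One run-off simulation of an annuity book (placeholder logic only)."""
    rng = random.Random(seed)
    # A real model would project each annuitant's cashflows to run-off;
    # here we simply aggregate some random values so the example runs.
    return sum(rng.gauss(100.0, 15.0) for _ in range(1_000))

def run_simulations(n_sims, n_workers):
    """Spread n_sims independent simulations across n_workers parallel workers."""
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        # Each simulation gets its own seed, so the workers never need to
        # talk to each other: the work is "embarrassingly parallel".
        return list(pool.map(simulate_runoff, range(n_sims), chunksize=100))

if __name__ == "__main__":
    results = run_simulations(n_sims=10_000, n_workers=4)
    print(f"{len(results):,} simulations complete")
```

On an eight-core machine, increasing n_workers from 1 to 4 to 7 corresponds to moving down the rows of Table 1.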
Table 1 shows that more threads means the same work can be done faster. It also shows that the performance scaling is almost linear: using four threads speeds things up by a factor of 3.9, while using seven threads speeds things up by a factor of 6.8. In practice the full reduction in time cannot quite be achieved, for a variety of reasons:
- There are parts of any process which cannot be parallelised, such as creating input files and preparing outputs.
- Not all calculations can be performed in parallel, for example calculations which involve using a previous result to generate the next one.
- There is overhead in creating and managing threads.
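
These effects are often summarised by Amdahl's law: if a fraction s of the run is inherently serial, the best achievable speed-up on n threads is 1/(s + (1 - s)/n). The tiny sketch below applies this formula; the 0.6% serial fraction is an assumption chosen to roughly reproduce the factors in Table 1, not a measured figure.

```python
def amdahl_speedup(n_threads, serial_fraction):
    """Theoretical speed-up when serial_fraction of the run cannot be parallelised."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_threads)

# Assuming roughly 0.6% of the run is serial (reading data, writing output,
# managing threads), the predicted factors sit close to those in Table 1:
for n in (1, 4, 7):
    print(f"{n} thread(s): {amdahl_speedup(n, 0.006):.1f}x")
# 1 thread(s): 1.0x
# 4 thread(s): 3.9x
# 7 thread(s): 6.8x
```

The same formula also shows why the returns eventually diminish: even with a serial fraction this small, no number of threads could push the factor past about 167x.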
For these reasons it is only worthwhile threading some calculations, as not every operation will see a benefit. However, Table 1 shows that the benefits can be transformative: a calculation which takes more than a day can be reduced to one that runs over an afternoon. Since Gavin mentioned food so often in his blog, it is worth noting that using a sixteen-core server would further reduce the run-time in Table 1 to that of a long lunch!
How a business uses this performance improvement will vary. For example, where the number of simulations is fixed, a life office will simply be able to do the work faster (as in Table 1). However, insurers would often like to perform more simulations when estimating tail probabilities for Solvency II. Here, the benefits of parallelism would be used to do more work in the available time, and thus improve the quality of the tail estimates. In a world where actuaries have to perform a lot of heavy-duty calculations quickly, it is good to know that modern computer technologies have a solution to offer.
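
To see why more simulations sharpen the tail estimates, recall that the standard error of a simulated tail probability shrinks with the square root of the number of simulations, so quadrupling the simulation count roughly halves it. The sketch below uses the 1-in-200 (0.5%) level purely as an illustrative figure.

```python
import math

def tail_probability_standard_error(p, n_sims):
    """Standard error of a Monte Carlo estimate of a tail probability p."""
    return math.sqrt(p * (1.0 - p) / n_sims)

# Illustrative only: a 0.5% tail probability estimated from 10,000 simulations
# versus 40,000 simulations (four threads doing the work of one in the same time).
for n in (10_000, 40_000):
    se = tail_probability_standard_error(0.005, n)
    print(f"{n:>6,} simulations: standard error {se:.5f} ({se / 0.005:.0%} of the estimate)")
# 10,000 simulations: standard error 0.00071 (14% of the estimate)
# 40,000 simulations: standard error 0.00035 (7% of the estimate)
```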