Remembrance of Things Past

(Jun 13, 2013)

With all due respect to Marcel Proust, involuntary memory isn't something I particularly worry about. Involuntary forgetting, on the other hand, is something I'll admit to, and I'm fairly sure I'm not alone. We rarely lose the flashbulb moments; the problems happen when we don't realise that flashbulb ought to have fired until weeks or months later, at which point significant events risk being lost in the fog.

In software terms, one definition of "memory" is what we record in an audit trail. This chronological record of activities isn't just an important plank in a comprehensive security strategy (sometimes called defence in depth); we believe it can also add value in everyday…
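
An audit trail of the kind described above can be sketched very simply: an append-only log of who did what, to which resource, and when. The function and field names below are illustrative assumptions, not the actual Longevitas implementation.

```python
import json
import time

def record_event(log, actor, action, target):
    """Append a timestamped entry to an append-only audit trail."""
    entry = {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "actor": actor,
        "action": action,
        "target": target,
    }
    log.append(entry)   # append-only: existing entries are never modified
    return entry

audit_log = []
record_event(audit_log, "analyst1", "UPLOAD", "annuity_portfolio.csv")
record_event(audit_log, "analyst1", "RUN_MODEL", "model_17")

for entry in audit_log:
    print(json.dumps(entry))
```

In a real system the log would of course go to durable, tamper-evident storage rather than an in-memory list.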


Tags: audit, technology

Parallel (Va)R

(Oct 11, 2012)

One of our services, the Projections Toolkit, is a collaboration with Heriot-Watt University.  Implementing stochastic projections can be a tricky business, so it is good to have the right people on the job.  Although our development platform is traditionally Java and C++ based, one consequence of our collaboration is that parts of the system are now written in R.

R has a number of notable positive attributes, including thriving development and user communities, powerful graphics capabilities, expressive language features and a comprehensive module library.  However, blistering raw performance, and more specifically multi-processor performance, are not features standard R would lay claim to.  Standard…


Tags: parallel processing, technology

Following the thread

(Sep 18, 2012)

Gavin recently explored the topic of threads and parallel processing.  But what does this mean from a business perspective?  Well, parallel processing can result in considerable speed increases for certain actuarial and statistical calculations. If done well, spreading the workload over four threads (say) can reduce the execution time to almost a quarter of its single-threaded equivalent. Many complicated actuarial calculations lend themselves well to multi-threading, and thus to considerable reductions in run-times.  A good example of this is simulation, which plays a major role in Solvency II work.  To illustrate, Table 1 shows the execution time for 10,000 run-off simulations of a large annuity…
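
The decomposition behind this kind of speed-up is straightforward: simulations are independent, so the batch can be split across workers and the results combined. The sketch below is a toy illustration with a made-up "run-off" model, not the actual calculation; it uses a Python process pool to stand in for the OS threads a Java or C++ implementation would use.

```python
import random
from concurrent.futures import ProcessPoolExecutor

def simulate_batch(args):
    """Run a batch of toy run-off simulations; return the mean outgo.

    Illustrative stand-in for a real annuity cash-flow model: each
    'life' receives annual payments of 1000 until a 10% exit event.
    """
    seed, n_sims = args
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        total = 0.0
        while True:
            total += 1000.0
            if rng.random() < 0.1:   # exit with probability 0.1 each year
                break
        totals.append(total)
    return sum(totals) / len(totals)

def parallel_simulate(n_sims=10_000, n_workers=4):
    """Split independent simulations across workers and combine results."""
    per_worker = n_sims // n_workers
    batches = [(seed, per_worker) for seed in range(n_workers)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        means = list(pool.map(simulate_batch, batches))
    return sum(means) / len(means)

if __name__ == "__main__":
    print(f"mean simulated outgo: {parallel_simulate(4000, 4):.0f}")
```

Because the batches share no state, four workers can approach the ideal four-fold reduction in wall-clock time for CPU-bound work like this.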


Tags: threads, parallel processing, simulation, Solvency II, technology

Competitive eating

(Sep 17, 2012)

I've previously suggested parallel processing might have a touch of the infernal about it, and further evidence might be how it allows us to usefully indulge in one of the seven deadly sins, that of gluttony. Interestingly, this link between concurrency and food was also explored in a famous problem in software design, that of the "Dining philosophers".  Here, some hungry philosophers with insufficient forks must compete for utensils in such a way as to avoid deadlocks, race conditions and, ultimately, starvation.  This problem is a reminder of how tricky it can be to safely mediate concurrent access to resources.

Software developers have always had a responsibility to consider how system resources will be consumed.…


Tags: parallel processing, technology

Special Assignment

(Sep 14, 2011)

We talked previously about the use of user-defined validation rules to clean up specific data artefacts you sometimes find in portfolio data. One question came up recently about modelling bespoke benefit bands, and this can also benefit from user-defined rules.

In our modelling system we automatically calculate a user-selected number of benefit bands, each containing a broadly equal number of lives. The model optimiser can be used to cluster these bands, giving you the best-fitting break points for your experience data. A drawback is that the optimised break-points might not correspond to any pre-established business convention. So, what do you do if you want a constant banding for use with all files?
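
A fixed, business-conventional banding of the kind asked about amounts to a lookup against pre-agreed break points. The break points below are hypothetical, purely for illustration:

```python
import bisect

# Hypothetical house-standard break points (annual pension amounts)
BREAKS = [2_500, 5_000, 10_000, 20_000]

def benefit_band(annual_pension):
    """Assign a policy to a fixed band: 0 = below the first break point,
    len(BREAKS) = above the last."""
    return bisect.bisect_right(BREAKS, annual_pension)

print(benefit_band(1_000))    # -> 0
print(benefit_band(7_500))    # -> 2
print(benefit_band(50_000))   # -> 4
```

Applying the same break points to every file gives a constant banding across portfolios, at the cost of forgoing the best-fitting breaks the optimiser would find for each individual dataset.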

One…


Tags: technology, data validation, deduplication

Keep taking the tablets

(May 9, 2011)

Earlier Gavin wrote about a number of mobile devices from which you could run Longevitas software services, including a Nokia telephone and an iPod Touch.  This is not a result of specifically designing for these devices, but it is a handy benefit from following the open, published standards for web development.

As a further illustration of this, you can also run our services from one of the most iconic mobile devices, namely the BlackBerry. The picture in Figure 1 shows a BlackBerry 8520 running our flagship Longevitas survival-modelling software.  We're not pretending that this is terribly practical, though, as the BlackBerry screen is rather small.

Figure 1. BlackBerry running Longevitas survival-modelling…


Tags: technology, mobile access, BlackBerry, iPad

Rewriting the rulebook

(Dec 2, 2010)

It is an unfortunate fact of life that through time every portfolio will acquire data artefacts that make risk analysis trickier. Policyholder duplication is one example of this, and archival of claims breaking the time-series is another. Data errors introduced by servicing are perhaps the most commonplace of all, and this posting describes how validation rules can protect the modelling stage from such errors.

The first class of issue is generic data corruption, so termed because these problems occur with the same characteristics in more or less every portfolio you work with. Generic validation rules are critical here, screening out such problems before modelling commences. These issues include…
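
Generic rules of this sort are essentially predicates applied to each record before it reaches the model. The rules and field names below are illustrative examples of the kind of checks meant, not the actual rule set:

```python
from datetime import date

def validate(record):
    """Return a list of generic validation failures for one policy record."""
    errors = []
    dob = record.get("date_of_birth")
    doe = record.get("date_of_exit")
    if dob is None:
        errors.append("missing date of birth")
    elif doe is not None and doe <= dob:
        errors.append("exit date on or before date of birth")
    if record.get("annual_pension", 0) < 0:
        errors.append("negative benefit amount")
    if record.get("gender") not in ("M", "F"):
        errors.append("unrecognised gender code")
    return errors

# A deliberately corrupt record trips three rules at once
bad = {"date_of_birth": date(1950, 3, 1),
       "date_of_exit": date(1949, 1, 1),
       "annual_pension": -100,
       "gender": "X"}
print(validate(bad))
```

Records failing any rule would be quarantined for correction rather than silently distorting the fitted model.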


Tags: technology, data validation

Upwardly Mobile

(Feb 28, 2010)

We recently discussed the ways server-based modelling software facilitates collaboration across boundaries. Another important boundary is the office wall, although what was once considered an impermeable divide between work and the rest of our lives is nowadays all too porous. For most of us, there is no shoring up the dam; work will continue to bleed into our social lives, so we might as well take some pleasure in those developments that remove some of the pain.

So perhaps you're on a train - going to a party, or a family event, or in some other way in danger of having a life.  But then you realise you forgot to schedule some modelling. And you need a complex optimisation on a large annuity book. You wanted it to run…


Tags: technology, collaboration, mobile access

A Problem Shared

(Jan 27, 2010)

Creating a good model from your experience data is not always straightforward. Data gathered over an extended time period, as financial portfolios usually are, will often incorporate artefacts from more than one administration system or servicing team, and it may require wide-ranging business experience to make the best analytical choices.

Perhaps for this reason, our clients often collaborate on modelling projects between offices and sometimes with external partners. At present, we know of one project involving analysts in both Canada and Bermuda, and another where the team spans no fewer than three organisations - two in France, and one in the UK. A server-based approach makes such collaboration easier,…


Tags: technology, collaboration

Personal Standards

(Nov 30, 2009)

Love them or loathe them, actuaries cannot get by without standard tables in some shape or form. Even when performing analysis of your own experience data to avoid basis risk, standard tables are often used as a kind of lingua franca between parties, a convenient way to express approximate results in a way everyone can understand.

The use of standard tables in experience analysis brings its own issues of course: mortality moves on after a table is published, and even those that are not too outdated may still be inappropriate for the population under study. In any case the rates from a standard table will often require significant transformation in order to achieve something vaguely similar to your actual experience.…
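
The simplest such transformation is the classic actual-versus-expected adjustment: find the single percentage of the standard table that reproduces the portfolio's observed deaths. The figures below are made up for illustration:

```python
def fit_percentage(actual_deaths, exposures, standard_qx):
    """Find the flat percentage of a standard table that reproduces the
    portfolio's total actual deaths (the classic A/E adjustment)."""
    expected = sum(e * q for e, q in zip(exposures, standard_qx))
    return sum(actual_deaths) / expected

# Hypothetical experience at three ages against made-up standard rates
actual = [8, 12, 20]             # observed deaths
exposure = [1000, 1000, 1000]    # lives exposed to risk
std_qx = [0.010, 0.015, 0.025]   # standard-table mortality rates

pct = fit_percentage(actual, exposure, std_qx)
print(f"{pct:.0%} of standard table")   # -> 80% of standard table
```

A flat percentage is of course the crudest option; the age pattern of actual experience rarely matches the table's shape, which is one reason modelling your own experience directly is attractive.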


Tags: technology, standard table
