11 May 2009

Insurance Market Update- April 2009

Welcome to this April edition of the Insurance Market Update, in which we focus on issues in the life insurance industry. Since our last publication, many firms have been busy publishing their preliminary 2008 results and continue to deal with challenging market conditions.

Despite the current economic environment, Solvency II remains high on many insurers' agendas as they begin their planning activities in earnest and consider whether to take part in the FSA's dry run of the internal models approval process. As a result, our first article in this month's edition looks at Solvency II. It is a back-to-basics piece highlighting the key areas of focus for many firms over the coming months and the wider impact of Solvency II on insurance companies. Our second article looks at data quality and the business benefits of improving it, with particular attention to the implications from a Solvency II perspective.

We hope you find this edition informative and, as always, your comments and suggestions for future themes or topics are welcome.

Brian Robinson
Acting Editor

Solvency II – Current focus and wider business impact

Solvency II is set to change the way the insurance sector operates. The framework, which is expected to be introduced across the European Union in 2012, will mean significant changes in everything from the way a firm calculates its capital requirements to its strategy for product development and pricing.

Preparing for Solvency II

But while 2012 may seem a long way off, and some of the detail is still to be finalised, the nature of the changes required to comply with Solvency II means firms should already be preparing for the regulatory change and have effective implementation plans in place.

To emphasise the importance of early engagement, the Financial Services Authority (FSA) is inviting firms to meet a number of deadlines as 2012 approaches, with the first of these set for later this year.

Under these initial deadlines, the FSA expects firms to provide an update on their implementation planning, giving details of who will be overseeing the process. It is also asking firms to provide details of any gap analysis performed or when they are planning to undertake their analysis to identify the extent of the work required to achieve Solvency II compliance. To achieve this, firms will need to have thought about what the new regime means to their business, how they would like to operate under the new requirements and what they need to do to satisfy them.

Another key deadline this year will be in June. At this point the FSA will expect firms to have decided whether they want to take part in the first dry run of the internal models approval process. This is particularly important: under Solvency II, firms will be able to choose between using the standard model and seeking approval for an internal model that more accurately reflects the risks involved in their activities.

But making this decision now is critical. Although the FSA is offering two waves of trials for its approval process, it is only guaranteeing a decision on approval in time for the 2012 implementation date for those firms that are in the first wave. Waiting until the second wave, which is expected late in 2011, could mean having to adopt the standard model temporarily before approval is granted.

How Solvency II will affect business

Although the focus on capital may make Solvency II seem an exercise based in the finance, actuarial and risk departments of a firm, its ramifications will be felt throughout the business. As well as operational changes affecting everything from reporting to product sales, it will also lead to cultural changes affecting the day-to-day workings of the firm.

A company's strategy about which products to sell and their design will be influenced by the new regime. Firms will need to understand the risk, value and capital implications of selling particular products and ensure that this is aligned with their risk appetite as well as being reflected in their pricing and business planning. As a result, firms will have a clearer vision of the amount of business they can write which could see them changing their product mix in line with their risk appetite. It should also enable firms to be more proactive in optimising diversification benefits across different products.

Changes to capital requirements will also provide the impetus to review the location and structure of insurance operations, where tax considerations are highly relevant. Life companies may also be affected if changes in FSA returns lead to changes in company and product taxation.

Solvency II will also drive greater transparency in the sector. Firms will be required to disclose much more information to their stakeholders and potential investors. While this greater openness will be a significant change for many firms, those prepared to embrace it will find it offers opportunities to demonstrate the strengths of their business.

For many firms Solvency II will be the catalyst for making other changes. For example, following years of merger and acquisition activity and successive waves of product development, most life assurance firms are saddled with a number of legacy IT systems. These can be unwieldy and cause data and reporting problems. Meeting the data standards and reporting timelines required under Solvency II will put further pressure on these systems, and some firms may decide that this justifies upgrading their technology.

On top of this, firms will also need to invest in training to help embed the new requirements into their business. This may also mean recruiting additional employees to reflect the shift in processes.

The extent to which firms adopt the new regime, and when, will vary. While some are already well advanced in their planning and looking to implement some of the processes ahead of October 2012, others have come to the planning later.

While there is still time to catch up, the greater rewards will go to those companies that are well prepared and take a comprehensive approach to adopting the new requirements. These include the reputational benefits of being an early adopter and the business benefits of exceeding the compliance requirements. Furthermore, as the new regime becomes more established, those companies that do only enough to comply could find it harder and harder to compete.

Rick Lester

Data quality

The quality of an organisation's data can have a significant impact on the effectiveness of its business processes and on the working capital available to it. Understanding and addressing data quality issues will drive efficiency benefits and reduce the costs associated with poor data.

We therefore believe that there is a strong business case for reviewing and enhancing data quality, and in this article we seek to identify the key drivers for improving data quality and what actions companies can take to achieve this.

The importance of data quality

Data exists only to support business applications and can therefore be defined only in light of the purposes it is intended to serve. Data quality should thus be measured in terms of fitness for specific business purposes, covering both:

  • existence – having all the data that is needed; and
  • accuracy – the data that is held accurately represents the underlying business.
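
To make these two dimensions concrete, the minimal sketch below (in Python, using pandas, with entirely hypothetical field names and invented records) shows one way they could be measured on a small policy extract. It is an illustration of the idea rather than a prescribed approach.

```python
# A minimal sketch, assuming hypothetical field names and invented records,
# of how the existence and accuracy dimensions might be measured in practice.
import pandas as pd

# Invented policy extract; in practice this would come from the policy
# administration system.
policies = pd.DataFrame({
    "policy_id":     ["P001", "P002", "P003", "P004"],
    "date_of_birth": ["1970-03-01", None, "1985-11-23", "2025-01-01"],
    "sum_assured":   [100_000, 250_000, None, 50_000],
})

# Existence: proportion of required fields that are populated.
required_fields = ["policy_id", "date_of_birth", "sum_assured"]
existence = policies[required_fields].notna().mean()

# Accuracy: proportion of populated values satisfying a simple business rule
# (here, a date of birth must fall before the valuation date). Missing values
# are an existence issue, so they are excluded from this measure.
valuation_date = pd.Timestamp("2008-12-31")
dob = pd.to_datetime(policies["date_of_birth"], errors="coerce").dropna()
accuracy = (dob < valuation_date).mean()

print("Existence by field:")
print(existence)
print(f"Accuracy of date_of_birth rule: {accuracy:.0%}")
```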

For many insurance organisations data quality has long been an issue. Few organisations collect or maintain data that truly and completely meets all business demands and standards. There are a number of reasons for this, including merger and acquisition activity, failure to decommission legacy systems, and heavy use of spreadsheets for financial and actuarial calculations and modelling.

Data quality is often poorest in the operational systems used for individual customer and supplier records and daily transactions. Unfortunately, this is also where the impact of poor data is greatest, as it affects the quality of all downstream business activities, including statutory and regulatory reporting, sales and marketing activities, asset valuations, and management accounting and reporting.

For many companies, data collected for one purpose often ends up being used in many different ways. Consequently, data quality is a complicated and constantly evolving business problem.

Developing the business case for improving data quality

Despite the importance of data quality to a business, it is often hard to articulate the business case for improvement beyond the regulatory requirements. Set out below are some of the reasons why improving the quality of data can be of benefit (including the regulatory concerns).

Regulatory requirements

As part of the build up to Solvency II, industry and regulatory bodies are placing increasing emphasis on data quality. Solvency II will require insurance organisations to demonstrate more rigour in data management, and be able to justify assumptions made within Internal Models.

In its discussion paper DP08/04, The path to Solvency II (September 2008), the FSA suggested that the current quality of data in many UK firms may fall short of both existing and Solvency II standards. This was supported by the CEIOPS stock-taking report on the use of internal models in insurance, which noted that several participants regarded data as one of their most challenging issues and suggested that even in well-developed databases man-made errors are common. In DP08/04 the FSA further recommended that firms review data collection processes, data quality, access, resolution and storage.

We are aware that many companies are carrying out a gap analysis of their current position against the expected Solvency II requirements. Perceived shortfalls would be expected to feed into more granular analysis and development planning.

Business benefits

Addressing data quality issues can improve a variety of business processes. Improving data quality reduces the time it takes to produce analysis and results, whether for statutory or management reporting, capital calculations (such as the ICA), or for investigative or reconciliation work (such as Profit Driver Analysis or Analysis of Change).

Many actuarial and financial reporting departments have concerns over how much time is spent on manually understanding and "cleansing" data. A recent survey by the Data Quality Working Party of the General Insurance Research Organisation suggested that "actuaries working in general insurance spend on average 27% of their time on data quality issues, and that 34% of projects undertaken by them are adversely affected by data quality issues". Significant time savings can be made by improving underlying data quality, processes and controls.

Poor data quality can distort experience analysis, which can lead to inappropriate bases and assumptions being used in valuations. This can result in the misstatement of liabilities and capital requirements. In addition, as products become more tailored to customers and valuation methodologies become more detailed, new data fields may need to be captured to enable accurate policy valuations.
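
As a hypothetical illustration of how incomplete data can distort experience analysis (the figures below are invented for the example, not drawn from this article), consider a lapse investigation in which a portion of the exposure records was lost in a past system migration:

```python
# Hypothetical figures, for illustration only: how missing exposure records
# can bias a lapse-rate assumption derived from experience analysis.
lapses_recorded = 480            # lapse events captured on the admin system
exposure_recorded = 10_000.0     # policy-years of exposure actually recorded

# Assume 5% of exposure records were lost in a past system migration.
exposure_complete = exposure_recorded / 0.95

lapse_rate_observed = lapses_recorded / exposure_recorded   # overstated
lapse_rate_complete = lapses_recorded / exposure_complete   # closer to reality

print(f"Lapse rate from recorded data:   {lapse_rate_observed:.2%}")   # 4.80%
print(f"Lapse rate on complete exposure: {lapse_rate_complete:.2%}")   # 4.56%
```

A basis set on the overstated rate would feed directly into liabilities and capital requirements, which is exactly the kind of misstatement described above.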

Enhancing policy administration systems and processes to capture more granular data allows insurers to more effectively manage the performance of distribution channels and sales agents on a new business and persistency basis. In addition, other business benefits can include faster data migration and consolidation, improved enterprise level reporting and enhanced transparency.

Delivering data for Solvency II

We suggest a two-stage approach to delivering data that meets the required regulatory standards and business requirements, whether for Solvency II or for any other reporting process. Simply put, the approach is first to improve existing data quality and then to establish controls to maintain it.

Improve data quality

A natural first step in improving the quality of data is an initial assessment of the data itself. A potential approach is to profile the key data fields required for regulatory capital computations, management information and other core outputs; data profiling technology can be used for this initial analysis. Once the profiling is complete, an impact analysis could be carried out to estimate, for example, the additional capital the organisation is required to hold because of poor data quality.
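
As a rough sketch of what such profiling might look like in practice (the field names, records and metrics below are hypothetical, and a dedicated profiling tool would go much further), a simple routine could report, for each key field, how fully it is populated and how many distinct values it holds:

```python
# A rough profiling sketch; field names, records and metrics are hypothetical.
import pandas as pd

def profile(df: pd.DataFrame, key_fields: list) -> pd.DataFrame:
    """Report simple population and distinct-value metrics for selected fields."""
    rows = []
    for field in key_fields:
        series = df[field]
        rows.append({
            "field": field,
            "populated_pct": round(series.notna().mean() * 100, 1),
            "missing": int(series.isna().sum()),
            "distinct_values": series.nunique(dropna=True),
        })
    return pd.DataFrame(rows)

# Invented extract of fields that might feed a capital calculation.
extract = pd.DataFrame({
    "policy_id":      ["P001", "P002", "P003", "P004"],
    "product_code":   ["TERM", "TERM", None, "ANN"],
    "annual_premium": [350.0, 1200.0, 890.0, None],
})

print(profile(extract, ["policy_id", "product_code", "annual_premium"]).to_string(index=False))
```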

The final step in assessing data quality should be to establish how the current position compares to the target position. At this point it is also worthwhile to address the establishment (or improvement) of an appropriate data governance framework, with metrics that can be used to monitor data quality.

The data issues identified by the quality assessment can then be remediated to make the data fit for purpose. The data quality assessment will act as an enabler to the remediation phase as it will identify the deficiencies in the data and therefore define where the remediation action is required.

Maintain data quality

Controls are vital both to Solvency II compliance and to maintaining ongoing data quality. An initial assessment of controls is needed to evaluate the level of internal control, monitoring and governance in place over the systems supporting current and future Solvency II data processes and calculations. Controls should be considered holistically, i.e. at the people, process and system level. Having completed the assessment, new controls may need to be implemented and others may need remediation or retirement.

The controls assessment will also provide guidance on the most common risks associated with holding and moving data, including completeness, verification, accuracy, validity and cut-off (timeliness).
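
As a hedged sketch of how some of these control categories could be automated as recurring checks each reporting period (the field names, rules and data below are hypothetical), a simple routine might flag completeness, validity and cut-off exceptions on a reporting extract:

```python
# A sketch of recurring data controls; field names, rules and data are hypothetical.
import pandas as pd

def run_controls(extract: pd.DataFrame, reporting_date: pd.Timestamp) -> dict:
    """Run simple completeness, validity and cut-off checks; return pass/fail flags."""
    results = {}

    # Completeness: every record must carry a policy identifier and a premium.
    results["completeness"] = bool(
        extract[["policy_id", "annual_premium"]].notna().all().all()
    )

    # Validity: premiums must be positive.
    results["validity"] = bool((extract["annual_premium"].dropna() > 0).all())

    # Cut-off (timeliness): no transactions dated after the reporting date.
    txn_dates = pd.to_datetime(extract["last_transaction_date"], errors="coerce")
    results["cut_off"] = bool((txn_dates.dropna() <= reporting_date).all())

    return results

# Invented reporting extract; the cut-off check fails because one transaction
# is dated after the reporting date.
extract = pd.DataFrame({
    "policy_id": ["P001", "P002"],
    "annual_premium": [350.0, 1200.0],
    "last_transaction_date": ["2008-12-15", "2009-01-05"],
})
print(run_controls(extract, pd.Timestamp("2008-12-31")))
```

In practice a failed flag of this kind would be investigated and remediated before results are signed off, and the same checks would be re-run each reporting period.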

Conclusion

The successful implementation of Solvency II poses numerous challenges to insurers, and data quality is prominent among them.

Insurers are also operating in an uncertain economic environment, where the efficient and effective management of their financial position is vital. Assurance activity often focuses on regulation and risk modelling, but issues tend to arise from a lack of governance and control and from poor system and data integration. Data quality issues can lead directly to inaccurate valuations, to increases in capital requirements, and to inefficiencies in business processes.

Thus, there are significant business benefits to the enhancement of data quality, in addition to the regulatory drivers presented by Solvency II.

In light of the continued focus on data quality by the industry we hope this article will be of interest and benefit.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
