On 9 February 2016, the PRA published a set of findings from their data review as part of firms' internal model approval process (IMAP) submissions1.

This report builds on the interim report issued in 2012 and includes analysis of over 50 firms in the IMAP.

Firms still going through the IMAP submission should use the findings from the PRA report as a benchmark when embedding activities to comply with the data quality requirements under Solvency II2. More broadly, all firms can use these findings as a reference point to improve their wider data governance framework, not only to support regulatory compliance but also as a basis for good practice in a world of big data and analytics.

In summary, the paper highlighted 10 areas where the PRA considered firms still faced challenges in assessing and controlling the quality of the data supporting the internal model.

It was noted that since 2012 firms have made gradual progress in embedding data governance into business as usual. In addition, most firms have resolved issues around the complex IT systems supporting data governance, and standard impact and risk assessment practices are in place and operational.

The following are some areas that firms should consider:

  1. Confirm the types of data that are in scope for data quality reporting: Firms had an increased focus on exposure data (e.g. policy and asset data) rather than valuation or risk data (e.g. spread movements, price history and stresses). The PRA consider that elements of risk data, and of the information used for valuation, form a more significant part of the internal model than the exposure data alone. Our experience bears out the focus the PRA have placed on risk data and its application.
  2. Use a data directory as a tool to support data governance: Across all Solvency II processes, a data directory should act as a source of control information for all data quality processes. Firms struggled to define or use the data directory in the manner intended for Solvency II. A functional data directory should provide, for each in-scope process, the data used, its source, how it is used and additional data characteristics. In our experience, organisations often incorrectly use a data dictionary as a data directory; a data directory contains more than a detailed breakdown of all in-scope attributes (a minimal illustration of the distinction is sketched after this list).
  3. Assess data ownership across the end-to-end data process: This has remained a challenge for many firms; without appropriate identification and engagement of owners from source through to usage, the data governance operating model will not be effective. Consistent application of data management processes can only succeed when owners have a broad understanding of the data flow and an actionable process for quantifying errors and escalating them for timely resolution.
  4. Confirm assumptions around materiality: Approaches varied across firms, and the identification of materiality remained a challenge. In line with the PRA's findings, many of the firms we have engaged with have indicated that the difficulty arises when changes in business model or risk profile increase the materiality of a product in the future. There are also challenges when data undergoes transformation, calibration and aggregation, as individually immaterial data errors can compound into a material impact. Firms should use more advanced techniques for stressing specific data items to identify their materiality and to inform the application of controls and process improvements (see the stressing sketch after this list).
  5. Re-assess the data flows and ensure that all key transformation steps and associated controls are correctly identified: Firms provided data flows that showed an incomplete logical flow of the various data calculations and excluded steps around key assumptions and the joining of data sets. An incomplete data flow can lead to inconsistent identification and design of appropriate data quality controls, and it can also affect the scoping and testing of data audits. Another key challenge around data quality controls remains: firms struggled to provide the reasoning for, and evidence of, the required controls. While firms were able to provide evidence of a control being in operation (e.g. a four-eyes check or exception reporting), they were not able to show the end-to-end process flow, the associated risk and impact, and the required control characteristics. Finally, the PRA also noted that additional steps can be taken to improve the data quality controls framework in areas such as appropriateness and third-party data. Banking approaches to data quality can provide significant benefits to insurers: the implementation of Basel Pillar III required banks to build an enhanced data quality framework from which to evidence data governance. Additional dimensions such as uniqueness and validity can complement the existing Solvency II dimensions of accuracy, completeness and appropriateness (see the final sketch after this list).
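
To make the distinction in point 2 concrete, the sketch below contrasts a process-level data directory entry with an attribute-level data dictionary entry. This is our own illustration, not a PRA or Solvency II template, and all process, system and field names are hypothetical.

```python
# Illustrative sketch: a data directory holds process-level control information,
# while a data dictionary holds attribute-level detail. Names are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataDirectoryEntry:
    """Process-level control information: one row per in-scope process or feed."""
    process: str            # the Solvency II process that consumes the data
    data_set: str           # the feed or data set supplied to that process
    source_system: str      # where the data originates
    owner: str              # accountable data owner
    usage: str              # how the internal model uses the data
    materiality: str        # e.g. "high" / "medium" / "low"
    controls: List[str] = field(default_factory=list)  # controls applied along the flow

@dataclass
class DataDictionaryEntry:
    """Attribute-level detail: one row per in-scope data attribute."""
    attribute: str
    definition: str
    data_type: str
    allowed_values: str
    quality_dimensions: List[str] = field(default_factory=list)

# Hypothetical entries showing the different levels of granularity.
directory_row = DataDirectoryEntry(
    process="Catastrophe modelling",
    data_set="Property exposure feed",
    source_system="PolicyAdminDB",
    owner="Head of Exposure Management",
    usage="Input to gross loss simulation in the internal model",
    materiality="high",
    controls=["completeness reconciliation", "four-eyes sign-off"],
)
dictionary_row = DataDictionaryEntry(
    attribute="sum_insured",
    definition="Total sum insured per policy, in GBP",
    data_type="decimal(15,2)",
    allowed_values=">= 0",
    quality_dimensions=["accuracy", "completeness"],
)

print(directory_row.process, "->", directory_row.controls)
print(dictionary_row.attribute, "->", dictionary_row.allowed_values)
```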
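
On point 4, the following toy sketch shows one way a firm might stress a specific data item to gauge its materiality: shock one field, recompute a simplified proxy metric and compare. In practice the impact would be measured against the actual internal model output; the toy_capital_metric function and the sample data are purely illustrative assumptions.

```python
# Toy illustration of stressing a data item to assess its materiality.
import pandas as pd

def toy_capital_metric(policies: pd.DataFrame) -> float:
    """Stand-in for a model output; a real assessment would re-run the internal model."""
    # Hypothetical proxy: expected loss = sum insured * assumed loss rate.
    return float((policies["sum_insured"] * policies["loss_rate"]).sum())

def stress_data_item(policies: pd.DataFrame, column: str, shock: float) -> float:
    """Return the relative change in the metric when `column` is shocked by `shock`."""
    base = toy_capital_metric(policies)
    stressed = policies.copy()
    stressed[column] = stressed[column] * (1 + shock)
    return (toy_capital_metric(stressed) - base) / base

policies = pd.DataFrame({
    "sum_insured": [250_000, 1_200_000, 75_000],
    "loss_rate":   [0.010, 0.004, 0.020],
})

# A 10% error in sums insured shifts this proxy metric by 10%; the same error in a
# less sensitive field would move it far less, informing where controls are tightened.
print(f"+10% shock to sum_insured: {stress_data_item(policies, 'sum_insured', 0.10):.1%}")
```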
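
Finally, on point 5, the sketch below shows how dimensions such as uniqueness and validity can sit alongside completeness in an automated check over an exposure extract. It is a minimal illustration under assumed column names, not the PRA's or any firm's control framework; scores falling below agreed thresholds would typically feed exception reporting.

```python
# Minimal illustration of data quality checks across several dimensions.
import pandas as pd

def run_quality_checks(exposures: pd.DataFrame) -> dict:
    """Return simple scores (1.0 = fully passing) for a few data quality dimensions."""
    return {
        # Completeness: proportion of records with a sum insured present
        "completeness_sum_insured": 1.0 - exposures["sum_insured"].isna().mean(),
        # Uniqueness: proportion of policy identifiers that are not duplicates
        "uniqueness_policy_id": 1.0 - exposures["policy_id"].duplicated().mean(),
        # Validity: proportion of sums insured that are non-negative (missing counts as failing)
        "validity_sum_insured": (exposures["sum_insured"].fillna(-1) >= 0).mean(),
    }

exposures = pd.DataFrame({
    "policy_id":   ["P001", "P002", "P002", "P003"],
    "sum_insured": [250_000, None, 1_200_000, -50],
})

for check, score in run_quality_checks(exposures).items():
    print(f"{check}: {score:.0%}")  # low scores would be escalated via exception reporting
```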

Next steps

In our view, the findings focused heavily on embedding a data governance process that is practical and works operationally. Most findings related to ownership, communication, issue identification and remediation, materiality and data quality controls, all of which are principles of a standard data governance framework. It is important for firms to treat these findings as a new benchmark for data governance and reflect them in their future data plans.

Footnotes

1 Solvency II: internal model approval process data review findings
( http://www.bankofengland.co.uk/pra/Documents/solvency2/imapdata09022016.pdf)

2 Solvency II Directive, Chapter VI, Sections 2, 4 and 5

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.