It should come as no surprise that data is now considered the number one asset at financial services organisations. Yet most organisations continue to be slaves to their data – pouring vast amounts of resources and labour into structuring and managing an ever-growing volume of information and systems.

A small few, however, have started to rise above the complexity to become true masters of their data and, in doing so, have created a significant competitive advantage in their markets.

The data deluge

Let's face it: data underpins virtually every aspect of the financial services sector. Whether it is regulatory reporting, client onboarding, risk management or profit and loss forecasting, all enterprise processes and activities are reliant on data. No wonder, then, that financial services executives have become increasingly focused on their data management and infrastructure.

Unfortunately, many are fighting an uphill battle. According to most estimates, the quantity of data available to businesses is on track to increase by around 40 percent every year for the foreseeable future. In financial services, a large percentage of this increase has been driven by increased regulatory requirements. At the same time, the growing complexity of financial services organisations, combined with the increasing regulatory reporting burden in most jurisdictions, has only ratcheted up the pressure for organisations to gain greater control and visibility into their data.

Spending lots but getting nowhere

Our experience suggests that few financial services organisations today – large or small – are extracting even a fraction of the potential value of their data. Quite the opposite, in fact; many executives that we talk to suggest they are pouring more resources into data-related activities than ever before, but getting only meagre returns on their investment.

In large part, this is because most financial services organisations are still overly reliant on manual processes and interventions when it comes to collecting, processing and analysing data. This is especially true in the area of compliance, where actionable data tends to sit in unstructured form across a myriad of insufficiently integrated data sources and systems. As a result, many are finding that the increased demand for data skills and services is driving a correlated increase in costs and headcount. They are also finding that throwing more bodies at the problem does nothing to reduce error rates or improve data quality.

Letting value slip away

The cost impact of increased manual activities has, not surprisingly, led most financial services organisations to focus their resources only on the data that offers immediate value. In doing so, they are leaving masses of potentially useful data behind.

Consider this: while a typical International Swaps and Derivatives Association (ISDA) Master Agreement for trade activity tends to contain between 500 and 700 possible data reference elements, most investment banks capture only between 100 and 200 data points. What this means is that every time there is an adverse event in the market (say, a debt downgrade or a change in capital ratios), many of these organisations will need to go back to the source contract to identify and then manually pull the data they need to reassess their exposure – an expensive and time-consuming proposition, indeed.

Data, data everywhere...

Another reason financial services institutions are fighting an uphill battle is that few – if any – are able to achieve a 'single view' of their data across their organisation. In part, this is due to decades of consolidation, mergers and regulatory-driven separations which have left most financial services organisations with a mess of internal systems and data management processes. And, as a result, most financial services organisations are now finding that their data is fractured and stuck in silos, inaccessible to the rest of the organisation.

Data governance, therefore, is also a massive obstacle, particularly within larger, more complex organisations. Thankfully, the past decade has seen this issue rise up the boardroom agenda to the point where we are seeing the emergence of a new corporate role – the chief data officer (CDO) – typically charged with creating an enterprise-wide data strategy, standards and policies. The CDO is expected to be the data champion to align and operationalise this strategy across the organisation, taking into account country-specific business and regulatory requirements for those that are operating in more than one jurisdiction. Yet much more must be done. Few CDOs have the necessary power to force lines of business into sharing their data and, as a result, data continues to be highly fragmented and difficult to access and work with.

Across the sector, the response to this challenge has been to centralise more and more data into (often outsourced) data warehouses. While the centralisation of data is certainly key to improving access and data flexibility, the reality is that this is a massive and continuous undertaking that requires organisations to know exactly how they expect to use their data 5 to 10 years in the future. Given the pace of regulatory change and the innovations only now emerging from new analytics approaches, it would be near impossible for organisations to know what they will need from their data in the future.

The pressure mounts

Everybody knows that the status quo must change. The simple truth is that regulators and watchdogs are starting to demand better, higher-quality reporting from financial institutions, often within much tighter timelines. Some regulators have gone beyond simply reviewing the quality of data in submitted reports and are now starting to circulate rules for how data should be handled within the organisation. Those able to get ahead of the regulator's scrutiny by creating and implementing a transparent and effective approach to data management will surely be better placed to meet shifting regulatory requirements in the future.

Most financial institutions also recognise that they can no longer continue to throw money and resources into fighting a losing battle. So while there is broad recognition that the rigours of requirements such as know your customer (KYC), anti-money laundering (AML) and the Foreign Account Tax Compliance Act (FATCA) are only going to increase with time, most also recognise that the root problem can never be solved just by adding more people or outsourcing more work. Something must change.

A new approach emerges

We believe that the opportunity is already here. Over the past year or so, a new approach to data management and control has emerged that allows organisations to truly become masters of their data.

The idea is actually quite simple: rather than tagging and locking away mountains of data into different systems, organisations are instead starting to use big data technology that can 'crawl' through masses of both structured and unstructured data (such as written contracts, media reports, transactions or market data) right across the organisation to process and pull only the information required – regardless of the format.
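To make the idea concrete, a minimal sketch of this kind of extraction is shown below: pulling named reference elements out of unstructured contract text. The element names and patterns here are purely hypothetical illustrations (real platforms use far richer models and taxonomies than simple regular expressions), and this is not a description of any vendor's actual product.

```python
import re

# Hypothetical patterns for a few contract reference elements.
# Production systems would use trained extraction models, not regexes.
PATTERNS = {
    "governing_law": re.compile(r"governed by the laws of ([A-Za-z ]+?)[.,]"),
    "termination_currency": re.compile(r"Termination Currency[:\s]+([A-Z]{3})"),
    "threshold_amount": re.compile(r"Threshold Amount[:\s]+(USD|EUR|GBP)\s*([\d,]+)"),
}

def extract_elements(document: str) -> dict:
    """Crawl one unstructured document and pull out any matching elements."""
    found = {}
    for name, pattern in PATTERNS.items():
        match = pattern.search(document)
        if match:
            found[name] = " ".join(match.groups())
    return found

sample = (
    "This Agreement shall be governed by the laws of England and Wales. "
    "Termination Currency: USD. Threshold Amount: USD 10,000,000."
)
print(extract_elements(sample))
```

The point of the sketch is the shift in approach: instead of deciding up front which 100 or 200 fields to key into a downstream system, the extraction logic can be pointed at the source documents again whenever a new question arises.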

Ultimately, this should allow organisations to leverage all of their data, no matter where in the organisation (or outside of it) the data resides or originated. Moreover, it also allows real-time access, meaning that organisations always have the most recent data available.

The benefits should be clear. Risk and finance would not disagree on financial results (as both would now be pulling from the same root data sets at the same time). A financial services organisation would not struggle to quantify its exposure to certain risks. And operations would not need to expand headcount or increase spending to respond to regulatory reporting requirements.

Though the current regulatory agenda is preoccupying an outsized share of financial institutions' focus and resources, in due time this will prove backward-looking. Those with a more innovative and competitive view will also recognise the massive upside available to those that are able to master their data in this way. Already, some are starting to use predictive analytics in their operations to reduce trading risk and improve customer interactions. Others are quickly identifying and measuring key lead indicators, uncovering new opportunities to grow their business and portfolios. And many are using this approach to cut across various regulatory reporting requirements by leveraging common data and policies.

Improving results and reducing costs

KPMG's proprietary data solution, for example, leverages big data approaches and KPMG's unique insight and business acumen to offer companies a clear roadmap to lowering costs while realising improvements that meet regulatory and compliance challenges, and support operational efficiencies.

This new solution platform is unlike other regulatory tools because it operates across multiple regulations, meaning that common data and pre-defined regulatory policies, developed in collaboration with KPMG's functional and regulatory subject matter experts, can be leveraged across client data to unleash the inherent cross-regulatory and cross-industry economies of scale in a way that disassociated tools and workflow alone cannot. Today's technology allows organisations to combine data aggregation and search, intelligent data extraction, policy automation and efficient workflow processes with a speed, accuracy, completeness and unit price that would not have been possible just a few years ago.

When applied to areas such as client onboarding (a process that costs most tier 1 banks between US$50 million and US$70 million per year), we can help organisations deliver a more complete, accurate and cost-effective review process, improve the quality of their data and reporting, and reduce the costs of ongoing operations, maintenance and infrastructure.

Time for change

However, we also recognise that no business challenge can be solved by technology alone. Indeed, for financial services organisations to become true masters of their data, they will also need to put significant focus on changing the organisational culture, governance, processes and structure in a way that encourages data-driven decision-making and the sharing of data, not just for satisfying today's regulatory demands, but to position the organisation for the future.

Most importantly, financial services organisations need to recognise that the environment has changed and that doing more of the same will be unsustainable over the long term. Those that are willing and able to take a new approach will rise above the fray to become true data masters. Those that cannot will ultimately find their costs – and complexity – choking their growth.

Clearly, it is time for a new approach.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.