Impact of data exceptions on regulatory reporting

The big assumption

With data sitting at the heart of any regulation, successfully meeting regulatory reporting requirements means having the right data available, in the right format, at the right time. That is a big assumption, given that less than 50% of the data institutions receive is ready to process without intervention. With so much incoming data not fit for purpose, the problem is exacerbated by data fragmentation: a good example is the number of reference data masters firms typically maintain, with 50% running more than 11 and nearly 20% running more than 20. Data quality is therefore a constant challenge, across all firms and functions, for any required reporting.

Data exceptions: the missing link

Sorting out data quality has typically been considered the purview of the back office, but this is changing. A survey conducted in February this year by Adox Research showed that powerful new stakeholders in customer-facing, compliance and other front-line functions are targeting data exceptions as the missing link to better service and reduced regulatory risk.

Respondents pointed to oversight and compliance (covering risk management, regulatory reporting and other control functions) and client-facing functions (onboarding, sales etc.) as the areas of the investment lifecycle where better data exception management would have the most impact.

As the report says, financial data needs to meet several quality criteria before it becomes useful and usable (a minimal sketch of such checks follows this list):

  • Complete and timely
  • Consistent and consensual
  • Accessible formats and feeds
  • Calculation and context
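
As a rough illustration of how these criteria might translate into automated checks, the Python sketch below screens an incoming record and raises a data exception for each criterion it fails. The field names, thresholds and reference master are hypothetical placeholders for the purpose of the example, not values taken from the survey.

    # Minimal sketch: screen an incoming record against the four quality criteria.
    # Field names, thresholds and the reference master are hypothetical placeholders.
    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone

    REQUIRED_FIELDS = {"isin", "price", "currency", "as_of"}   # completeness
    REFERENCE_MASTER = {"US0378331005": "AAPL"}                # consistency source
    ACCEPTED_FORMATS = {"json", "csv"}                         # accessible formats/feeds
    MAX_STALENESS = timedelta(hours=24)                        # timeliness

    @dataclass
    class DataException:
        record_id: str
        criterion: str
        detail: str

    def screen_record(record: dict, feed_format: str) -> list[DataException]:
        """Return the list of data exceptions raised by one incoming record."""
        exceptions: list[DataException] = []
        rec_id = record.get("isin", "<unknown>")

        # 1. Complete and timely
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            exceptions.append(DataException(rec_id, "complete", f"missing fields: {sorted(missing)}"))
        as_of = record.get("as_of")
        if as_of and datetime.now(timezone.utc) - as_of > MAX_STALENESS:
            exceptions.append(DataException(rec_id, "timely", f"data older than {MAX_STALENESS}"))

        # 2. Consistent and consensual: does the identifier agree with the reference master?
        if "isin" in record and record["isin"] not in REFERENCE_MASTER:
            exceptions.append(DataException(rec_id, "consistent", "ISIN not found in reference master"))

        # 3. Accessible formats and feeds
        if feed_format not in ACCEPTED_FORMATS:
            exceptions.append(DataException(rec_id, "accessible", f"unsupported feed format: {feed_format}"))

        # 4. Calculation and context: derived values must be recomputable from their inputs
        if {"price", "quantity", "notional"} <= record.keys():
            if abs(record["price"] * record["quantity"] - record["notional"]) > 0.01:
                exceptions.append(DataException(rec_id, "calculation", "notional does not match price * quantity"))

        return exceptions

    # Example: a record with a stale timestamp and a broken derived value
    record = {
        "isin": "US0378331005",
        "price": 10.0,
        "quantity": 100,
        "notional": 999.0,  # should be 1000.0
        "currency": "USD",
        "as_of": datetime.now(timezone.utc) - timedelta(days=2),
    }
    for exc in screen_record(record, feed_format="json"):
        print(f"{exc.criterion}: {exc.detail}")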

It seems firms are starting to recognise the scale of the problem. Missing or late data is a daily problem for 31% of firms and, while the challenges of data delivery basics are clear, generating derived, manufactured or calculated data is creating by far the highest number of exceptions, with more than a third of firms needing to manually intervene in data for risk, pricing and performance on a daily basis.

So where to start?

Fix the most important data first

Fixing data across firms has typically focussed on reference and related data sets, but the Adox survey shows firms now need to shift that focus to more business-relevant, higher-value data sets such as regulatory data and risk ratings. Firms identified these two areas as the most impacted by low-quality data, and this new focus on high-value, differentiated data sets shows data exception management is no longer a vanilla data problem.

Data exceptions are usually fixed at a functional level rather than across all segments of the trade lifecycle, which means the same exception is likely to be ‘resolved’ multiple times across multiple departments, incurring more cost and more manual intervention. This repeated fixing of the same issue is likely to be a hidden, baked-in cost across the business.
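
One way to picture the fix-once alternative is a shared exception registry keyed on the underlying data defect rather than on the function that spotted it. The sketch below is purely illustrative; the entity, field and function names are invented for the example.

    # Illustrative sketch: a shared registry that keys exceptions on the underlying
    # data defect, so the same issue spotted in onboarding, risk and reporting is
    # logged and fixed once. Names and values are hypothetical.
    import hashlib
    from dataclasses import dataclass, field

    @dataclass
    class ExceptionRegistry:
        _seen: dict[str, list[str]] = field(default_factory=dict)

        @staticmethod
        def _key(entity_id: str, fieldname: str, bad_value: str) -> str:
            # The defect itself identifies the exception, not the reporting function.
            return hashlib.sha256(f"{entity_id}|{fieldname}|{bad_value}".encode()).hexdigest()

        def report(self, function: str, entity_id: str, fieldname: str, bad_value: str) -> bool:
            """Record a sighting; return True only if this defect is new and needs fixing."""
            key = self._key(entity_id, fieldname, bad_value)
            is_new = key not in self._seen
            self._seen.setdefault(key, []).append(function)
            return is_new

    registry = ExceptionRegistry()
    # Three functions trip over the same missing risk rating...
    for func in ("onboarding", "risk", "regulatory_reporting"):
        needs_fix = registry.report(func, entity_id="CPTY-123", fieldname="risk_rating", bad_value="")
        print(f"{func}: {'raise fix' if needs_fix else 'already being fixed'}")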

Functions are still being pressured from on high to reduce costs, and with previous initiatives having already played out, there is little left to shave off. This is where the survey throws up an extraordinary data point: by investing in data exception management, more than a third of respondents would expect to reduce costs by up to 20% across the trade lifecycle, and close to two thirds would expect to save between 2% and 20% per transaction.

While these figures reflect automation across all exception types, data sets and business functions, it remains clear that automating data exception management promises big potential savings.

Time to act

All of the above demonstrates the pervasive impact of data exceptions and the pressing need for a firm data quality foundation. Regulatory reporting is far from exempt and, indeed, addressing that impact seems to be becoming a higher priority.