For over a decade, regulators have highlighted the remediation of data-related issues as a top priority. Scrutiny intensified after the BCBS’s 2023 progress report, which prompted renewed emphasis on the BCBS 239 risk data standards. The PRA, for its part, has cited data risk and regulatory reporting quality among its top priorities every year since 2021, reflecting a persistent industry challenge.
Banks have invested substantial time and money in fixing data issues, yet these costly initiatives often deliver limited progress, which in turn intensifies supervisory scrutiny.
With supervisors expecting boards to take ownership of data-related issues, NEDs will need to be able to demonstrate familiarity with their bank’s data strategy, challenges and progress when engaging with supervisors. NEDs can also play a critical role in setting the tone from the top of the organisation and challenging management.
For NEDs to provide effective challenge and oversight on remediation of data-related issues, they need to have a strong understanding of regulatory expectations and programme objectives related to data remediation and transformation. Our view is that the foundations of a more effective data transformation programme are 1) performing a robust current state assessment; 2) defining an appropriate and sustainable data operating model; and 3) educating the business.
Banks often launch large-scale data remediation programmes without fully understanding or internalising the root causes of underlying problems. An honest, evidence-based assessment of the current data landscape is the foundation for everything that follows, including targeted and effective implementation.
Banks often set a broad transformation vision, trying to fix every data issue at once, but a more effective approach is to be more targeted – focusing on areas where improvements will deliver the greatest business benefit.
A more targeted approach therefore requires identification of the critical processes that rely on data (e.g., regulatory, financial, management reporting) and their ‘critical data elements' (CDEs).
In our experience, definitions of ‘critical’ vary across the banking industry. But as a general principle, data is likely to be critical if it feeds important external outputs, directly informs board-level decisions, or is frequently used. Examples include data feeding directly into regulatory metrics (such as PD, EAD and LGD), and data underpinning reports that support the bank’s regulatory responsibilities (e.g. transaction reporting for market abuse monitoring) or market transparency (e.g. EMIR). Data underpinning key risk indicators included in the bank’s risk appetite framework is also likely to be critical.
Understanding the 'lifecycle' or 'lineage' of these CDEs – their origin, transformation, and use – is vital. This uncovers inefficiencies, duplication, unclear ownership, or lack of a single, authoritative 'golden source' with proper controls.
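To make lineage tangible, here is a minimal sketch of how a CDE and its journey through systems might be recorded as structured metadata. The element, systems, owners and controls named below are illustrative assumptions, not a prescribed standard or any particular bank’s design.

```python
from dataclasses import dataclass, field

@dataclass
class LineageHop:
    """One step in a CDE's journey: the system it passes through and what happens to it."""
    system: str           # e.g. a loan origination platform or risk engine
    transformation: str   # how the value is derived or altered at this hop
    control: str          # the control applied (reconciliation, validation, sign-off)

@dataclass
class CriticalDataElement:
    """Minimal metadata record for a critical data element (CDE)."""
    name: str
    definition: str
    owner: str              # accountable business owner, not just an IT contact
    golden_source: str      # the single authoritative system of record
    feeds: list[str]        # downstream reports / metrics that rely on the element
    lineage: list[LineageHop] = field(default_factory=list)

# Illustrative example: an exposure-at-default input to regulatory reporting.
ead = CriticalDataElement(
    name="exposure_at_default",
    definition="Expected exposure at the point of default, per regulatory definition",
    owner="Head of Credit Risk Reporting",
    golden_source="credit_risk_datamart",
    feeds=["COREP own funds reporting", "internal risk appetite dashboard"],
    lineage=[
        LineageHop("loan_origination_system", "raw facility and drawdown data captured", "input validation"),
        LineageHop("credit_risk_datamart", "aggregated to facility level, CCF applied", "daily reconciliation to GL"),
        LineageHop("regulatory_reporting_engine", "mapped to COREP templates", "four-eyes review before submission"),
    ],
)

# A lineage record like this makes gaps visible: missing owners, unclear golden
# sources, or hops with no control attached.
for hop in ead.lineage:
    print(f"{hop.system}: {hop.transformation} [{hop.control}]")
```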
NEDs, particularly in Board Audit and Risk Committees, should challenge management on the rigour and comprehensiveness of the bank's approach to these assessments, rather than the granular details themselves. Their focus should be on ensuring the methodology is sound and the findings are actionable.
A thorough current state assessment reveals existing data governance strengths and weaknesses, informing the allocation of clear organisational responsibilities and accountability.
While the most appropriate governance model will vary from firm to firm, at a minimum supervisors expect banks to have a central data governance function in place. This is increasingly pushing banks towards either a centralised model (where a single, enterprise-wide function manages all critical risk data) or a hybrid model (where a central function sets standards, taxonomies and governance, and business units execute against those standards).
Centralised and hybrid models both have benefits and drawbacks. Centralised models have benefits in terms of maintaining consistency and data quality, and simplifying lineage and audit processes, but they can be expensive to implement (given fragmented legacy systems) and create bottlenecks. Hybrid models may be easier to scale across complex businesses, but may not achieve the same degree of consistency as a centralised model, and require strong governance to ensure that accountability for data is clear.
Alongside the right governance model, banks need to establish a strong enterprise-wide data quality framework to bring the data operating model to life. This is the mechanism that ensures that data is accurate, complete and reliable across the organisation. A strong framework has several features: it defines principles and minimum expectations for data quality; it assigns clear ownership and accountabilities for data domains and CDEs; it sets out the processes and controls to manage data quality through the lifecycle; and it embeds monitoring, escalation and remediation processes.
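As a minimal sketch of how such a framework can be made operational, the example below expresses two illustrative data quality rules as executable checks, each with a named owner and an escalation threshold. The rule names, thresholds, fields and the pandas dependency are assumptions for illustration only, not a prescribed framework.

```python
from dataclasses import dataclass
from typing import Callable
import pandas as pd

@dataclass
class DataQualityRule:
    """A single data quality control: what is checked, who owns it, and when to escalate."""
    name: str
    dimension: str                            # e.g. completeness, validity, timeliness
    owner: str                                # accountable data owner
    check: Callable[[pd.DataFrame], float]    # returns the pass rate (0.0 - 1.0)
    escalation_threshold: float               # pass rates below this trigger escalation

# Illustrative rules over a hypothetical loan-level extract.
rules = [
    DataQualityRule(
        name="ead_populated",
        dimension="completeness",
        owner="Head of Credit Risk Reporting",
        check=lambda df: df["ead"].notna().mean(),
        escalation_threshold=0.99,
    ),
    DataQualityRule(
        name="pd_within_bounds",
        dimension="validity",
        owner="Head of Credit Risk Reporting",
        check=lambda df: df["pd"].between(0.0, 1.0).mean(),
        escalation_threshold=1.00,
    ),
]

def run_checks(df: pd.DataFrame) -> list[dict]:
    """Run all rules and flag any that breach their escalation threshold."""
    results = []
    for rule in rules:
        pass_rate = rule.check(df)
        results.append({
            "rule": rule.name,
            "dimension": rule.dimension,
            "owner": rule.owner,
            "pass_rate": round(float(pass_rate), 4),
            "escalate": pass_rate < rule.escalation_threshold,
        })
    return results

# Example: a small extract with one missing EAD and one out-of-range PD.
sample = pd.DataFrame({"ead": [100.0, None, 250.0], "pd": [0.02, 0.05, 1.2]})
for result in run_checks(sample):
    print(result)
```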
Whichever governance model banks pursue, supervisors expect strong Board and senior management oversight. The ECB, for example, introduced new requirements in 2023 for SSM banks to submit a Management Report on Data Governance and Data Quality, signed off by at least one member of the management body who has overall responsibility for data governance and submission of data to the supervisor (often, in practice, the CRO and/or the CFO).
Supervisors also have tools at their disposal to hold executives individually accountable for collective failures – through the UK SMCR, for example, or the new accountability regime introduced in the EU through CRD6. These emerging supervisory expectations and enforcement powers make it even more important for NEDs to provide effective challenge and oversight.
Banks need to ensure that, across the business, there is alignment on what good data management looks like, the standards that people should adhere to, how to source new data and how to interact with data management functions.
But equally important is to tell a compelling story about the reasons for the significant investment that improvements to a firm’s data capabilities will require, effectively conveying both the ‘carrot’ and the ‘stick’.
Failing to address data issues has known supervisory consequences. The PRA has regularly used s.166 reviews in recent years. The ECB has been clear that it is willing to impose Periodic Penalty Payments (PPPs) on banks, and will receive powers through CRD6 to impose PPPs on individual executives for failing to discharge their responsibilities. Banks also face direct capital consequences, such as higher capital add-ons (e.g., Risk Management and Governance capital scalar in the UK, or worsening SREP scores and higher Pillar 2 Requirements in the EU).
Some banks have also started to take internal measures to incentivise improvements to data quality – for example, imposing internal capital charges or business restrictions on units with persistent data quality issues, and including data-related metrics in certain employees’ objectives.
But banks also need to tell a positive story internally about the potential business benefits of investment in data.
This includes improved profitability: better data enables banks to price risk more accurately and make better decisions on which customers or segments to target, while reducing the costs associated with manual reconciliation and remediation.
It can also allow banks to improve return on equity through more optimised financial resource consumption. For example, better data on risk parameters can reduce the need for IRB banks to apply a ‘margin of conservatism’ (MoC) to model outputs, mitigating data quality-related model risks and reducing capital consumption. Improved data can also enhance the intrinsic accuracy of risk parameters (PD, LGD), potentially reducing Risk-Weighted Assets (RWAs).
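To illustrate the capital mechanics, the sketch below applies the Basel IRB risk-weight formula for corporate exposures to compare RWAs with and without an illustrative margin of conservatism added to PD. The PD, LGD, MoC and exposure figures are hypothetical, and the scipy dependency is an assumption; this is a rough illustration of the direction of the effect, not a regulatory calculation.

```python
import math
from scipy.stats import norm

def irb_capital_requirement(pd_: float, lgd: float, maturity: float = 2.5) -> float:
    """Basel IRB capital requirement (K) for corporate exposures, per unit of EAD."""
    # Asset correlation, decreasing in PD
    r = (0.12 * (1 - math.exp(-50 * pd_)) / (1 - math.exp(-50))
         + 0.24 * (1 - (1 - math.exp(-50 * pd_)) / (1 - math.exp(-50))))
    # Maturity adjustment
    b = (0.11852 - 0.05478 * math.log(pd_)) ** 2
    # Conditional expected loss at the 99.9th percentile, less expected loss
    k = lgd * (norm.cdf((norm.ppf(pd_) + math.sqrt(r) * norm.ppf(0.999)) / math.sqrt(1 - r)) - pd_)
    return k * (1 + (maturity - 2.5) * b) / (1 - 1.5 * b)

ead = 100_000_000          # illustrative exposure
base_pd, lgd = 0.01, 0.45  # illustrative 'true' risk parameters
moc = 0.005                # illustrative MoC added to PD for data uncertainty

rwa_with_moc = irb_capital_requirement(base_pd + moc, lgd) * 12.5 * ead
rwa_without_moc = irb_capital_requirement(base_pd, lgd) * 12.5 * ead
print(f"RWA with MoC:    {rwa_with_moc:,.0f}")
print(f"RWA without MoC: {rwa_without_moc:,.0f}")
print(f"RWA reduction from better data: {rwa_with_moc - rwa_without_moc:,.0f}")
```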
On liquidity, some banks have had success in developing new metrics to understand how they are allocating liquidity across the group – identifying which business lines are natural ‘consumers’ of liquidity, and which business lines are natural ‘suppliers’, and then using that information to optimise intra-group liquidity transfer and use of external liquidity facilities. Developing the right metrics for this kind of exercise is only possible if the underlying data is readily available and capable of being fed into the metric.
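A simplified sketch of the kind of metric described above: netting off liquidity positions by business line to see which lines supply liquidity to the group and which consume it. The business lines and figures below are hypothetical.

```python
import pandas as pd

# Hypothetical intra-group liquidity positions by business line, in GBP millions
# (positive = supplies liquidity to the group, negative = consumes liquidity).
positions = pd.DataFrame({
    "business_line": ["Retail Deposits", "Corporate Lending", "Markets", "Trade Finance"],
    "net_liquidity": [3_200, -2_100, -650, -300],
})

# Classify each line as a natural supplier or consumer of liquidity.
positions["role"] = positions["net_liquidity"].apply(
    lambda x: "supplier" if x > 0 else "consumer"
)
print(positions.sort_values("net_liquidity", ascending=False))

# The net group position indicates how much external funding is needed (or surplus available).
print("Net group surplus / (shortfall):", positions["net_liquidity"].sum())
```

A metric this simple is only meaningful if the underlying positions can be sourced consistently and on a timely basis across business lines – which is precisely the data capability at issue.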
Furthermore, in a volatile geopolitical and macroeconomic climate, banks with reliable, timely Management Information (MI) can make faster, better-quality decisions and adhere more closely to risk limits. During market turbulence, banks that spend less time reconciling data between teams will be able to give management a clear view of their exposures, improving their chances of an early mover advantage (and reducing the risk of late mover disadvantage).
Supervisors have consistently identified the benefits of improved risk data aggregation. For example, the ECB’s recent targeted review of banks’ lending to SMEs highlighted the correlation between banks’ ability to identify and manage emerging stress in their loan books and the robustness of IT infrastructure, the level of automation they have been able to achieve and their data collection capabilities.
NEDs are well placed to play a role in changing perspectives internally.
Ultimately, for bank Boards to have confidence in the data-driven decisions that they are being asked to take, they need to have well-founded confidence in the quality of the underlying data.
NEDs can exert a positive influence by asking the right questions, but this is only effective if they also possess the necessary skills and knowledge to evaluate the answers they receive.
While some NEDs will have a strong knowledge of data-related issues, others will need to upskill in certain areas. As noted above, areas may include the evolving regulatory landscape around data, the fundamentals of data governance and management, and the strategic value and business impact of data. In addition, having at least a conceptual understanding of some of the technology and data architecture issues that the bank faces (for example, understanding how fragmented, legacy systems hinder data quality) will be invaluable.
The imperative for banks to improve their data capabilities only gets stronger as supervisory pressure intensifies, the risk environment becomes more complex, and the competitive benefits of using data-reliant new technologies become more apparent. NEDs that proactively engage on this topic can help to catalyse both improved alignment with supervisory expectations and medium-term strategic benefits.
Providing constructive review and challenge in the areas explored in this blog is a good starting point for NEDs looking to move the data dial.