The Single Supervisory Mechanism (SSM) - The big data issue | Whitepaper
2014 is a year of change for banking supervision in Europe. The European Central Bank (ECB), national supervisory authorities and banks are busy preparing for the Single Supervisory Mechanism (SSM), which is scheduled to take responsibility for the prudential supervision of banks in the Eurozone from November 2014.
The current focus of preparations is rightly on getting ready for the start of the new supervisory regime. But if the SSM is ultimately to be a success, preparations need to begin now to tackle the strategic challenges that will become increasingly important in the medium term.
We believe that data and analytics (the technological solutions and mathematical techniques that support supervisors in analysing data) should be priorities for strategic investment. They could be the differentiating factors that determine whether or not the SSM delivers on the aspirations that have been set for it: to enhance the quality of supervision and policy making, to identify and implement best practice, and to be ambitious and innovative. Operating on the basis of the status quo could quickly become unsustainable.
This paper is intended to stimulate a debate as to where the ECB in particular might go in tackling these issues over the next few years. It is produced on the basis of Deloitte’s experience helping many banks with a wide range of data challenges, and assisting banking authorities to develop their supervisory capabilities.
Responsibility for establishing the new benchmark for data and analytics standards naturally lies with the ECB and national supervisory authorities, but responsibility for delivering the new standards will be borne jointly by banks and supervisors. It is therefore crucial that banks in the SSM engage with supervisors in the process of developing the route map for data and analytics. While we discuss these issues in the paper with reference to the SSM, the precepts are also relevant to supervisors in other countries (not least because of the prospect of convergence in this area) and to firms that face demands for data from supervisors.
The best outcome will not simply be a re-engineering of technology and practices around the status quo, but could involve a fundamental rethink of the relationship between banks and supervisors. If innovation in data and analytics can be applied to the production, management and interrogation of data, it will enable the balance of supervisory time to shift further towards addressing problems rather than merely identifying them.
Investment in data and analytics could also help supervisors better handle issues stemming from the increased complexity in banking. For example, how much of supervisory stress testing should be done by banks rather than supervisors? How far will ‘modelling’ approaches to capital requirements continue as at present, or will they be supplemented by various floors derived from standard formulae? One could even envisage a position where supervisors take greater ownership of the calculation of risk-weighted assets (RWAs) themselves, for example by gaining direct access to banks’ systems to source ‘raw’ data to double-check calculations, rather than relying solely on banks. In the process, they could address concerns about opacity and complexity, as well as facilitate supervisory challenge. All these options would reflect changes in the costs and limitations of technology and data that in the past have prevented supervisors from using their own data-intensive methods.