
Technology and Digital

Over the past few years, firms have been seeking new ways of creating value, actively leveraging disruptive digital technologies and operating models. The digital era is complex and fast-changing, and it brings a wealth of opportunities; operating in this digital world, however, also presents new challenges. Digital technologies offer opportunities to improve efficiency, quality, customer experience and, ultimately, growth. They also bring a new set of risks, and unlike traditional risks these emerge quickly, differ across technologies and can be hard to identify and control using traditional approaches.

IA can add significant value through involvement at the design stage to ensure that benefits such as cost reduction, improved customer experience, new revenue generation and regulatory compliance are realised. Taking into account the costly technology and digital issues that have arisen across financial services, IA also has a role in ensuring the associated risks are properly understood and managed.

[Figure: our view of the change in IA focus from the prior year to now.]

Cyber and operational resilience

Overview

Given the high-profile cyber incidents across financial services in recent years, coupled with the continuing regulatory interest in this area, we anticipate that demands and requirements on firms around cyber security and cyber resilience will only increase.

The Bank of England, PRA and FCA issued a joint Discussion Paper in 2018, returning the spotlight to operational and cyber resilience. The focus is on the development of a broader framework for firms, enhancing resilience stress testing and establishing strong impact tolerances and performance metrics.

Financial services firms will be expected to set their own resilience tolerances (maximum downtime for instance), in line with a resilience baseline and taking into account inter-connectedness with other financial services firms.

IA's role

IA should apply a risk-lens to the cyber agenda, taking account of regulatory, senior management and board demands on assurance and challenge.

Areas of IA focus should include:

•Governance and senior manager accountability for cyber resilience.

•The extent to which a holistic, enterprise-wide approach to cyber and operational resilience is employed, one that can foster a ‘resilience’ culture in the firm and drive ‘resilience-by-design’, particularly in the technology environment.

•The quality of data-based metrics to monitor disruption, performance against key indicators and tolerances by system/component.
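To make the idea of data-based metrics and tolerances more concrete, the sketch below is a minimal illustration in Python. The system names, downtime figures and tolerance values are entirely hypothetical assumptions; it simply shows one way a firm might compare observed downtime per system against the impact tolerances it has set, the kind of evidence IA could ask to see.

```python
# Illustrative sketch only: hypothetical systems, downtime figures and
# impact tolerances, showing one way to monitor performance against
# resilience tolerances by system/component.

from dataclasses import dataclass

@dataclass
class ResilienceRecord:
    system: str
    observed_downtime_hours: float   # downtime observed during the period
    impact_tolerance_hours: float    # maximum tolerable downtime set by the firm

def breaches(records: list[ResilienceRecord]) -> list[ResilienceRecord]:
    """Return the systems whose observed downtime exceeded their tolerance."""
    return [r for r in records if r.observed_downtime_hours > r.impact_tolerance_hours]

if __name__ == "__main__":
    period_data = [
        ResilienceRecord("payments-gateway", observed_downtime_hours=5.0, impact_tolerance_hours=2.0),
        ResilienceRecord("customer-portal", observed_downtime_hours=0.5, impact_tolerance_hours=4.0),
    ]
    for record in breaches(period_data):
        print(f"Tolerance breached: {record.system} "
              f"({record.observed_downtime_hours}h vs {record.impact_tolerance_hours}h tolerated)")
```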

Disruptive technology and Artificial Intelligence

Overview

Disruptive technology and the era of digitalisation are here to stay. Technological advances and trends in advanced analytics, robotic process automation (RPA) and cognitive intelligence, including Artificial Intelligence (AI), are rapidly transforming firms. They are reshaping business models and enabling innovation, not only in productivity and operational efficiency but also, critically, in the way firms connect with, and offer products and services to, their customers.

Limited availability of the right quality and quantity of data, insufficient understanding of AI’s inherent risks, a firm’s culture and regulation can all act as barriers to widespread adoption of AI in firms.

Although regulation in this area remains in its infancy, there have been some recent developments. The EU Commission has published the steps it is taking towards “building trust in Artificial Intelligence”, including the final Ethics Guidelines for Trustworthy AI, which aim to encourage public and private investment in AI and related technologies whilst managing the risks.

IA's role

The EU Commission’s announcements highlight that the ethical dimension of disruptive new technologies is increasingly becoming a priority and will need to become an integral part of firms’ development.

IA’s role is two-fold. The first is to ensure appropriate controls are being implemented to prevent and detect new and emerging risks. The second is to challenge how the key requirements set out in the Commission’s announcement on trustworthy AI are integrated into both the solutions firms are deploying and those already in use.

Specifically, IA’s role should include consideration of whether relevant policies, structures and frameworks have been updated to incorporate AI-specific considerations, for example firms’ Risk Management Frameworks (RMF), Risk Appetite Frameworks (RAF) and governance structures. IA’s role could also include the review of input data, underlying algorithms and the use of output, and checking whether firms’ proposed AI strategies are aligned with business objectives.
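As a sketch of what a review of input data and the use of output might involve in practice, the Python example below uses hypothetical records, field names and thresholds (none of which come from the source). It profiles missing values in input data and compares outcome rates across two groups against an assumed tolerance, the sort of basic evidence IA might request when reviewing an AI solution.

```python
# Illustrative sketch only: hypothetical records, field names and thresholds.
# Profiles missing input data and compares approval rates between two groups,
# the kind of basic check IA might perform on input data and model output.

from collections import defaultdict

MAX_MISSING_RATE = 0.05      # assumed data-quality appetite
MAX_OUTCOME_GAP = 0.10       # assumed tolerance for outcome-rate differences

def missing_rate(records, field):
    missing = sum(1 for r in records if r.get(field) in (None, ""))
    return missing / len(records)

def outcome_rate_by_group(records, group_field, outcome_field):
    counts, positives = defaultdict(int), defaultdict(int)
    for r in records:
        counts[r[group_field]] += 1
        positives[r[group_field]] += 1 if r[outcome_field] else 0
    return {group: positives[group] / counts[group] for group in counts}

if __name__ == "__main__":
    scored_applications = [
        {"income": 42000, "group": "A", "approved": True},
        {"income": None,  "group": "B", "approved": False},
        {"income": 39000, "group": "B", "approved": False},
        {"income": 51000, "group": "A", "approved": True},
    ]
    if missing_rate(scored_applications, "income") > MAX_MISSING_RATE:
        print("Input data quality: missing-value rate above appetite")
    rates = outcome_rate_by_group(scored_applications, "group", "approved")
    if max(rates.values()) - min(rates.values()) > MAX_OUTCOME_GAP:
        print(f"Output review: outcome rates differ materially across groups: {rates}")
```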

Blockchain

Overview

Blockchain is a digital, decentralised, distributed ledger which cryptographically secures and tracks digital transactions. Benefits include automation, transaction simplification, enhanced transparency in core business functions (for instance supply chain management, back-office operations and compliance) and a reduction in fraud. Blockchain remains in its infancy, but firms are increasingly testing its usage and its capability to support business change.
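As a minimal illustration of how a distributed ledger links and tamper-evidences transactions, the Python sketch below shows the basic hash-chaining idea: each block commits to the previous one via a cryptographic hash, so altering an earlier record invalidates everything that follows. It is a deliberately simplified teaching sketch, not a production Blockchain (no consensus mechanism, digital signatures or peer network).

```python
# Deliberately simplified sketch of the hash-chaining idea behind a Blockchain:
# each block includes the hash of the previous block, so any change to an
# earlier transaction breaks the chain. Not a production implementation.

import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, transactions: list) -> None:
    previous = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": previous, "transactions": transactions})

def chain_is_valid(chain: list) -> bool:
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

if __name__ == "__main__":
    ledger = []
    append_block(ledger, [{"from": "A", "to": "B", "amount": 10}])
    append_block(ledger, [{"from": "B", "to": "C", "amount": 4}])
    print("Valid before tampering:", chain_is_valid(ledger))   # True
    ledger[0]["transactions"][0]["amount"] = 1000               # tamper with history
    print("Valid after tampering: ", chain_is_valid(ledger))   # False
```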

Risks from the corporate use of Blockchain technologies can be:

•Technology and operational risks – similar to those associated with current business processes, but with additional nuances and a different impact or velocity when they materialise. For example, due to inappropriate design or architecture decisions, Blockchain solutions may not be sufficiently distributed or scalable to meet long-term business requirements. In addition, the loss or damage of a user's private key, or a malicious actor gaining access to it, may result in irreversible loss of access to crypto assets.

•Value transfer risks – Blockchain enables peer-to-peer transfer of value (assets, identity or information) without the need for a central intermediary, thereby exposing the interacting parties to new risks that may have been previously managed by central intermediaries.

•Smart contract risks – Risks associated with encoding complex business, financial and legal arrangements on the Blockchain. For example, given the infancy of the technology, smart contracts may not be recognised as legally enforceable by courts of law due to a lack of appropriate precedent.

IA's role

IA’s role will depend on the rate of adoption of Blockchain technology in their respective organisations. To remain value-adding and impactful, IA should understand the specific implementations of Blockchain technology in use, upskill the team to truly understand the emerging and existing risks the technology is susceptible to, and stay ahead of the curve.

IA’s role as a trusted business adviser, anticipating and evaluating newer risks to the organisation, is once again key as organisations continue to evaluate the use of the technology. Moreover, given that Blockchain is still seen as a “black box”, intrinsically complex and high risk, IA should be able to ‘separate facts from fiction’ and provide a clear, objective and timely assessment of the risks posed by Blockchain and of the governance and controls implemented to mitigate them.

Cloud

Overview

Cloud adoption has seen a significant upward trajectory over the past few years, with the global public cloud market projected to grow 63% over the next three years. The transformational benefits, such as flexibility (pay-as-you-go), technological sophistication, cost and tax efficiency, and customer empowerment, are undeniable and have fuelled the exponential rise in the cloud's popularity and successful adoption.

The risks, however, can be significant if not managed carefully, and there have been some notable issues and failures in the news recently. A glitch at a major cloud service provider (CSP) in 2017 caused hundreds of thousands of websites using its services to function badly, or not at all, for a few hours. Lloyd's, the specialist insurance market, estimated in a report published in January 2018 that if an extreme cyber incident took a top cloud provider offline for three to six days, it would cost US businesses around $15 billion.

Despite the expansion and increased maturity of adoption, benefits and technology, using the cloud is not straightforward. Even though a large part of the IT function may be handed over to a CSP, and with it much of the operational burden, users still bear the ultimate risks and responsibility if things go wrong.

IA's role

While security at CSP level has noticeably matured and improved in recent years, a greater risk to security remains within the cloud user organisation's control: for example, the management of access rights and the level of discipline applied to monitoring changes in configuration. As such, the role of IA within these organisations is fundamental, providing assurance over the effectiveness of these controls.
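To picture the configuration-monitoring discipline described above, the Python sketch below compares a deployed configuration against an approved baseline and flags drift. The setting names and values are hypothetical assumptions and the example is not tied to any real cloud provider's API; it simply shows the kind of repeatable evidence IA might expect a cloud control owner to produce.

```python
# Illustrative sketch only: hypothetical configuration keys and values,
# not tied to any real cloud provider's API. Flags differences between an
# approved baseline configuration and the configuration actually deployed.

APPROVED_BASELINE = {
    "storage.public_access": False,
    "storage.encryption_at_rest": True,
    "admin.mfa_required": True,
}

def configuration_drift(deployed: dict) -> dict:
    """Return settings whose deployed value differs from the approved baseline."""
    return {
        key: {"approved": approved, "deployed": deployed.get(key)}
        for key, approved in APPROVED_BASELINE.items()
        if deployed.get(key) != approved
    }

if __name__ == "__main__":
    deployed_config = {
        "storage.public_access": True,       # drifted: storage opened to the public
        "storage.encryption_at_rest": True,
        "admin.mfa_required": True,
    }
    for setting, detail in configuration_drift(deployed_config).items():
        print(f"Drift detected on {setting}: {detail}")
```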

IA should consider the following in the scope of their review activities:

•Cloud governance and strategy.

•Cloud insider risk, communications and people retention post-cloud deployment.

•Complexity of technology integration with legacy platforms, deployment impact across the technology estate, project assurance over transformational or integration initiatives and security controls.

•Data privacy considerations, including the physical location of data as well as broader operational and compliance risks.

•Implications of GDPR and compliance with other national laws.

Data privacy and GDPR

Overview

The General Data Protection Regulation (GDPR) came into force in May 2018. The regulation carries significant regulatory fines for breaches of up to 4% of global annual turnover or €20 million, whichever is greater. Significant fines have already been levied by the Information Commissioner’s Office (the UK supervisory authority).

IA's role

IA should consider the following in their scope:

•Assess the implemented data protection policies, procedures and controls to comply with GDPR.

•Evaluate the effectiveness of the implemented framework and controls in response to the GDPR requirements.

•Deeper focus on high-risk functions (e.g. incident management, marketing and any other functions handling large amounts of personal data).

•Assess the accountability framework and data processing taking place abroad.
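As one way of operationalising the last two points, the Python sketch below scans a hypothetical record-of-processing register and flags entries IA might prioritise for deeper review: large volumes of personal data and processing taking place abroad. The register fields, threshold and locations are assumptions for illustration, not a prescribed approach.

```python
# Illustrative sketch only: a hypothetical record-of-processing register.
# Flags entries IA might prioritise: large volumes of personal data and
# processing taking place outside the UK/EEA.

LARGE_VOLUME_THRESHOLD = 100_000   # assumed threshold for "large amounts" of records

register = [
    {"function": "marketing", "personal_records": 250_000, "processing_location": "US"},
    {"function": "incident management", "personal_records": 12_000, "processing_location": "UK"},
]

def high_risk_entries(entries):
    for entry in entries:
        large_volume = entry["personal_records"] >= LARGE_VOLUME_THRESHOLD
        processed_abroad = entry["processing_location"] not in {"UK", "EEA"}
        if large_volume or processed_abroad:
            yield entry, {"large_volume": large_volume, "processed_abroad": processed_abroad}

if __name__ == "__main__":
    for entry, reasons in high_risk_entries(register):
        print(f"Prioritise {entry['function']}: {reasons}")
```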
