
Algorithms and the CMA

A call for action

In recent years, firms have increasingly adopted and embedded algorithms into their operating models. This innovation has allowed companies to differentiate themselves in the market and improve the efficiency with which they provide goods and services to customers. Algorithms now drive a whole host of applications, from targeting consumers with bespoke advertisements, through product recommendations, to filtering the content we see online (to name just a few). With these algorithms capable of processing vastly more data than before, and of adapting to each unique interaction with an individual customer, the opportunities for businesses to capitalise and gain a competitive advantage in the market have never been greater.

Although there has been a shift in the types of platforms on which businesses and consumers interact with one another, the Competition and Markets Authority (‘CMA’) has pointed out that the digitalisation of the marketplace diminishes neither the presence of anticompetitive behaviour among market participants nor the risk of harm to consumers. Recognising that algorithmic systems are a relatively recent addition to business operating practices, the CMA argues that current and future legislation needs provisions that reflect and govern the particular risks algorithms pose to consumers and businesses alike.

In its most recent review of the impact of algorithms on fair market operation, the CMA has identified several potential harms to competition and consumers arising from the misuse or malfunction of algorithmic systems. The CMA highlights discrimination and targeted personalisation as sources of particular harm to consumers, whilst exclusionary practices and the possibility that algorithms may learn to collude on their own are identified as major concerns for the integrity of markets. It has always been the role of the CMA to ensure market competitiveness and protect consumers from harm; however, the new challenges posed by algorithmic systems call for a review and update of regulation in the digital marketplace.

This blog summarises the findings of the CMA on the impact of algorithms on market integrity and competitiveness – and then sets out what this means for your firm.

The case for intervention and the role of regulators

In all areas identified by the CMA as potential risk factors, a lack of transparency towards users and the wider marketplace about an algorithm’s function and purpose is deemed the greatest cause for concern. A particular benefit of algorithmic systems to businesses is that they are able to collect vast quantities of data from each user interaction, often without the user’s full awareness. Contextual data such as user location, search history, network, and even battery health are all assimilated for the purpose of tailoring the user experience to each individual customer. However, the CMA has recognised that, at times, there is a fine line between an enhanced user experience and overall harm to consumers resulting from the opaque use of data.

The CMA has raised further concerns over the ability of businesses to use complex algorithmic systems to facilitate anticompetitive practices across digital platforms, practices which would be more easily identifiable if attempted in the traditional bricks-and-mortar marketplace. High-speed price comparisons paired with individually targeted price discrimination allow businesses to retain market share whilst side-lining competitors. Whilst consumers may see a direct benefit in the first instance through lower prices, they are indirectly harmed by a reduction in overall market competitiveness and choice. The current regulatory framework requires updates in order to continue to mitigate these risks.

In fact, the use of Machine Learning (‘ML’) by these algorithms to ‘understand’ consumers’ behaviour potentially poses the biggest threat of all. Algorithms that learn from interactions with consumers and other businesses in the digital marketplace, rather than from manual oversight and input, could evolve far beyond their original design or intention. These systems, although initially programmed for a particular use, may over time learn to operate to the detriment of all market participants, without the firm being aware of this or able to prevent it.

What is the CMA proposing?

The CMA is calling for greater transparency in all areas where algorithms either enable or dictate business or consumer interactions. Just as the disclosure of financial statements helps stakeholders understand a firm’s finances, there is a strong argument that similar external auditing of algorithmic practices would help stakeholders understand how algorithms function. The CMA is therefore proposing that, in some cases, algorithms may need to be audited by expert third parties or regulators.

Our experience, born of many years of work in financial services, points to five main ‘pillars’ upon which regulators often look to build their regulation of algorithms:

  • Governance
  • Testing & Development
  • Algorithm Controls
  • Monitoring
  • Documentation

These five broad areas reflect the different aspects and requirements of algorithm regulation. Under the proposals outlined in its paper, the CMA has identified Monitoring and Documentation as two of these key areas, and this is where greater regulation is most likely. Whilst suggestions for Governance-style changes are also made, the CMA’s choice to focus on Monitoring and Documentation reinforces the overall narrative that greater transparency is needed in the functioning of competitive markets.

Firms will be expected to monitor their own algorithms within the marketplace - and to file self-assessments of the risks of harm posed by their activities. The CMA may be drawing inspiration from existing requirements under MiFID II here, hoping that regular self-assessments and other risk management controls will reinforce firms’ ownership of the risks that their algorithmic systems pose.

Furthermore, from April 2021 the CMA will establish the Digital Markets Unit (‘DMU’) as a specific vehicle through which it will monitor and regulate the digital marketplace, concluding that such a unit is necessary for the good governance and effective pro-competitive functioning of those marketplaces in which algorithmic systems play a key role.

What this means for your firm

With the CMA’s focus very much on increasing the transparency of algorithm operation within the digital marketplace, we see this translating into greater ownership by firms of the actions of their algorithmic systems. The CMA has made it clear that simply being able to explain the intention of an algorithm is insufficient when Machine Learning techniques are capable of adapting to the unknown actions of other stakeholders. Firms will likely be required to demonstrate their understanding of the algorithmic system, detailing this within self-assessments or at the demand of the regulator, even for more complex ‘black-box’ systems. Firms may also be asked to develop algorithm codes of conduct, document the intention of their algorithmic systems, and define parameters for acceptable interaction with consumers.
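To make this concrete, the sketch below shows one possible shape for such a documentation record, capturing an algorithm’s stated intent alongside parameters for acceptable consumer interaction. The field names, thresholds and example values are our own illustrative assumptions, not requirements set out by the CMA.

```python
# Illustrative sketch only: the field names and example values below are
# our own assumptions, not requirements prescribed by the CMA.
from dataclasses import dataclass, field


@dataclass
class AlgorithmConductRecord:
    """Documents an algorithm's stated intent and the parameters within
    which it is permitted to interact with consumers."""
    name: str
    stated_intent: str                 # what the algorithm is designed to do
    uses_personal_data: bool           # whether personal data feeds the system
    max_personal_price_uplift: float   # e.g. 0.0 = no individual pays above list price
    prohibited_behaviours: list[str] = field(default_factory=list)


# Example record for a hypothetical pricing algorithm.
record = AlgorithmConductRecord(
    name="dynamic_pricing_v2",
    stated_intent="Adjust list prices in response to demand; never price per individual.",
    uses_personal_data=False,
    max_personal_price_uplift=0.0,
    prohibited_behaviours=[
        "individual price discrimination",
        "signalling prices to competitors",
    ],
)
```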

Many firms may not have given much thought to managing the business risks posed by their algorithms beyond high-level business continuity issues. Regulatory intervention from the CMA has now put an onus on firms to start paying attention to their algorithm risks – and how they manage them.

This is a complex area that will require much work from firms. However, we believe there are certain ‘no regret’ steps that firms should be taking to start managing algorithm risks. In particular, firms can define and identify their algorithms within an algorithm inventory – a tool which stores information about each algorithm, including who owns it, how it works, what data inputs it uses (and whether any of these involve personal data), and when it was last updated. Armed with this information, it is then possible to define an approach to risk-rating algorithms and to consider some of the potential harms that concern management and regulators – a potentially significant task (see the sketch below for one possible starting point).

This algorithm risk assessment can be used to identify existing controls, which may be leveraged or re-designed to mitigate risks of concern, and to identify opportunities to establish new governance arrangements, controls or processes. As part of this work, firms should consider their current processes for testing algorithms on both a pre-deployment and post-deployment basis, also capturing changes to algorithms that are already live. Firms should establish policies and procedures to govern this testing across a range of risk types, as well as define a broader documentation set that builds greater transparency into the process and demonstrates how the firm has considered and responded to the potential impacts of its algorithms on customers and markets.
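As a starting point, the inventory itself need not be complex. The following is a minimal sketch of one possible shape for an inventory entry together with a crude first-pass risk-rating rule; the fields, scoring weights and thresholds are illustrative assumptions rather than a prescribed methodology.

```python
# Illustrative sketch of an algorithm inventory entry and a simple
# first-pass risk-rating rule. Field names, weights and thresholds are
# assumptions for illustration, not a prescribed methodology.
from dataclasses import dataclass
from datetime import date


@dataclass
class InventoryEntry:
    name: str
    owner: str                 # accountable individual or team
    description: str           # how the algorithm works, in plain language
    data_inputs: list[str]     # input types feeding the algorithm
    uses_personal_data: bool   # does it consume any personal data?
    self_learning: bool        # does it adapt without manual sign-off?
    customer_facing: bool      # does it price, rank or target customers?
    last_updated: date


def risk_rating(entry: InventoryEntry) -> str:
    """Crude first-pass rating: the greater the potential for consumer
    or market harm, the higher the score."""
    score = 0
    score += 2 if entry.uses_personal_data else 0
    score += 2 if entry.self_learning else 0
    score += 1 if entry.customer_facing else 0
    return "high" if score >= 4 else "medium" if score >= 2 else "low"


entry = InventoryEntry(
    name="recommendation_engine",
    owner="Digital Products team",
    description="Ranks products per session using purchase history.",
    data_inputs=["purchase history", "session clicks"],
    uses_personal_data=True,
    self_learning=True,
    customer_facing=True,
    last_updated=date(2021, 1, 15),
)
print(entry.name, "->", risk_rating(entry))  # recommendation_engine -> high
```

Even a simple rating rule of this kind gives a defensible basis for prioritising which algorithms receive closer review first, and can be refined as the firm’s risk taxonomy matures.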

Conclusion and next steps

The approach being considered by the CMA indicates an industry-wide need for regulatory strengthening where algorithmic systems are present. Applying current regulatory frameworks to such systems is viewed as an insufficient response not only to the benefits of, but also to the challenges resulting from, a digitalised marketplace. The introduction of the DMU, which is to be conferred the power to ‘suspend, block and reverse’ anticompetitive and consumer-harming behaviour by algorithmic systems, certainly signals a proactive regulatory intention on the part of the CMA for the digital marketplace. Our team are happy to have a conversation with you and your firm about the CMA proposals. For more information, take a look at our Algorithm Assurance Insights Pack and see below for how to get in touch with our experts.
