There are four broad categories of company to which the DSA may apply, namely VLOPs and VLOSEs, online platforms, hosting services and intermediary services, as shown in Figure 1 below:
Figure 1: Providers within scope of the DSA
VLOPs and VLOSEs
The European Commission designated the first 17 VLOPs and 2 VLOSEs (those with over 45 million users in the EU) in April 2023, with the applicable rules coming into effect in August 2023. With three more VLOPs designated in December 2023, a total of 22 VLOPs and VLOSEs have been designated at the time of writing. A number of DSA requirements, for example in relation to systemic risk assessments, vetted researcher access and independent audits, apply solely to VLOPs and VLOSEs.
Online Platforms, Hosting Services & Intermediary services
The primary focus of this blog is on the broader set of DSA rules that apply to VLOPs and VLOSEs as well as (in terms of size) the next tiers of online platforms (such as online marketplaces and social media platforms), hosting services (such as cloud and web hosting services) and intermediary services (such as internet access providers and domain name registrars).
To fall within scope, these companies must not qualify as micro or small enterprises; broadly, this means having at least 50 employees or an annual EU turnover exceeding EUR 10 million. A number of different regulatory obligations now apply to these companies as of 17 February 2024. For the purposes of this blog we focus on content moderation & associated transparency measures and cooperation with trusted flaggers. We consider the experiences of VLOPs and VLOSEs in these areas to be equally applicable to the next tiers of regulated entities. A minimal sketch of this size-based scoping logic follows.
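To illustrate the scoping logic described above, the sketch below encodes the size-based exemption in Python. It is a simplification for illustration only: the function and thresholds are our own shorthand, and real scope assessments turn on the full legal definition of micro and small enterprises.

```python
# Illustrative sketch of the DSA's size-based exemption: micro and small
# enterprises fall outside these obligations (unless designated as a
# VLOP/VLOSE). The function and thresholds are a simplification, not a
# substitute for legal analysis of the SME definition.

def is_exempt_small_enterprise(employees: int, annual_turnover_eur: float) -> bool:
    """A company is exempt only if it stays below BOTH thresholds;
    exceeding either one brings it into scope."""
    return employees < 50 and annual_turnover_eur <= 10_000_000

print(is_exempt_small_enterprise(40, 8_000_000))    # True: out of scope
print(is_exempt_small_enterprise(120, 8_000_000))   # False: in scope
```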
Content moderation & associated transparency measures
Broadly speaking, content moderation refers to the process of removing or reducing the visibility of potentially harmful online content. A stylised content moderation process is shown in Figure 2 below.
Figure 2: A stylised content moderation process (source: Ofcom)
The key steps of this stylised process set out by Ofcom are the following (a simplified sketch in code follows the list):
a) Content posting / upload: a user submits a content item (e.g. video or a piece of text).
b) Database matching: automated systems compare the uploaded content against databases of known violative content and, if a match is found, prevent publication.
c) Publication: content that passes the matching stage is made available to other users, typically seconds after posting.
d) Detection of potentially violative content: this may arise from flagging by AI-based ‘classifiers’ identifying potentially violative items, or reports by end-users or third-party organisations.
e) Review and removal: content flagged by classifiers and/or reported by users as being potentially violative is sent to human moderators for review (unless content is removed automatically, in which case human review may or may not take place).
f) Appeals and restorations: typically, users whose content is removed can submit an appeal. Decisions not to remove an item may also be appealable.
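To make this flow concrete, the sketch below expresses the stylised process as a minimal Python pipeline. Everything in it (the hash database, the classifier stub, the threshold) is an illustrative placeholder rather than any platform's actual implementation.

```python
# A minimal, illustrative content moderation pipeline following the
# stylised process above. Hash matching, classifier scores and the
# threshold are placeholders, not a real platform's implementation.
from dataclasses import dataclass, field
import hashlib

KNOWN_VIOLATIVE_HASHES: set[str] = set()  # database of known violative content (illustrative)
CLASSIFIER_THRESHOLD = 0.9                # illustrative confidence cut-off for flagging

@dataclass
class ContentItem:
    item_id: str
    payload: bytes
    published: bool = False
    flags: list = field(default_factory=list)

def database_match(item: ContentItem) -> bool:
    # (b) compare the upload against hashes of known violative content
    return hashlib.sha256(item.payload).hexdigest() in KNOWN_VIOLATIVE_HASHES

def classifier_score(item: ContentItem) -> float:
    return 0.0  # stand-in for an AI-based classifier

def moderate(item: ContentItem, user_reports: list) -> str:
    if database_match(item):
        return "blocked_at_upload"           # (b) match found: prevent publication
    item.published = True                    # (c) publication
    if classifier_score(item) >= CLASSIFIER_THRESHOLD or user_reports:
        item.flags.append("needs_human_review")
        return "queued_for_review"           # (d)-(e) detection and human review
    return "published"                       # (f) appeals would follow any removal

print(moderate(ContentItem("post-1", b"hello world"), user_reports=[]))  # -> published
```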
The DSA requires a number of key actors in the online ecosystem to address illegal content, setting out obligations to establish mechanisms that allow users to flag illegal content online. In so doing, the DSA requires that providers of online platforms submit to the European Commission, without undue delay, their content removal decisions and the accompanying statements of reasons, for inclusion in a database managed by the Commission. At the time of writing1, 4,680,151,406 statements of reasons have been submitted across 16 active platforms, with ‘scope of platform service’ (i.e. content that violates the platform’s terms of service), ‘Illegal or harmful speech’ and ‘Unsafe and/or illegal products’ being the most reported violations, and 73% of the decisions fully automated.
Meeting this requirement means coordinating across internal teams such as engineering, trust & safety, legal and compliance. Technical implementation is also required, which (depending on the likely volume of data to be shared) may involve API configuration to ensure effective and timely interoperation with the European Commission’s Transparency Database. Given the significant volume of submissions received during the first period of DSA application, we expect these initial experiences to provide scope for further learning. The responsibility rests with platforms within scope to register under the DSA, which then leads to onboarding via the relevant DSC, who will provide access to a sandbox environment to test submissions.2
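As an illustration of what the technical integration might involve, the sketch below submits a statement of reasons over HTTP. The endpoint URL, authentication header and field names are our assumptions for illustration only; the actual schema and endpoints are defined in the Commission’s published API documentation, and submissions should first be tested in the sandbox environment provided during DSC onboarding.

```python
# Hypothetical sketch of submitting a statement of reasons to the
# Commission's Transparency Database. Endpoint, auth mechanism and
# field names are assumptions; consult the official API specification
# and test against the sandbox before going live.
import requests

SANDBOX_URL = "https://transparency-sandbox.example/api/v1/statements"  # hypothetical
API_TOKEN = "..."  # placeholder: issued during onboarding via the relevant DSC

statement_of_reasons = {  # field names are illustrative, not the official schema
    "decision_ground": "ILLEGAL_CONTENT",
    "category": "ILLEGAL_OR_HARMFUL_SPEECH",
    "content_type": "TEXT",
    "automated_decision": True,
    "territorial_scope": ["NL", "IE"],
    "facts_and_circumstances": "User post reported and confirmed as ...",
}

resp = requests.post(
    SANDBOX_URL,
    json=statement_of_reasons,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=10,
)
resp.raise_for_status()
print("Submitted, reference:", resp.json().get("uuid"))
```

For high submission volumes, batching and retry logic around this call would be a natural next step.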
Cooperation with trusted flaggers
The DSA places significant emphasis on companies within scope working with other players in the online ecosystem to ensure a safe online environment. One key player is the trusted flagger: a designated entity (such as an NGO) with demonstrable expertise and competence in reporting illegal content, whose notices platforms must process with priority. Following broader application of the DSA on 17 February 2024, trusted flaggers will now be designated by the DSC relevant to the Member State in which they operate.
From a technical perspective, online companies within scope will be required to validate the credentials of trusted flaggers so that their reports can be prioritised within the platform’s reporting channel. There are also process considerations in the DSA in relation to how a trusted flagger should be treated if it submits a ‘significant number’ of ‘insufficiently precise, inaccurate or inadequately substantiated’ notices for content removal. In that case the online platform must share this information with the relevant DSC, who may open an investigation that could ultimately lead to the entity’s trusted flagger status being revoked.
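The sketch below illustrates what these two obligations might look like operationally: prioritising validated trusted flagger notices, and tracking notice quality so that a pattern of imprecise notices can be escalated to the relevant DSC. The registry contents, the minimum sample size and the escalation rate are all our assumptions; the DSA does not define a numeric threshold for a ‘significant number’.

```python
# Illustrative handling of trusted flagger notices: priority queueing
# plus quality tracking. The registry contents, sample size and the
# escalation rate are assumptions; the DSA sets no numeric threshold.
from collections import defaultdict

TRUSTED_FLAGGER_REGISTRY = {"ngo-123"}  # IDs validated against DSC designations (illustrative)
MIN_SAMPLE = 20                          # hypothetical minimum before judging a pattern
IMPRECISION_RATE_TRIGGER = 0.25          # hypothetical internal escalation trigger

notice_stats = defaultdict(lambda: {"total": 0, "imprecise": 0})

def enqueue_notice(notice: dict, queue: list) -> None:
    # Validated trusted flagger notices jump the queue for priority handling
    if notice["flagger_id"] in TRUSTED_FLAGGER_REGISTRY:
        queue.insert(0, notice)
    else:
        queue.append(notice)

def record_outcome(flagger_id: str, was_imprecise: bool) -> None:
    stats = notice_stats[flagger_id]
    stats["total"] += 1
    stats["imprecise"] += int(was_imprecise)
    if stats["total"] >= MIN_SAMPLE and stats["imprecise"] / stats["total"] >= IMPRECISION_RATE_TRIGGER:
        notify_dsc(flagger_id, stats)  # hypothetical hook for sharing with the relevant DSC

def notify_dsc(flagger_id: str, stats: dict) -> None:
    print(f"Escalating {flagger_id} to DSC: {stats}")
```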
We expect interaction with trusted flaggers to be an area that grows in importance in the coming year now that the DSA is fully applicable. Indeed, the DSA provides for potential regulatory guidance to be provided in this area, which would no doubt be beneficial once a body of emerging practice has been established. In the UK, Ofcom has recently confirmed that it will continue to consider the use of trusted flagger arrangements through dedicated reporting channels (DRCs) across all kinds of illegal harm.
1. Applying the Three Lines of Defence (‘3LOD’) Model
The 3LOD model is agnostic to the underlying subject area, and its principles apply equally to ensuring the robustness of content moderation processes.
In the 3LOD model, the first line (business and operational management) owns and manages risk day to day; the second line (risk management and compliance functions) sets policy, monitors and reports on risk; and the third line (internal audit) provides independent assurance over the first two.
Each line has particular roles. In relation to content moderation, for example: senior management would establish an internal content moderation control system to assure that key controls are operating effectively; risk management and compliance would prepare ongoing risk reporting; and internal audit would provide assurance on the overall effectiveness of the content moderation control system. The 3LOD governance model provides an excellent frame of reference for companies within scope of the DSA to review their content moderation activities.
2. Define roles and responsibilities for regulatory compliance, starting with leaders, and bring everyone along on the journey
In complying with these requirements, it is important to define the role of leadership in reinforcing culture change by setting the right tone and role modelling new behaviours and ways of working. The role of technical product and technology leaders is particularly critical in articulating the value of early collaboration with compliance teams in order to streamline product development. It is also vital to establish an internal communications plan that provides structured, consistent touch points and ensures a two-way dialogue between compliance and business teams to allow for accurate upwards reporting. Training needs assessments should be conducted, covering both the technical subject matter of the specific regulation(s) and foundational skills and understanding of regulatory compliance.
3. Lean on organisational values
Content regulation such as the DSA and the UK Online Safety Act formally requires organisations to navigate fundamental considerations relevant to topics such as freedom of speech and the democratic process. Leaning on organisational values will be critical to informing decision making. Similarly, aligning a company’s response to its values can help articulate the worth of remediation activities to all staff. This will help frame compliance as a natural progression in a company’s journey to becoming a more responsible business, rather than an administrative burden imposed on technical staff.
4. Be mindful of the thresholds
Larger companies should also closely monitor their own proximity to the thresholds for designation as a VLOP or VLOSE, given the obligations in the DSA for such companies to publish information on user numbers every six months and to respond to DSC and European Commission requests for information in this respect. A simple illustration of such monitoring is sketched below.
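The sketch below checks proximity to the 45 million average monthly active user threshold. The early-warning margin is our assumption, and in practice the difficult part is the methodology for counting ‘active recipients’, which this simple average does not capture.

```python
# Illustrative proximity check against the 45 million average monthly
# active recipients threshold for VLOP/VLOSE designation. The warning
# margin is an assumption; the counting methodology is the hard part.
VLOP_THRESHOLD = 45_000_000
WARNING_MARGIN = 0.9  # hypothetical internal early-warning level

def check_vlop_proximity(monthly_active_recipients: list[int]) -> str:
    avg = sum(monthly_active_recipients) / len(monthly_active_recipients)
    if avg >= VLOP_THRESHOLD:
        return "threshold met: notify the Commission; VLOP/VLOSE obligations in view"
    if avg >= WARNING_MARGIN * VLOP_THRESHOLD:
        return "approaching threshold: prepare for possible designation"
    return "below threshold: continue six-monthly publication of user numbers"

# Six months of illustrative average monthly active recipient figures
print(check_vlop_proximity([38_000_000, 39_500_000, 41_000_000,
                            42_200_000, 43_100_000, 43_900_000]))
```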
Background to the new DSC regime
The new DSA regime sets out a critical role for DSCs (who will also form a new European Board for Digital Services), in addition to the new central supervisory role played by the European Commission in respect of VLOPs and VLOSEs. Deloitte has recently published a cross-jurisdictional overview of DSA implementation in nine Member States which provides a number of insights on DSC designation in the countries concerned.
The European Commission maintains a list of confirmed DSCs. At the time of writing, it appears that not all Member States have formally confirmed their DSCs by the required deadline. These authorities are for the most part existing regulatory bodies whose remit has been expanded to cover new obligations under the DSA; in some cases, however, new regulatory bodies have been established. We provide two Member State DSC case studies (one from each of these categories) below, focusing on emerging operational implications of this supervisory regime relevant to the scope of this blog. We then identify initial strategic implications for companies that will arise as a result.
Autoriteit Consument en Markt (‘ACM’), Netherlands
The ACM, already responsible for ensuring fair competition between businesses and protecting consumer interests in the Netherlands, is the DSC with responsibility for ensuring compliance under the DSA. Prior to the application of the DSA in the Netherlands, the ACM published a consultation setting out its proposed approach on topics such as dealing with illegal content, the protection of minors and the prevention of dark patterns.
More broadly, the Chairman of the ACM has recently3 set out his intention to establish a regulatory dialogue with companies and other stakeholders relevant to the DSA (such as trusted flaggers and vetted researchers), articulating an ambition to develop a new supervisory approach to regulation. Relevant to the topics highlighted above, the ACM Chairman noted the authority’s role in certifying trusted flaggers in the Netherlands (anticipating around 15 certifications in the first year) and said that the ACM aims to have a 50-person team in place this year to handle DSA-based complaints, a volume it expects to increase significantly.
Coimisiún na Meán (‘CNaM’), Ireland
In Ireland, the CNaM, the country’s recently established media commission, is responsible for ensuring a safe online environment, a remit that includes its role as DSC under the DSA.
The CNaM recently closed its consultation on a draft Code on Online Safety relevant to video-sharing platforms, an important part of Ireland’s broader online safety regulatory framework. Amongst other things, the draft code highlighted that video-sharing platform service providers should reasonably prioritise notifications they receive from trusted flaggers, by integrating these with the mechanisms provided for notifying content that is otherwise in breach of the terms and conditions of the service, and the mechanism for notifying content which is illegal under the DSA.
The CNaM has confirmed that on 19 February it opened a contact centre to provide advice to users on their rights under the Digital Services Act and gather intelligence that will inform the CNaM’s supervisory and enforcement activities. The CNaM also emphasised that when people spot illegal content, they should flag it, so the platform can stop it.
1. Prepare for a new supervisory relationship with the designated DSC
A new relationship with the applicable national DSC will need to be cultivated. As is clear from public statements made by DSC leads to date, this is expected to be an ongoing, supervisory relationship. Companies at Member State level should prepare for this dialogue, looking out for updates and guidance from their designated DSC.
2. Put in place the necessary risk control frameworks to demonstrate compliance with these new requirements
There are a number of differences in terms of the requirements that apply to the largest EU companies within scope of the DSA (e.g., independent audit, access to data for researchers) and the rest of the market. Even so, we consider it consistent with established good practice for the next tiers of regulated companies to put in place appropriate risk control frameworks to ensure and demonstrate compliance with applicable requirements relevant to their specific activities. Such processes can then be shared in the event of an information request by the relevant DSC.
3. Retain an agile approach so that processes can be updated in light of experience
A number of these new requirements – such as the approach highlighted above in relation to the appropriate grounds for rejecting a notice from a trusted flagger – will likely benefit from further guidance as the regime matures. Affected companies should therefore keep their processes under review and update them as guidance and practice evolve.
This is a landmark new regulatory regime. As can be seen from the existing DSC gaps in certain Member States, some parts of it are still being put together. That said, we expect significant progress to be made on implementation during the year, in particular in relation to the execution of the new tasks (at both European Commission and DSC level) and cooperation with the third-party bodies, such as trusted flaggers, that are an integral part of the compliance framework.
Companies within scope should ensure they are already putting in place the processes needed to comply with the new regime, introducing control frameworks and implementing the necessary technical configurations. Broadly speaking, the newly appointed DSCs are expected to develop a new, ongoing ‘supervisory’ regime in respect of the many online companies now within scope. Affected companies should bear this in mind as they develop their compliance strategies going forward.
____________________________________________________________
1 Data from the DSA Transparency Database (europa.eu), as viewed on 20 February 2024.
2 On a related point, a Delegated Act is expected in Q1 2024 on how affected companies should publish a transparency report on the content moderation in which they engage.
3 Keynote by Martijn Snoep, Chairman, The Netherlands Authority for Consumers and Markets (ACM), CERRE Digital Platforms Summit, 17 January 2024.
Simone Pelkmans, Partner, Deloitte Netherlands
Colm McDonnell, Partner, Deloitte Ireland
Robert is a Director in Deloitte's EMEA Centre for Regulatory Strategy, where he leads the Centre’s work on regulation in Digital Markets. Prior to joining Deloitte, Robert spent eleven years at Vodafone Group, setting Group policy positions across a wide variety of regulatory initiatives relevant to the promotion of competition and protection of consumers in digital markets. Robert has over a decade's experience working at regulatory bodies relevant to the sector, spending eight years at Ofcom (and its predecessor Oftel) and four years at the UK's competition and consumer protection authority. This included a secondment to the US Federal Trade Commission working on technology topics in the FTC's Bureau of Consumer Protection.
Suchitra is a Partner in the EMEA Centre for Regulatory Strategy and helps our clients to navigate the regulatory landscape around technological innovation. She sits on the UK Fintech Executive and leads our thought leadership on topics such as digitisation, cryptoassets, AI, regulatory sandboxes, Suptech, payment innovation and the future of regulation. She recently completed a secondment at the Bank of England, supervising digital challenger banks. Suchitra is a member of various industry working groups on innovation in financial services and has regularly featured in the Top 150 Women in Fintech Powerlist (Innovate Finance). She is a qualified Chartered Accountant and has previously worked in Deloitte’s Audit, Corporate Finance and Risk Advisory teams, where she led large-scale regulatory change projects.