
Understanding the EU's Approach to Disinformation: A Breakdown of the Code of Practice and Auditor Expectations

In our previous blog post, we explored the key steps Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) should take to prepare for their upcoming audits under the EU's Digital Services Act (DSA) against the Code of Practice on Disinformation (COPD). However, the COPD's 44 commitments, which encompass hundreds of measures, qualitative reporting elements (QREs), and service-level indicators (SLIs), can be overwhelming to navigate.

In this blog we break down the COPD into its six key thematic areas, providing a clearer understanding of what VLOPs and VLOSEs need to prioritise for compliance. This guide also outlines the areas where auditors may focus their attention regarding the controls that platforms should have in place to meet these commitments and measures.

  1. Scrutiny of Ad Placements: The COPD emphasises the need to prevent disinformation by enhancing the scrutiny of ad placements. This involves avoiding the placement of advertisements next to disinformation, tightening eligibility requirements for monetisation programmes, and giving ad buyers transparency over where their advertisements appear. Auditors will likely focus on the robustness of advertising review processes, both automated and human, and the effectiveness of real-time monitoring controls for disinformation in advertisements. They may also assess the clarity and enforcement of policies for political advertising transparency, as well as the mechanisms to identify and demonetise repeat offenders.
  2. Political Advertising: This section of the COPD aims to enhance the transparency of political advertising by requiring clear labelling, sponsor identification, and ad repositories. It also emphasises the requirement to define “political and issue advertising” and, on that basis, to verify political advertisers. Auditors may focus on the controls supporting political advertisement labelling, sponsor verification processes, and the accessibility and completeness of advertising repositories. They may also examine how platforms give users transparency about why they are shown specific political advertisements.
  3. Integrity of Services: Maintaining the integrity of online services is a key focus of the COPD. This involves combating manipulative behaviours and practices such as fake accounts, bot-driven amplification, impersonation, malicious deepfakes, and coordinated inauthentic behaviour. Auditors may focus on the strength of identity verification controls, the sophistication of bot detection and mitigation measures, and the effectiveness of anomaly detection tools for identifying coordinated inauthentic behaviour. Auditors will also likely assess the enforcement of policies and procedures against manipulation attempts and related behaviours.
  4. Empowering Users: Empowering users to identify and report disinformation is another central theme of the COPD. This includes promoting media literacy, adopting safe design practices, providing tools to assess content authenticity, and facilitating user feedback on disinformation. Auditors will consider how platforms assess the effectiveness of their media literacy initiatives and the usability of the tools in place for content verification. Other areas of focus may include the controls in place to ensure the safe design of recommender systems and the responsiveness of platforms to user reports of disinformation.
  5. Empowering the Research Community: This section focuses on providing researchers with access to platform data to study disinformation. It also emphasises cooperation with researchers, including the development and funding of an independent, third-party body that can vet researchers and research proposals. Auditors may review researcher data access controls to assess whether data is shared securely and with appropriate privacy safeguards. They may also assess the transparency of data-sharing agreements and the governance process over sharing information with the research community.
  6. Empowering the Fact-Checking Community: This section aims to strengthen cooperation with fact-checkers by providing them with resources, support, and access to relevant information. It also emphasises integrating fact-checking into platform services and processes. Auditors may focus on the nature and extent of cooperation with fact-checkers, including the number and diversity of organisations and the extent to which fact-checkers across Member States and languages are considered. They may also assess the transparency of fact-checkers' methodologies and the controls in place to ensure adherence to the International Fact-Checking Network (IFCN) Code of Principles. Additionally, auditors may examine how platforms provide fact-checkers with access to relevant data and tools to enhance their work.

Conclusion

The EU Code of Practice on Disinformation presents a comprehensive framework for combating disinformation online. By understanding these six key thematic areas and considering the potential areas of auditor focus, VLOPs and VLOSEs can proactively prepare for compliance with the DSA. While this guide provides a summarised view, platforms must delve deeper into the specific measures, QREs, and SLIs outlined in the COPD to ensure full compliance and avoid potential sanctions. Thorough preparation and implementation of comprehensive controls are crucial for navigating the complexities of the DSA and COPD.

Deloitte supports several providers of very large online platforms and search engines with their readiness for DSA audits and with establishing sustainable compliance, while developing our approach to audits of the COPD. For more information, please get in touch.