
Securing children’s data privacy in the gaming industry

Is your privacy management still loading?

Protect gaming adventures from risk

With the rapid integration of virtual and augmented reality, advanced connectivity, and artificial intelligence (AI), gaming is becoming more immersive and mobile, attracting a growing and diverse consumer base that expects a reliable, safe, and secure experience with minimal latency. Free-to-play and subscription games, indie games, esports, streaming, mobile gaming, augmented reality, and virtual reality are all growing segments. The industry is projected to reach revenue of US$282.3 billion in 2024 and to grow at an annual rate of 8.76% between 2024 and 2027, reaching a projected market volume of US$363.2 billion by 2027.¹

As such, gaming companies must innovate and expand features, modalities, experiences, and safeguards to meet consumer expectations. But as those expectations grow, so does regulatory scrutiny. Enforcement of existing privacy regulations (e.g., the Children’s Online Privacy Protection Act) has accelerated, and new online safety regulations (e.g., the Digital Services Act) are emerging across the globe. Together, these pressures demand that gaming companies proactively mature their compliance programs; embed data privacy and safety guardrails into the design of gaming experiences to keep consumers safe; and leverage emerging technologies, such as AI, to scale privacy and safety controls.

Is your privacy risk management playing by the rules?

Privacy and safety risks pose a disruptive challenge to gaming companies’ bottom lines and their ability to expand in certain markets. Significant fines have been issued to gaming companies for failing to obtain consent and for improper data processing practices, especially with respect to children’s data. Violations are expected to be heavily penalized, as demonstrated by the record-setting $275 million fine, the largest ever imposed under the Children’s Online Privacy Protection Act (COPPA), levied against one gaming company in 2023.² Failing to protect the most vulnerable user base can seriously damage a gaming company’s reputation and consumer trust.

Implementing strategies that address data privacy and safety risks can therefore help a company protect its margins, capitalize on opportunities, and capture market share. With that in mind, gaming companies should consider the following risks, which are likely to be a significant focus of regulators:

Growing regulatory scrutiny

Regulatory scrutiny of how companies collect and process personal data is increasing, alongside ongoing enforcement of the European Union’s General Data Protection Regulation (GDPR) and COPPA. This heightened scrutiny requires gaming companies to reexamine their data handling practices to identify and remediate unauthorized data collection, failures to obtain parental consent, and noncompliant data processing. Gaming companies should establish a demonstrable legal basis for collecting and processing gamers’ data. Within gaming, there are three practical legal bases: (1) performance of a contract, (2) legitimate interest, and (3) proper consent.

In addition to establishing a legitimate basis for collection, personal data should be processed in a lawful, fair, and transparent manner, and only for a specific, explicit, and legitimate purpose. In practice, this means that personal data collected from gamers, whether adults or children, should be used only for the purpose for which it was collected. For gaming companies, this means not only having well-defined privacy, consent, and preference management capabilities, but also maintaining a data inventory that defines all the personal information collected and how the business uses it. Limitations, controls, and policies should restrict how internal users can access and use personal data, and sharing with third parties should be minimized. While these requirements are not specific to the gaming industry, they raise unique challenges for gaming companies because many of their users are children.
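
To make purpose limitation concrete, the sketch below shows one way a data inventory entry and an internal access check might be modeled in Python. The schema, element names, and purposes are illustrative assumptions, not a prescribed design.

```python
from dataclasses import dataclass

# Hypothetical data-inventory record: each personal-data element is tied to
# the explicit purposes it was collected for.
@dataclass(frozen=True)
class InventoryEntry:
    data_element: str              # e.g., "player_email"
    collected_for: frozenset[str]  # purposes declared at collection time

# Illustrative inventory; real entries would come from a governed catalog.
INVENTORY = {
    "player_email": InventoryEntry("player_email", frozenset({"account_recovery"})),
    "voice_chat_audio": InventoryEntry("voice_chat_audio", frozenset({"in_game_chat"})),
}

def is_use_permitted(data_element: str, requested_purpose: str) -> bool:
    """Allow an internal use only if it matches a declared collection purpose."""
    entry = INVENTORY.get(data_element)
    return entry is not None and requested_purpose in entry.collected_for

# Reusing chat audio for ad targeting is denied because that purpose was
# never declared at collection time.
assert is_use_permitted("player_email", "account_recovery")
assert not is_use_permitted("voice_chat_audio", "ad_targeting")
```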

Additionally, under COPPA, gaming companies must obtain verifiable parental consent before collecting personal information from users under the age of 13. To obtain verifiable consent, gaming companies should implement mechanisms reasonably designed to confirm that the person providing consent is the child’s parent or guardian.³ There are several acceptable methods: having a parent request and affirmatively respond to an email; completing an online form with an identifier, credit card, or other details that verify identity; mailing a consent form; or knowledge-based authentication that asks details only a parent or guardian would know.⁴ Furthermore, gaming companies should keep records of the verifiable consent obtained from parents and guardians to demonstrate accountability and compliance with COPPA requirements.
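
A minimal sketch of how such consent records might be captured and checked follows. The record fields and method labels map to the options above but are assumptions, not regulatory terms of art.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum

# Labels for the COPPA-recognized verification methods described above.
class ConsentMethod(Enum):
    EMAIL_PLUS = "email_request_and_affirmative_reply"
    ONLINE_FORM_ID_CHECK = "online_form_with_identity_verification"
    MAILED_FORM = "mailed_consent_form"
    KNOWLEDGE_BASED_AUTH = "knowledge_based_authentication"

@dataclass
class ConsentRecord:
    child_account_id: str
    guardian_contact: str
    method: ConsentMethod
    evidence_ref: str      # pointer to stored proof (e.g., a document ID)
    obtained_at: datetime

# Append-only log so consent can be demonstrated to a regulator later.
_consent_log: list[ConsentRecord] = []

def record_parental_consent(child_account_id: str, guardian_contact: str,
                            method: ConsentMethod, evidence_ref: str) -> ConsentRecord:
    record = ConsentRecord(child_account_id, guardian_contact, method,
                           evidence_ref, datetime.now(timezone.utc))
    _consent_log.append(record)
    return record

def has_verifiable_consent(child_account_id: str) -> bool:
    """Gate data collection for under-13 accounts on a logged consent record."""
    return any(r.child_account_id == child_account_id for r in _consent_log)
```

In a real system the log would live in durable, auditable storage; the in-memory list here just keeps the sketch self-contained.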

Emerging child safety regulations

As regulations designed to protect minors online continue to increase, child safety should remain a focus area for gaming companies. Two emerging regulations that should be top of mind are the California Age-Appropriate Design Code Act (CAADCA) and the Kids Online Safety Act (KOSA).

With the objective of enhancing minors’ online privacy and safety, the CAADCA establishes requirements for online services and platforms likely to be accessed by minors under the age of 18.⁵ These requirements include, but are not limited to, implementing age-appropriate design features; providing clear and conspicuous notice to users; defaulting to protective privacy settings; and limiting the collection, use, and retention of minors’ personal data. Similarly, KOSA is intended to establish a safe online space for children through safeguards related to transparency, parental inclusion, and fostering a culture of responsible online practices. Like the CAADCA, KOSA establishes requirements centered on age verification, parental controls, privacy notices and consent, online safety and privacy educational resources, data security, and reporting mechanisms for inappropriate or harmful content, cyberbullying, and other safety considerations.⁶
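
One concrete expression of age-appropriate design is defaulting younger users to the most protective settings. The sketch below assumes hypothetical setting names and age bands purely for illustration.

```python
# Hypothetical privacy defaults by age band. Design codes such as the
# CAADCA push services to start minors on the most protective options.
DEFAULTS_BY_AGE_BAND = {
    "under_13": {"profile_visibility": "private", "voice_chat": False,
                 "friend_requests": "none", "targeted_ads": False},
    "13_to_17": {"profile_visibility": "friends_only", "voice_chat": False,
                 "friend_requests": "friends_of_friends", "targeted_ads": False},
    "adult":    {"profile_visibility": "public", "voice_chat": True,
                 "friend_requests": "anyone", "targeted_ads": True},
}

def default_settings_for(age: int) -> dict:
    """Return the most protective defaults consistent with the user's age."""
    if age < 13:
        return dict(DEFAULTS_BY_AGE_BAND["under_13"])
    if age < 18:
        return dict(DEFAULTS_BY_AGE_BAND["13_to_17"])
    return dict(DEFAULTS_BY_AGE_BAND["adult"])

# Minors start locked down; settings can be loosened later with appropriate
# consent, rather than the other way around.
print(default_settings_for(11))
```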

Beyond concerns about the amount of time children spend online, there are additional risks, such as online predators targeting and grooming children to exploit them in various ways, including:

  • Online manipulation and coercion.
  • Cyberbullying, harassment, and intimidation.
  • Exposure to inappropriate or harmful content, such as violence, hate speech, extremist ideas, and radicalization.
  • Security risks, such as scams, phishing, identity theft, account takeover and hijacking, deception, and fraud.

These risks may be further exacerbated by generative AI (GenAI), which bad actors in online games can misuse. For example, GenAI may be used to generate inappropriate content, produce deepfakes, impersonate other users, or commit fraud.

Children’s well-being

Today, kids are increasingly present online, with much of their social identity tied to their digital life, whether engaging with friends, using social media platforms, browsing online forums, or playing video games. More than 90% of children older than two play video games, and children between the ages of eight and 17 spend an average of 1.5 to 2 hours playing them daily.⁷ How youth spend their time brings significant challenges and risks for children’s mental and physical well-being.

One such challenge is the concern around addiction and the significant amount of time children spend playing video games online. Some regulators are taking a heavy hand in managing this risk. For example, the Chinese government implemented curfews restricting minors’ access to games during certain hours and limiting how much time they can play on weekdays and weekends. China went as far as requiring gaming platforms to implement real-name registration to enforce compliance with the curfews, as well as limiting in-game purchases.⁸ It is unclear whether China will continue enforcing these gaming restrictions.
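
Mechanically, such a curfew reduces to a session-start check against allowed play windows. The sketch below uses an illustrative window (weekend evenings), not the actual regulatory schedule.

```python
from datetime import datetime

# Illustrative curfew policy in the spirit of the restrictions described
# above; the allowed days and hours are assumptions, not the actual rules.
ALLOWED_DAYS = {4, 5, 6}   # Friday, Saturday, Sunday (Monday == 0)
WINDOW_START_HOUR = 20     # 8 p.m. local time
WINDOW_END_HOUR = 21       # 9 p.m. local time

def minor_may_play(now: datetime) -> bool:
    """Gate session start for registered-minor accounts on the curfew window."""
    return (now.weekday() in ALLOWED_DAYS
            and WINDOW_START_HOUR <= now.hour < WINDOW_END_HOUR)

print(minor_may_play(datetime(2024, 5, 17, 20, 30)))  # True  (Friday evening)
print(minor_may_play(datetime(2024, 5, 15, 15, 0)))   # False (Wednesday afternoon)
```

Real-name registration is what makes such a check enforceable, since the platform must know which accounts belong to minors.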

Power up your risk management

Gaming companies should be aware of applicable regulations, determine whether their control environment effectively and consistently enables compliance, and collaborate across departments to ensure privacy and safety measures maintain a safe environment for gamers. A misstep, especially when processing children’s data, can have financial consequences and be extremely damaging to a company’s brand and reputation, not to mention putting the safety of its consumers at risk. The following recommendations can help gaming companies proactively strengthen their privacy posture.

Understand and manage gamer risks

With the influx of privacy, safety, and AI regulations, gaming companies should take a systematic, programmatic approach to identifying risks and building compliance capabilities. They should identify and prioritize existing and emerging regulations and develop a baseline, rationalized requirements framework. Risk or impact assessments should then be performed to help video game companies truly understand the risks facing their gamers and developers, enabling the business to make informed decisions.
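
A rationalized requirements framework essentially maps overlapping obligations from many regulations onto a single set of internal controls, so each capability is built and assessed once. The sketch below illustrates the idea; the requirement IDs and mappings are assumptions for illustration.

```python
# Hypothetical rationalized requirements: several regulations trace to one
# internal control, so compliance work is not duplicated per regulation.
RATIONALIZED_REQUIREMENTS = {
    "REQ-001": {
        "control": "Obtain verifiable parental consent before collecting "
                   "personal data from children",
        "sources": ["COPPA 16 C.F.R. 312.5", "GDPR Art. 8"],
    },
    "REQ-002": {
        "control": "Default minors' accounts to the most protective "
                   "privacy settings",
        "sources": ["CAADCA", "KOSA (proposed)"],
    },
}

def requirements_for(regulation: str) -> list[str]:
    """List internal requirement IDs that trace back to a given regulation."""
    return [req_id for req_id, req in RATIONALIZED_REQUIREMENTS.items()
            if any(regulation in source for source in req["sources"])]

print(requirements_for("COPPA"))  # ['REQ-001']
```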

Adopt trust-by-design for product and feature launches

Whether you’re developing a virtual reality headset, launching a new modality or product, or integrating GenAI into an existing feature, privacy, safety, and security practices should be built into product development to comply with regulatory obligations and manage risk. Trust-by-design is the process of unifying privacy, security, and safety reviews of products and features prior to launch to demonstrate that the product can be trusted by consumers and regulators. Adopting a trust-by-design approach to product development can help organizations meet several data privacy objectives, such as keeping users safe, building trust with consumers, complying with regulatory obligations, protecting user rights, and minimizing online threats and vulnerabilities.
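
In practice, a trust-by-design process often reduces to a launch gate: a feature ships only once every required review has signed off. A minimal sketch, with assumed review names:

```python
# A feature may launch only after privacy, security, and safety reviews
# have all passed; the review names and structure are illustrative.
REQUIRED_REVIEWS = ("privacy", "security", "safety")

def ready_to_launch(completed_reviews: dict[str, bool]) -> bool:
    """Return True only when every required pre-launch review has passed."""
    return all(completed_reviews.get(review, False) for review in REQUIRED_REVIEWS)

feature_reviews = {"privacy": True, "security": True, "safety": False}
print(ready_to_launch(feature_reviews))  # False: the safety review is still open
```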

Leverage AI for privacy compliance

Although AI comes with risks of its own, with proper controls in place it can be a powerful tool for streamlining privacy compliance and safety operations. As technology companies race to develop AI-based offerings, the market for privacy compliance and safety solutions is still nascent, and use cases continue to take shape. AI capabilities are expected to emerge across the data and compliance management life cycles, bringing more efficiency, consistency, and automation to manual, time-consuming tasks. Use cases already being explored include using AI for data-subject access requests, generating training and awareness content, compliance chatbots, ingesting regulatory requirements, and predicting risks from historical enforcement data and patterns.⁹
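
As one example of the first use case, incoming data-subject requests can be triaged automatically. The sketch below stands in a simple keyword heuristic where a production system might call a trained classifier; the queue names and routing phrases are assumptions.

```python
# Hypothetical triage for data-subject access requests (DSARs): route each
# request to a work queue based on its content. A real deployment might
# swap the keyword rules for a trained text classifier.
ROUTING_RULES = {
    "delete": "erasure_queue",
    "remove my data": "erasure_queue",
    "copy of my data": "access_queue",
    "correct": "rectification_queue",
    "opt out": "opt_out_queue",
}

def triage_request(request_text: str) -> str:
    """Route an incoming request, defaulting ambiguous ones to a human."""
    text = request_text.lower()
    for phrase, queue in ROUTING_RULES.items():
        if phrase in text:
            return queue
    return "manual_review_queue"

print(triage_request("Please delete my account and remove my data"))
# -> 'erasure_queue'
```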

Take your gaming privacy to the next level

With the growing wave of emerging regulations and increased regulatory scrutiny, remaining reactive may no longer be a viable option for gaming companies. Heightened consumer awareness and the importance of trust require companies to invest significant time and resources in their privacy programs and trust and safety teams. Implementing strategies, programs, and capabilities that address privacy and safety risks and meet regulatory obligations can help garner consumer trust and regulator confidence, and may help protect gaming companies’ margins and enable them to capitalize on opportunities.

Notes:

¹ Statista Market Insights, video gaming, accessed May 2024.
² United States v. Epic Games, Inc., No. 5:22-CV-00518-BO (E.D.N.C. Feb. 7, 2023).
³ 16 C.F.R. § 312.3.
⁴ 16 C.F.R. § 312.5.
⁵ State of California, AB-2273, California Age-Appropriate Design Code Act (CAADCA).
⁶ Kids Online Safety Act, S. 1409, 118th Cong. (2023–2024).
⁷ Daniel Alanko, “The health effects of video games in children and adolescents,” Pediatrics in Review 44, no. 1 (2023): pp. 23–32.
⁸ Shiona McCallum and Liv McMahon, “China to increase curbs on video gaming industry,” BBC, December 22, 2023.
⁹ IAPP, “Evaluating the use of AI in privacy program operations,” September 27, 2023.
