Computer Gaming

“Some of the biggest risks to children arise on social media platforms, video and music streaming services, and video gaming platforms. In these sectors, children’s personal data is used and shared to shape content, recommendations, and service features. This can expose them to inappropriate advertising, unsolicited contact, and design choices that encourage excessive use. These practices can create physical, emotional, psychological, and financial harms.”

Stephen Bonner, Executive Director, Regulatory Futures and Innovation
Information Commissioner’s Office

Why gaming platforms are high risk

The ICO has treated online games and gaming platforms as high-risk services for children since 2021. This reflects the combination of immersive content, social interaction, monetisation mechanics, and data-driven design that characterises modern games.

Unless a game and its surrounding services are demonstrably harmless for users of any age, operators should assume that children are present and design accordingly. In practice, this often requires age assurance, whether or not players create accounts.

UK legal framework

UK GDPR and children’s data

Under UK GDPR, children’s personal data requires enhanced protection. Personal data includes IP addresses, device identifiers, behavioural data, and in-game telemetry.

If you rely on consent as a lawful basis for processing personal data, Article 8 requires that users are old enough to give valid consent. In the UK, the digital age of consent is 13. Below that age, verifiable parental consent is required.

If a gaming platform does not know a user’s age or location, it cannot know whether consent is valid. As a result, age assurance is often necessary at account creation or before data-driven features are enabled.
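To illustrate how these rules interact at account creation, here is a minimal sketch, assuming a hypothetical UK signup flow. Only the age-13 threshold comes from UK GDPR Article 8; the function, the age-assurance flag, and the return values are illustrative assumptions, not a real platform’s API.

```python
# Minimal sketch of a consent-route decision at signup, assuming a
# hypothetical UK flow. Only the age-13 threshold comes from UK GDPR
# Article 8; everything else is illustrative.

UK_DIGITAL_AGE_OF_CONSENT = 13

def consent_route(age: int | None, age_assured: bool) -> str:
    """Pick a consent route before enabling consent-based, data-driven features."""
    if age is None or not age_assured:
        # Age unknown or unverified: the platform cannot know whether
        # consent would be valid, so data-driven features stay off.
        return "defer: run age assurance first"
    if age < UK_DIGITAL_AGE_OF_CONSENT:
        # Under 13 in the UK: verifiable parental consent is required.
        return "parental-consent flow"
    return "user consent may be valid"
```

A platform that cannot establish age takes the “defer” branch, which is why age assurance typically sits at account creation or ahead of personalised features.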

UK GDPR includes no COPPA-style “lack of knowledge” defence, so compliance with US rules does not by itself ensure UK or EU compliance.

Enforcement and penalties

The ICO can issue assessment notices, enforcement notices, and administrative fines of up to £17.5 million or 4% of global annual turnover, whichever is higher.

Age Appropriate Design Code

The Age Appropriate Design Code, also known as the Children’s Code, applies to any online service that processes personal data and is likely to be accessed by children under 18.

For gaming platforms, this includes services that are not intended for children but are attractive to them or have evidence of child users.

The Code requires operators to:

• Assess whether games, features, and monetisation practices pose risks to children
• Apply proportionate safeguards based on age and risk (see the sketch below)
• Minimise profiling, nudging, and exploitative design where children are present

The Code applies to services targeting UK users regardless of where the provider is established.
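To make “proportionate safeguards based on age and risk” more concrete, the sketch below maps the Code’s published age ranges (0–5, 6–9, 10–12, 13–15 and 16–17) to high-privacy defaults. The ranges come from the ICO’s guidance; the setting names and the per-band values are illustrative assumptions, not settings the Code prescribes.

```python
# Illustrative only. The age ranges follow the ICO Children's Code;
# the setting names and per-band defaults are assumptions, not
# ICO-prescribed values.

from dataclasses import dataclass

@dataclass(frozen=True)
class ChildDefaults:
    profiling: bool    # personalised recommendations / ad targeting
    geolocation: bool  # location sharing with other players
    open_chat: bool    # open social features

# High privacy by default in every band; only a lower-risk feature is
# relaxed for older bands in this hypothetical scheme.
BAND_DEFAULTS = {
    range(0, 6):   ChildDefaults(profiling=False, geolocation=False, open_chat=False),
    range(6, 10):  ChildDefaults(profiling=False, geolocation=False, open_chat=False),
    range(10, 13): ChildDefaults(profiling=False, geolocation=False, open_chat=False),
    range(13, 16): ChildDefaults(profiling=False, geolocation=False, open_chat=True),
    range(16, 18): ChildDefaults(profiling=False, geolocation=False, open_chat=True),
}

def defaults_for(age: int) -> ChildDefaults:
    """Return high-privacy defaults for a confirmed under-18 age."""
    for band, defaults in BAND_DEFAULTS.items():
        if age in band:
            return defaults
    raise ValueError("ages 18+ fall outside the Children's Code bands")
```

Profiling and geolocation stay off in every band, reflecting the Code’s off-by-default standards; only the chat default varies in this hypothetical scheme.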

Online Safety Act 2023

The Online Safety Act 2023 applies to gaming platforms that allow users to encounter content from others, including chat, voice, shared gameplay, user-generated content, and social hubs.

Where a gaming service is likely to be accessed by children, operators must:

• Assess the risk of harm to under-18s
• Prevent children from encountering harmful content
• Use age assurance where necessary to deliver those protections effectively

Because many games enable interaction between players, most modern gaming platforms fall within scope.

The regulator, Ofcom, can issue enforcement notices, impose access or service restriction orders, and levy fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher.

Key risks gaming platforms must address

Gaming platforms should consider whether their content or functionality could harm children, including:

• Interaction between adults and minors, creating risks of grooming or inappropriate contact
• Peer-to-peer abuse, bullying, or coercion between minors
• Features that could facilitate physical meetings between players
• Monetisation mechanics such as loot boxes, skins, and in-game purchases that may encourage excessive spending
• Exposure to violent, sexual, or otherwise age-inappropriate content

Risk assessments must reflect how the service is actually used, not only how it is intended to be used.
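As one illustration of how such an assessment might be organised, the sketch below scores the features a service actually exposes and maps the total to a coarse tier that could drive the level of age assurance. The feature list, weights, and thresholds are assumptions for the sketch, not a methodology from the ICO or Ofcom.

```python
# Illustrative risk tiering for a gaming service. Feature weights and
# tier thresholds are assumptions, not a regulatory formula.

FEATURE_RISK = {
    "open_text_chat": 3,          # adult-minor contact risk
    "voice_chat": 3,              # contact risk, hard to moderate
    "user_generated_content": 2,  # content risk
    "location_sharing": 3,        # could facilitate physical meetings
    "loot_boxes": 2,              # financial harm risk
    "direct_purchases": 1,        # spending risk
}

def risk_tier(enabled_features: set[str]) -> str:
    """Map the features actually in use to a coarse risk tier."""
    score = sum(FEATURE_RISK.get(feature, 0) for feature in enabled_features)
    if score >= 5:
        return "high: strong age assurance expected"
    if score >= 2:
        return "medium: standard age assurance"
    return "low: self-declaration may suffice"
```

For example, risk_tier({"open_text_chat", "loot_boxes"}) lands in the high tier, combining contact risk with financial risk. Scoring the features actually enabled, rather than those advertised, reflects the point above about how the service is really used.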

Duty to apply and enforce minimum age rules

Many gaming platforms set minimum ages in their terms and conditions. Under UK law, it is not sufficient to state an age limit. Platforms are expected to take reasonable steps to apply and enforce their own rules.

If a platform does not know whether a player meets its stated minimum age, it cannot credibly rely on its terms for compliance with data protection or online safety law. Regulators increasingly focus on outcomes, not policy statements.
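As a final sketch, assuming a hypothetical platform whose terms state a minimum age of 13, enforcing that rule at signup might look like the following; the age-assurance signal is an illustrative assumption.

```python
# Hypothetical enforcement of a platform's own stated minimum age.
# The age-assurance signal is an assumption for illustration.

STATED_MINIMUM_AGE = 13  # from the platform's own terms and conditions

def may_register(assured_age: int | None) -> bool:
    """Admit a player only when an assured age meets the stated minimum.

    An unknown age returns False: a platform that does not know a
    player's age cannot credibly rely on its terms.
    """
    return assured_age is not None and assured_age >= STATED_MINIMUM_AGE
```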

Our view

Almost all gaming platforms accessible from the UK should assume that children are present unless they have strong evidence to the contrary.

Identifying child users is therefore a prerequisite to complying with UK data protection law and online safety duties.

The appropriate level of assurance depends on the nature of the game, its features, and its audience. However, given regulatory expectations and reputational risk, we generally recommend at least a standard level of age assurance for gaming platforms that include social interaction, monetisation, or personalised features.


PLEASE NOTE
This website does not constitute legal advice. You should always seek independent legal advice on compliance matters.

Posted on May 17, 2021
