“We have identified that currently, some of the biggest risks come from social media platforms, video and music streaming sites and video gaming platforms. In these sectors, children’s personal data is being used and shared, to bombard them with content and personalised service features. This may include inappropriate adverts; unsolicited messages and friend requests; and privacy-eroding nudges urging children to stay online. We’re concerned with a number of harms that could be created as a consequence of this data use, which are physical, emotional and psychological, and financial.”
Stephen Bonner – Executive Director (Regulatory Futures and Innovation), UK ICO
Gaming sites have been deemed high-risk by the UK Information Commissioner’s Office since 2021. So, unless the content on your site is guaranteed to be entirely harmless to players no matter how young they are, you may already be obliged to apply age assurance mechanisms to everyone who uses your site, with or without an account. More widely, gaming sites are also being brought within the scope of regulation under European Union law. To date, most controls have come through self-regulation, but that is changing rapidly.
EU Wide: GDPR (in force today)
If you rely on consent under Article 8 of the GDPR as a basis for processing some or all of the personal data you obtain from your users, you should be sure that your European (including UK) players are at least old enough to give that consent themselves. (Remember, personal data includes even just an IP address.) In the UK, this “age of digital consent” is 13, but it varies between 13 and 16 across EU member states, so if you have players across the EU, you will also need to determine their location and apply the relevant age as part of this check. Click here to see a map of the digital age of consent provided by our member PRIVO.
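In practice, this country-by-country check is a simple lookup. The sketch below is illustrative only: the sample ages shown are assumptions you must verify against each member state’s current legislation (consult the PRIVO map above), and the function names are our own.

```python
# Illustrative sketch: mapping ISO country codes to the GDPR Article 8
# "age of digital consent". The sample values below are examples only
# and must be verified against each member state's current law.
DIGITAL_CONSENT_AGE = {
    "GB": 13,  # UK (Data Protection Act 2018)
    "IE": 16,
    "FR": 15,
    "DE": 16,
    "ES": 14,
    "IT": 14,
}

# GDPR permits at most 16; defaulting unknown jurisdictions to the
# maximum errs on the side of requiring parental consent.
DEFAULT_CONSENT_AGE = 16


def can_rely_on_consent(player_age: int, country_code: str) -> bool:
    """Return True if a player of this age can give consent themselves
    under Article 8 GDPR in the given country; otherwise parental
    consent (or another lawful basis) is required."""
    threshold = DIGITAL_CONSENT_AGE.get(country_code, DEFAULT_CONSENT_AGE)
    return player_age >= threshold
```

Note the fallback: where you cannot determine the player’s location, treating the threshold as 16 is the conservative choice.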
This mirrors the US Children’s Online Privacy Protection Act (COPPA), but without the defence of ignorance that COPPA offers to sites not directed at children. You should not assume that if you are compliant with COPPA, you are already compliant with the GDPR. That could be a very costly error.
Enforcement and penalties
Tools at the disposal of the UK regulator, the Information Commissioner, and its EU counterparts include assessment notices, warnings, reprimands, enforcement notices and penalty notices (administrative fines). For serious breaches of the data protection principles, there is the power to issue fines of up to £17.5 million or 4% of your annual worldwide turnover, whichever is higher.
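The “whichever is higher” cap works out as a simple maximum. A minimal sketch (the turnover figures are hypothetical examples, not real cases):

```python
# Sketch of the UK GDPR maximum-penalty cap: the higher of £17.5 million
# and 4% of annual worldwide turnover. Turnover figures are hypothetical.
FIXED_CAP_GBP = 17_500_000
TURNOVER_RATE = 0.04


def max_gdpr_penalty(annual_worldwide_turnover_gbp: float) -> float:
    """Maximum fine available for a serious breach, in GBP."""
    return max(FIXED_CAP_GBP, TURNOVER_RATE * annual_worldwide_turnover_gbp)


# For a company with £1bn turnover, 4% (£40m) exceeds the fixed cap,
# so the turnover-based figure applies.
```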
UK Players Only: Age Appropriate Design Code
This Code has applied since September 2020 if you provide “relevant information society services which are likely to be accessed by children”, as provided by section 123 of the Data Protection Act 2018 (DPA 2018). Any organisation providing services, apps, games, web services or social media services that children are likely to access therefore falls within scope. These games may not be intended for children at all, but if they are attractive to children and there is evidence of children playing them, they quickly fall into scope.
This statutory guidance, also known as “the Children’s Code”, requires gaming sites which process personal data (whether on the basis of consent or any other legal basis permitted under the GDPR) to consider whether they could risk the moral, physical or mental well-being of children under 18, and if so, to put in place proportionate measures to safeguard children and young people. The Code applies to any organisation which targets UK users, whether or not it is established in the EEA. So even if your organisation’s headquarters is based outside the UK, you may still be required to comply.
You need to consider the content of your games and the functionality of your site, and ask yourself if it might be harmful to children – so for example:
- Where adults can interact with minors, there is a risk of grooming, the inappropriate exchange of contact details or unacceptable conversations, images, videos etc. Indeed, research shows that this is increasingly a problem between minors as well, so if your gamers can communicate with one another, you will need to take precautions to protect players under 18 from each other as well as adults.
- If your gaming site could facilitate physical encounters by allowing users to communicate with one another, there may also be a physical risk if children agree to meet other people through the service.
- Commercial risks should also be considered: the aggressive promotion of skins, loot boxes and the like might lead to children spending excessively.
- Cyberbullying and other risks arising from the conduct of players also need to be included in your risk assessment.
Our opinion is that almost all gaming sites established in the UK clearly require age verification to be in place to identify children using the site so they can be protected from harmful content.
The level of rigour required is a matter for the judgement of the sites concerned, giving consideration to the nature of the content on the site, the number of users under 18 found to be using it, and so on. But given the reputational risk if a child is harmed by your service, we recommend at least a standard level of assurance. See our page on levels of assurance for an explanation of the methods of age verification that achieve this degree of confidence in an age check.
Enforcement and penalties
As for GDPR above.
EU GDPR rules
The Irish Data Protection Commission was quick to follow the UK ICO, publishing its Fundamentals for a Child-Oriented Approach to Data Processing (the Fundamentals) a year later. There is an important distinction from the UK Code: the Irish version is not statutory, it is only guidance. But it is arguably more influential, given how many multinational platforms are established in Ireland and regulated across the EU by the DPC.
UK but with global effect: Online Safety Bill
This Bill, expected to become law in the autumn of 2023, imposes a range of legal duties on “user-to-user services”, which are defined broadly to include any functionality allowing one user to ‘encounter’ content from another user. Predominantly this affects social media platforms, although public search engines are also in scope.
Where these services are likely to be accessed by UK children under 18, there is a specific duty to protect them from mental or physical harm. As gaming sites often allow users to interact and share content such as game play or comments, they are in scope for this new Online Safety legislation.
As many gaming sites are also “likely to be accessed by children”, they must comply with the further duties applicable. (Remember, children are defined as under 18 years old.)
The largest gaming sites may be considered Category 1 services, with additional duties placed on them to offer adults the choice of extra protection from online harms as well.
(This will replace the Audio Visual Media Services Directive in the UK. It expands the jurisdiction from video sharing platforms established in the UK to sites globally which are visited by users in the UK.)
Please read our briefing on the Online Safety Bill for further explanation of these new duties (it will be updated when the Bill is enacted, as the text has been considerably amended by Parliament and is not yet final).
Enforcement and penalties
The regulator, Ofcom, can issue access restriction orders, service restriction orders, or impose an appropriate and proportionate penalty of whichever is the greater of:
- £18 million, and
- 10% of the person’s qualifying worldwide revenue.
USA: Children’s Online Privacy Protection Act (COPPA)
The Children’s Online Privacy Protection Rule seeks to put parents in control of what information commercial websites collect from their children online. It applies globally to sites providing a service to users located in the USA.
You’re covered by COPPA if:
- Your website or online service is directed to children under 13 and collects personal information from them;
- Your website or online service is directed to a general audience, but you have “actual knowledge” you’re collecting personal information from a child under 13. The FTC has said that an operator has actual knowledge of a user’s age if the site or service asks for – and receives – information from the user that allows it to determine the person’s age. For example, an operator who asks for a date of birth on a site’s registration page has actual knowledge as defined by COPPA if a user responds with a year that suggests they’re under 13. An operator also may have actual knowledge based on answers to “age identifying” questions like “What grade are you in?”; or
- You run a third-party service like an ad network or plug-in and you’re collecting information from users of a site or service directed to children under 13. Third-party sites or services may have actual knowledge under COPPA, too. For example, if the operator of a child-directed site directly communicates to an ad network or plug-in about the nature of its site, the ad network or plug-in will have actual knowledge under COPPA. The same holds true if a representative of the ad network or plug-in recognizes the child-directed nature of the site’s content. An ad network or plug-in may also have actual knowledge if a concerned parent or someone else informs one of its representatives that it is collecting information from children or from users of a child-directed site or service.
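The date-of-birth route to “actual knowledge” described above can be sketched as a simple age calculation. This is a minimal illustration under our own assumptions (the function names are ours); a real age screen would also need to be neutral, i.e. not signal to the user which answer avoids the gate.

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA applies to children under 13


def age_on(dob: date, today: date) -> int:
    """Completed years between a date of birth and a reference date."""
    years = today.year - dob.year
    # Subtract one if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years


def triggers_actual_knowledge(dob: date, today: date) -> bool:
    """True if a submitted date of birth tells the operator the user is
    under 13, triggering COPPA's parental-consent duties."""
    return age_on(dob, today) < COPPA_AGE_THRESHOLD
```

Once this returns True for a user, the operator can no longer claim ignorance and must obtain verifiable parental consent before collecting that user’s personal information.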
Websites and online services covered by COPPA must post privacy policies, provide parents with direct notice of their information practices, and get verifiable consent from a parent or guardian before collecting personal information from children.