“We have identified that currently, some of the biggest risks come from social media platforms, video and music streaming sites and video gaming platforms. In these sectors, children’s personal data is being used and shared, to bombard them with content and personalised service features. This may include inappropriate adverts; unsolicited messages and friend requests; and privacy-eroding nudges urging children to stay online. We’re concerned with a number of harms that could be created as a consequence of this data use, which are physical, emotional and psychological, and financial.”
Stephen Bonner – Executive Director (Regulatory Futures and Innovation), UK ICO
Gaming sites have been deemed high-risk by the UK Information Commissioner’s Office, so unless the content on your site is guaranteed to be entirely harmless to players of any age, you may already be obliged to apply age verification mechanisms to everyone who uses your site, with or without an account. More widely, gaming sites are also being brought within the scope of regulation under European Union law. To date, most controls have come through self-regulation, but that is changing rapidly.
EU Wide: GDPR (in force today)
If you rely on consent under Article 8 of the GDPR as a basis for processing some or all of the personal data you obtain from your users, you should be sure that your European players are old enough to give that consent. (Remember, personal data includes something as simple as an IP address.) In the UK, this “age of digital consent” is 13, but it varies between EU member states, so if you have players across the EU you will also need to determine their location and apply the relevant age as part of this check. See the map of the digital age of consent provided by our member PRIVO.
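The location-dependent check described above can be sketched in code. This is an illustrative sketch only: the function names are our own, and the ages in the table are examples that must be verified against current national law before use, since member states can change them.

```python
# Illustrative per-country digital-consent age check (GDPR Article 8).
# Country codes are ISO 3166-1 alpha-2; the ages shown are examples
# and must be checked against current national law before use.

DIGITAL_CONSENT_AGE = {
    "GB": 13,  # UK (Data Protection Act 2018)
    "IE": 16,  # Ireland
    "DE": 16,  # Germany
    "FR": 15,  # France
    "IT": 14,  # Italy
}

# GDPR's default age of digital consent is 16 unless a member state
# has legislated a lower age (no lower than 13).
DEFAULT_CONSENT_AGE = 16

def can_give_digital_consent(country_code: str, age: int) -> bool:
    """True if a player of this age can consent to data processing
    in the given country; unknown countries fall back to the default."""
    return age >= DIGITAL_CONSENT_AGE.get(country_code, DEFAULT_CONSENT_AGE)
```

So a 13-year-old in the UK could give consent, while a 14-year-old in Germany could not, and a parent or guardian would need to authorise the processing instead.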
This mirrors the US COPPA regulation but without the defence of ignorance.
Enforcement and penalties
Tools at the disposal of the regulator, the Information Commissioner, include assessment notices, warnings, reprimands, enforcement notices and penalty notices (administrative fines). For serious breaches of the data protection principles, there is the power to issue fines of up to £17.5 million or 4% of your annual worldwide turnover, whichever is higher.
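The maximum-fine formula above is a simple “greater of” calculation; a minimal sketch (function name is ours, not the regulator’s):

```python
# Sketch of the UK GDPR maximum-penalty formula for serious breaches:
# the higher of £17.5 million or 4% of annual worldwide turnover.

def max_gdpr_fine_gbp(annual_worldwide_turnover_gbp: float) -> float:
    """Return the maximum administrative fine the ICO could impose."""
    fixed_cap = 17_500_000  # £17.5 million
    return max(fixed_cap, 0.04 * annual_worldwide_turnover_gbp)
```

For example, a business with £1 billion in worldwide turnover faces a maximum fine of £40 million, since 4% of turnover exceeds the £17.5 million floor; a smaller business with £100 million turnover is still exposed to the full £17.5 million.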
UK Players Only: Age Appropriate Design Code
The code is in force today, but a grace period is in operation until 2 September 2021. The code applies to you if you provide “relevant information society services which are likely to be accessed by children”, as provided by section 123 of the Data Protection Act 2018 (DPA 2018). Therefore, any organisation that provides services, apps, games, or web and social media services where children are likely to have access will fall within scope.
This statutory guidance, also known as “The Children’s Code”, requires gaming sites which process personal data (whether on the basis of consent or any other lawful basis permitted under the GDPR) to consider whether they could risk the moral, physical or mental well-being of children under 18 and, if so, to put in place proportionate measures to safeguard children and young people. The code applies to any organisation which targets UK users, whether or not it is established in the EEA. So even if your organisation’s headquarters are based outside the UK, you may still be required to comply.
You need to consider the content of your games and the functionality of your site, and ask yourself if it might be harmful to children – so for example:
- Where adults can interact with minors, there is a risk of grooming, the inappropriate exchange of contact details or unacceptable conversations etc. Indeed, research shows that this is increasingly a problem between minors as well, so if your gamers can communicate with one another, you will need to take precautions to protect players under 18.
- If your gaming site could facilitate physical encounters by allowing users to communicate with one another, there may also be a physical risk if children agree to meet other people through the service.
Our opinion is that gaming sites established in the UK clearly require age verification to be in place to identify children using the site so they can be protected from harmful content.
The level of rigour required is a matter for the judgement of the sites concerned, giving consideration to the nature of the content on the site, the number of users under 18 found to be using it, and so on. But given the reputational risk if a child is harmed by your service, we recommend at least a standard level of assurance. See our page on levels of assurance for an explanation of the methods of age verification that achieve this degree of confidence in an age check.
Enforcement and penalties
As for GDPR above.
UK but with global effect: Online Safety Bill
This Bill, currently only a draft, will replace the AVMSD in the UK. It expands jurisdiction from sites established in the UK to sites worldwide that are visited by users in the UK.
The Bill was published in May 2021 and Ministers intend for it to become law in 2022. It imposes a range of legal duties on “user-to-user services”, which are defined broadly to include any functionality allowing one user to ‘encounter’ content from another user. Predominantly, this affects social media platforms, although public search engines are also in scope.
Where these services are likely to be accessed by UK children under 18, there is a specific duty to protect them from mental or physical harm. As gaming sites often allow users to interact and share content such as game play or comments, they are in scope for this new Online Safety legislation.
As many gaming sites are also “likely to be accessed by children”, they must comply with the further duties applicable. (Remember, children are defined as under 18 years old.)
The largest gaming sites may be considered Category 1 sites, with additional duties placed on them to protect adults as well.
Please read our briefing on the Online Safety Bill for further explanation of these new duties.
Enforcement and penalties
The regulator, Ofcom, can issue access restriction orders or service restriction orders, or impose an appropriate and proportionate penalty of whichever is the greater of:
- £18 million, and
- 10% of the person’s qualifying worldwide revenue.
USA: Children’s Online Privacy Protection Act (COPPA)
The Children’s Online Privacy Protection Rule seeks to put parents in control of what information commercial websites collect from their children online. It applies globally to sites providing a service to users located in the USA.
You’re covered by COPPA if:
- Your website or online service is directed to children under 13 and collects personal information from them;
- Your website or online service is directed to a general audience, but you have “actual knowledge” you’re collecting personal information from a child under 13. The FTC has said that an operator has actual knowledge of a user’s age if the site or service asks for – and receives – information from the user that allows it to determine the person’s age. For example, an operator who asks for a date of birth on a site’s registration page has actual knowledge as defined by COPPA if a user responds with a year that suggests they’re under 13. An operator also may have actual knowledge based on answers to “age identifying” questions like “What grade are you in?”; or
- You run a third-party service like an ad network or plug-in and you’re collecting information from users of a site or service directed to children under 13. Third-party sites or services may have actual knowledge under COPPA, too. For example, if the operator of a child-directed site directly communicates to an ad network or plug-in about the nature of its site, the ad network or plug-in will have actual knowledge under COPPA. The same holds true if a representative of the ad network or plug-in recognizes the child-directed nature of the site’s content. Another way an ad network or plug-in may have actual knowledge is if a concerned parent or someone else informs a representative of the ad network or plug-in that it’s collecting information from children or users of a child-directed site or service.
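The FTC’s “actual knowledge” example above — asking for a date of birth at registration — amounts to a simple age gate. A minimal sketch, assuming a registration form that collects a date of birth (the function names are our own; this is an illustration, not legal advice):

```python
# Illustrative COPPA age gate: once a registration form asks for and
# receives a date of birth, the operator has "actual knowledge" of the
# user's age, so a user under 13 triggers the verifiable parental
# consent requirement before any personal information is collected.

from datetime import date
from typing import Optional

def age_on(dob: date, today: date) -> int:
    """Full years elapsed between dob and today."""
    years = today.year - dob.year
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1  # birthday has not yet occurred this year
    return years

def requires_parental_consent(dob: date, today: Optional[date] = None) -> bool:
    """True if the user is under 13, meaning COPPA requires verifiable
    parental consent before collecting personal information."""
    today = today or date.today()
    return age_on(dob, today) < 13
```

Note that a site cannot simply ignore an under-13 answer and proceed; once the date of birth is received, the actual-knowledge standard is met and the parental-consent obligations apply.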
Websites and online services covered by COPPA must post privacy policies, provide parents with direct notice of their information practices, and get verifiable consent from a parent or guardian before collecting personal information from children.