Legislation

We are increasingly asked how the myriad pieces of EU and UK legislation requiring age restrictions online all fit together. This article seeks to explain. We do so primarily by considering the UK, which is leading the way in the SafetyTech sector and its associated legislation and regulation, but it is already apparent that the EU is mirroring many, if not all, of these measures to some degree.

So, first of all, which are the applicable laws when considering online age restrictions?

Contract Law

It is generally not possible for minors under 18 to agree to be contractually bound without a parent’s guarantee. Once a child is 7, they can enter into a contract, but the law assumes they cannot understand it, so they are not bound by it (except for contracts for education or necessities which benefit the child) and can cancel it at any time up to their 18th birthday and for a reasonable period thereafter. So, contracts formed with minors are normally both unenforceable and voidable. This includes accepting a website’s terms and conditions: even when a 13-year-old opens a social media account, they do not irrevocably agree to its terms, e.g. granting the platform a worldwide licence to re-use their content.
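To make those age bands concrete, here is a minimal sketch in Python of the general rule as described above. The function name and return labels are our own, it simplifies the law considerably, and it is not legal advice.

    # Illustrative sketch only: a simplified model of the contract-law age bands
    # described above. Function and labels are hypothetical; not legal advice.
    def contract_status(age: int, is_necessity_or_education: bool = False) -> str:
        if age >= 18:
            return "binding"                              # adults are bound as normal
        if age < 7:
            return "no capacity to contract"              # too young to enter a contract
        if is_necessity_or_education:
            return "binding"                              # exception for contracts benefiting the child
        return "unenforceable and voidable by the minor"  # can be cancelled up to 18 and for a period after

    print(contract_status(13))   # -> "unenforceable and voidable by the minor"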

Various legacy legislation on the sale of age-restricted goods applying online

These laws apply to alcohol, vaping products, cigarettes, etc., and were created for conventional, physical sales but also apply when sales are made online. Some regulators have offered clarity on how these laws are to be applied online, but often there is less certainty, and retailers must form their own judgement until the courts have the opportunity to clarify the law in a digital context.

Various legacy legislation on access to age-restricted online services

These laws apply to online services such as gambling or computer games. Notably, in the UK, computer games are currently age-restricted when sold as physical items but not when downloaded online. (There are also voluntary prohibitions on minors, such as that adopted by the Online Dating Association, but these do not have the force of law.)

Advertising restrictions

Increasingly, advertisements for certain goods or services, such as gambling or high fat, sugar and salt (HFSS) foods, are subject to age restrictions. These can be enforced by regulators, such as the Gambling Commission, or through self-regulation, as is the case with the Advertising Standards Authority. Some rules aim to keep the proportion of children seeing an ad below a specified level, e.g. 25% of the audience, which presents the challenge of both targeting ads predominantly at adults and monitoring compliance.
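Monitoring compliance with an audience-proportion rule of this kind is, at its core, a simple arithmetic check. The Python sketch below uses the 25% example figure from above; the data structure and field names are hypothetical assumptions.

    # Illustrative sketch: checking an audience-proportion rule such as the 25% example.
    # The impression data and its shape are hypothetical assumptions.
    def child_audience_share(impressions_by_age: dict) -> float:
        """Fraction of ad impressions served to under-18s."""
        total = sum(impressions_by_age.values())
        children = sum(n for age, n in impressions_by_age.items() if age < 18)
        return children / total if total else 0.0

    campaign = {16: 120, 17: 80, 25: 400, 34: 300, 45: 100}   # impressions per viewer age
    share = child_audience_share(campaign)
    print(f"Child share: {share:.1%} - {'breach' if share > 0.25 else 'within the 25% limit'}")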

General Data Protection Regulations

These EU-wide and UK rules require special care to be taken when processing the personal data of children under 18, whatever the legal basis for processing may be.

There is also a very specific requirement for parental consent before younger children can give permission (under Article 8) for the processing of their data. This ‘digital age of consent’ varies between Member States of the EU; in the UK, for example, it is 13. (Note that consent is only one of a number of legal bases for processing personal data, so this is not a general requirement for parental approval before the data of younger children can be processed – organisations can also rely on performance of a contract, a legitimate interest, a vital interest, a legal requirement, or a public interest.)
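Expressed as code, the Article 8 check is a simple threshold comparison per jurisdiction. In the Python sketch below, the UK age of 13 comes from the text and 16 is the GDPR default that applies where a Member State has not set a lower age; the names and structure are illustrative assumptions, and the check only matters where consent is the legal basis being relied upon.

    # Illustrative sketch of the Article 8 'digital age of consent' check.
    # Only relevant where consent is the legal basis being relied upon.
    DIGITAL_AGE_OF_CONSENT = {"UK": 13}   # extend with each Member State's chosen age (13-16)
    GDPR_DEFAULT_AGE = 16                 # applies where no lower national age has been set

    def needs_parental_consent(age: int, jurisdiction: str) -> bool:
        """True if this user's own consent must be authorised by a parent."""
        threshold = DIGITAL_AGE_OF_CONSENT.get(jurisdiction, GDPR_DEFAULT_AGE)
        return age < threshold

    print(needs_parental_consent(13, "UK"))   # False: 13-year-olds can consent for themselves in the UK
    print(needs_parental_consent(13, "IE"))   # True under the 16 default used here for illustration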

The Audio-Visual Media Services Directive

Online Video Sharing Platforms (VSPs) are required to have in place measures that are appropriate to protect minors from content which may impair their physical, mental or moral development.

VSPs must establish and operate systems for obtaining assurance as to the age of potential viewers. VSP providers must ensure that restricted material with the most potential to harm the physical, mental or moral development of children is subject to the strictest access control measures.
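One way to read the requirement that the most harmful material attract the strictest controls is as a tiered mapping from content risk to the strength of age assurance applied. The Python sketch below is purely illustrative; the tier names and measures are our own assumptions, not categories defined by the Directive or by Ofcom.

    # Purely illustrative: a tiered mapping from content risk to age-assurance strength.
    # Tier and measure names are hypothetical, not taken from the Directive or Ofcom guidance.
    ACCESS_CONTROLS = {
        "general":    "no restriction",
        "teen":       "self-declaration plus parental controls",
        "adult":      "age estimation or a verified age check",
        "restricted": "hard age verification before any access",   # most harmful -> strictest
    }

    def required_measure(content_tier: str) -> str:
        # Unknown tiers default to the strictest control rather than the weakest.
        return ACCESS_CONTROLS.get(content_tier, "hard age verification before any access")

    print(required_measure("restricted"))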

This began life in 1989 as the “Television without Frontiers” directive, and was renamed in 2008, applying a requirement for age verification to linear television channels with adult content. In 2018 it was extended to cover online VSPs such as YouTube. EU directives have to be transposed into the domestic law of Member States before they are effective; only four did so by the deadline of 19 September 2020, but others are progressively catching up under pressure from the European Commission. The UK has put the directive into law, but its regulator, Ofcom, is still consulting on how it will enforce it.

Age-Appropriate Design Code (also known as the Children’s Code)

This is an interpretation by the UK Information Commissioner of the measures in the GDPR which relate to children under 18. The Irish Data Protection Commission has recently published a similar document which, given how many major platforms choose to establish their principal European operations in Ireland, will effectively extend the same concepts across the EU.

Online Safety Bill (UK, not yet law)

This Bill was published in May 2021 and Ministers intend for it to become law in 2022. It imposes a range of legal duties on “user-to-user services”, which are defined broadly to include any functionality allowing one user to encounter content from another user. Predominantly this affects social media platforms, although public search engines are also in scope. Where these services are likely to be accessed by UK children under 18, there is a specific duty to protect them from mental or physical harm.

When do these laws apply?

There are few clear dates in the timetable for the introduction of the new laws described above. Once passed, regulators will often take a year or more to draft and consult on guidance before progressively enforcing them, perhaps offering a grace period to give services time to make the changes required for compliance.

Our best estimates for forthcoming legislation are set out below.

Some jurisdictions are more important than others. We’ve mentioned Ireland, but Cyprus is also critical for the AVMSD given how many of the largest adult websites are based there.

The big picture

The illustration below sums up the emerging picture of the future regulatory ecosystem once these measures are all in force.

On this basis, we confidently predict that by 2023, new legislation in the UK and across the EU will make independent, standards-based age checks a foundation of internet safety. Age checks will need to be applied to any digital service which has the potential to cause harm to children.

The AVPA is leading the design of a pan-European infrastructure for parental consent and age verification, as a pilot project funded by the European Commission at the request of the European Parliament. This interoperable network of providers, www.euCONSENT.eu, will allow such checks to be made with little or no impact on the user experience.
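To illustrate what ‘interoperable, with little or no impact on the user experience’ can mean in practice, the Python sketch below shows a generic pattern in which a service accepts an age-check token issued earlier by any provider in a trusted network instead of running a fresh check. It is a hypothetical pattern for illustration only, not the euCONSENT specification.

    # Hypothetical illustration of interoperable age checks: accept a token issued earlier
    # by any provider in a trusted network rather than re-checking the user.
    # This is NOT the euCONSENT specification, just a generic token-reuse pattern.
    from dataclasses import dataclass
    from typing import Optional

    TRUSTED_PROVIDERS = {"provider-a", "provider-b"}   # illustrative network members

    @dataclass
    class AgeToken:
        issuer: str       # which age-verification provider issued the token
        over_18: bool     # the only attribute shared; no identity data

    def admit_user(token: Optional[AgeToken]) -> str:
        if token and token.issuer in TRUSTED_PROVIDERS and token.over_18:
            return "admit without a new check"          # re-use the earlier check: no friction for the user
        return "redirect to an age-verification provider"

    print(admit_user(AgeToken(issuer="provider-a", over_18=True)))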