Video-Sharing Platforms

“Some of the biggest risks to children arise on social media platforms, video and music streaming services, and video gaming platforms. In these sectors, children’s personal data is used and shared to shape content, recommendations, and features. This can expose them to inappropriate advertising, unsolicited contact, and design choices that encourage excessive use. These practices can create physical, emotional, psychological, and financial harms.”

Stephen Bonner, Executive Director, Regulatory Futures and Innovation
Information Commissioner’s Office

Why video-sharing platforms are high risk

Video-sharing platforms combine mass distribution of content with recommendation systems, social interaction, advertising, and the monetisation of attention. Where children are present, this creates heightened risks, including exposure to harmful material, grooming, peer-to-peer abuse, and the algorithmic amplification of distressing or extreme content.

Unless a service can demonstrate that all content and functionality are suitable for users of any age, operators should assume that children are likely to access the service and design accordingly. In practice, this generally requires age assurance.
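
As a minimal sketch of that threshold decision, assuming a hypothetical helper function and an illustrative input (this is not a regulatory test):

```python
# Hypothetical sketch of the access-assumption test described above.
def age_assurance_required(all_content_suitable_for_any_age: bool) -> bool:
    # Unless the service can demonstrate that everything on it is
    # suitable for users of any age, assume children are likely to
    # access it and apply age assurance.
    return not all_content_suitable_for_any_age
```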

UK legal framework

Online Safety Act 2023

The Online Safety Act 2023 is now the primary regulatory framework for video-sharing platforms in the UK, replacing the UK video-sharing platform regime derived from the Audiovisual Media Services Directive (AVMSD).

It applies to user-to-user services and video-sharing platforms that are accessible from the UK, regardless of where they are established.

Where a service is likely to be accessed by children, it must:

• Assess the risk of harm to under-18s
• Prevent children from encountering content that is harmful to them
• Use age assurance where necessary to apply protections effectively
• Ensure that content restricted to adults is not accessible to children

Age assurance is not optional where it is required to distinguish adults from children in order to apply safeguards.

The regulator, Ofcom, has powers to issue enforcement notices, impose access or service restriction orders, and levy fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher.
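
As a worked illustration of how that cap is computed (the revenue figure below is hypothetical):

```python
# The Online Safety Act cap: the higher of £18 million and 10% of
# qualifying worldwide revenue.
def max_osa_fine_gbp(qualifying_worldwide_revenue_gbp: float) -> float:
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue_gbp)

# A platform with £500m in qualifying worldwide revenue faces a cap
# of £50m, since 10% of revenue exceeds the £18m floor.
assert max_osa_fine_gbp(500_000_000) == 50_000_000
```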

UK GDPR and children’s data

Under UK GDPR, children’s personal data requires specific protection. Personal data includes identifiers such as IP addresses, device identifiers, and behavioural data.

If you rely on consent as the lawful basis for processing personal data, Article 8 requires that users are old enough to give valid consent. In the UK, the digital age of consent is 13. Below that age, consent must be given or authorised by a holder of parental responsibility, and the controller must make reasonable efforts to verify this.
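
A minimal sketch of this consent gating, assuming the user's age has already been assured and using a hypothetical helper:

```python
UK_DIGITAL_AGE_OF_CONSENT = 13  # UK GDPR Article 8

def consent_route(age_years: int) -> str:
    # Hypothetical helper: which consent route applies when consent
    # is the lawful basis for processing under UK GDPR.
    if age_years >= UK_DIGITAL_AGE_OF_CONSENT:
        return "user_consent"
    # Below 13: consent given or authorised by a holder of parental
    # responsibility, with reasonable efforts to verify it.
    return "verified_parental_consent"
```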

Regulators are particularly concerned about:

• Profiling and recommendation systems applied to children
• Targeted or personalised advertising using children’s data
• Data-driven amplification of harmful or extreme video content

Age Appropriate Design Code

The Age Appropriate Design Code, also known as the Children’s Code, applies to any online service that processes personal data and is likely to be accessed by children under 18.

For video-sharing platforms, this includes obligations to:

• Assess whether content, algorithms, and social features pose risks to children
• Apply proportionate safeguards based on age and risk
• Minimise nudging, autoplay, and engagement-maximising design where children are present

The Code is fully in force and actively enforced by the ICO.
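
A sketch of how those obligations might translate into defaults for a user assessed as under 18; the setting names are illustrative assumptions, not terms defined by the ICO:

```python
# Hypothetical child-safe defaults reflecting the Code's expectations
# on nudging, autoplay, and engagement-maximising design.
def default_settings(is_child: bool) -> dict:
    if not is_child:
        return {"autoplay": True, "personalised_recommendations": True,
                "engagement_nudges": True}
    return {
        "autoplay": False,                      # limit autoplay
        "personalised_recommendations": False,  # high-privacy default
        "engagement_nudges": False,             # no excessive-use nudges
    }
```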

What this means in practice

Video-sharing platforms serving the UK market should assume that children are present unless they have strong evidence otherwise. Identifying child users is therefore a prerequisite to compliance with both the Online Safety Act and UK data protection law.

Age assurance enables platforms to (see the sketch after this list):

• Apply stricter access controls to adult or harmful content
• Adjust recommendation systems and defaults for children
• Disable or limit social and messaging features where appropriate
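
A sketch of that gating, assuming a hypothetical age-band signal from an age assurance provider; the feature names are illustrative, not drawn from the Act or the Code:

```python
from enum import Enum

class AgeBand(Enum):
    UNKNOWN = "unknown"   # no assurance signal: treat as a child
    CHILD = "under_18"
    ADULT = "18_plus"

def safeguards(band: AgeBand) -> dict:
    # Hypothetical mapping from an assured age band to safeguards.
    treat_as_child = band is not AgeBand.ADULT
    return {
        "adult_content_accessible": not treat_as_child,
        "child_safe_recommendations": treat_as_child,
        "direct_messaging_enabled": not treat_as_child,
    }
```

Defaulting unknown users to the child-safe configuration mirrors the assumption, noted above, that children are present unless there is strong evidence otherwise.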

The appropriate level of rigour depends on the nature of the content, functionality, and audience. However, given regulatory expectations and reputational risk, we generally recommend at least a standard level of age assurance for video-sharing platforms hosting user-generated content or adult-only material.

European Union

Under the Audiovisual Media Services Directive, video-sharing platforms must take appropriate measures to protect minors from content that may impair their physical, mental, or moral development. This includes age verification or equivalent access controls for the most harmful material.

In parallel, the General Data Protection Regulation requires parental consent for processing children’s data below the national digital age of consent, which varies between 13 and 16 across Member States.
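
A sketch of how a platform might parameterise this by jurisdiction; the values below are an illustrative subset of commonly cited national ages and should be verified against current national law:

```python
# Selected national digital ages of consent under GDPR Article 8
# (illustrative; verify each value before relying on it).
DIGITAL_AGE_OF_CONSENT = {
    "BE": 13, "DK": 13, "SE": 13,   # the lowest age the GDPR permits
    "AT": 14, "ES": 14, "IT": 14,
    "FR": 15,
    "DE": 16, "IE": 16, "NL": 16,   # the GDPR default of 16
}

def needs_parental_consent(country_code: str, age_years: int) -> bool:
    # Fall back to 16, the GDPR baseline, for states not listed here.
    return age_years < DIGITAL_AGE_OF_CONSENT.get(country_code, 16)
```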

United States

The Children’s Online Privacy Protection Act (COPPA) applies where services are directed to children under 13, or where operators have actual knowledge that they are collecting personal information from under-13s in the US. Verifiable parental consent is required in such cases.

PLEASE NOTE
This website does not constitute legal advice. You should always seek independent legal advice on compliance matters.
