“Some of the biggest risks to children arise on social media platforms, video and music streaming services, and video gaming platforms. In these sectors, children’s personal data is used and shared to shape content, features, and recommendations. This can expose them to inappropriate advertising, unsolicited contact, and design choices that encourage excessive use. These practices can create physical, emotional, psychological, and financial harms.”
Stephen Bonner, Executive Director, Regulatory Futures and Innovation
Information Commissioner’s Office
Why music streaming services are in scope
The ICO has explicitly identified music and audio streaming services as potentially high-risk where they process children’s data, personalise content, enable interaction, or monetise attention. Unless there is clear evidence that children are not likely to access a service, operators should assume that they do and design accordingly.
In practice, this means that many UK-facing music streaming services are already expected to identify child users and apply proportionate safeguards. In most cases, this requires some form of age assurance.
UK GDPR and children’s data
Under the UK General Data Protection Regulation, children merit specific protection, particularly where services rely on profiling, recommendation engines, behavioural analytics, or advertising.
If you rely on consent as a lawful basis for processing personal data, Article 8 of the UK GDPR requires that the user is old enough to give valid consent. In the UK, the digital age of consent is 13, as set by the Data Protection Act 2018. Personal data includes identifiers such as IP addresses, device IDs, and usage data.
If a user is under 13, you must obtain verifiable parental consent before processing their personal data on the basis of consent. Regulators are especially concerned about the use of children’s data for targeted or personalised advertising.
Even where consent is not the lawful basis, the processing of children’s data must still be fair, proportionate, and demonstrably in the child’s best interests.
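The consent rule above reduces to a simple gate. The sketch below is purely illustrative (the function and field names are assumptions, not part of any real compliance library), and a real implementation would also need to record evidence of how parental consent was verified:

```python
# Sketch of the UK GDPR Article 8 consent gate described above.
# All names here are illustrative assumptions; in practice this logic
# would sit inside a sign-up flow alongside age assurance checks.

UK_DIGITAL_AGE_OF_CONSENT = 13  # set by the Data Protection Act 2018

def may_process_on_consent(age: int, has_parental_consent: bool = False) -> bool:
    """Return True if consent can be a valid lawful basis for this user."""
    if age >= UK_DIGITAL_AGE_OF_CONSENT:
        return True  # the user can give valid consent themselves
    # Under 13: verifiable parental consent is required before any
    # consent-based processing of personal data
    return has_parental_consent
```

Note that passing this gate only validates consent as a lawful basis; the fairness and best-interests requirements described above still apply.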
Enforcement and penalties
The ICO has powers including assessment notices, enforcement notices, and administrative fines. For serious breaches, fines can reach £17.5 million or 4% of global annual turnover, whichever is higher.
UK-only: Age Appropriate Design Code
The Age Appropriate Design Code, also known as the Children’s Code, is statutory guidance issued by the ICO. It applies to any online service likely to be accessed by children under 18 that processes personal data.
It requires services to:
• Assess whether their content, features, and data practices pose risks to children
• Apply proportionate safeguards based on age and risk
• Minimise data use, profiling, and nudging where children are involved
For music streaming services, relevant risks may include exposure to harmful audio content, recommendation systems that amplify distressing themes, and messaging or social features that enable inappropriate contact or grooming.
The ICO now actively enforces the Code.
UK with global effect: Online Safety Act 2023
The Online Safety Act 2023 imposes legally binding duties on online services that allow users to encounter content from others. This includes many gaming platforms, social features within streaming services, and any service with user-to-user interaction.
Where a service is likely to be accessed by children, it must:
• Assess the risk of harm to under-18s
• Prevent children from encountering content that is harmful to them
• Use age assurance where necessary to apply protections effectively
The Act applies to services globally if they have a significant number of UK users or target the UK market.
The regulator, Ofcom, can impose access restriction orders and fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher.
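Both regimes cap fines at the higher of a fixed sum and a percentage of turnover or revenue, so for large services the percentage dominates. A worked illustration (the turnover figures are hypothetical):

```python
# Maximum-fine caps under the two regimes described above: the higher of
# a fixed sum and a percentage of turnover/revenue. Input figures in any
# example are hypothetical.

def ico_max_fine(global_annual_turnover: float) -> float:
    """UK GDPR cap: the higher of £17.5m or 4% of global annual turnover."""
    return max(17_500_000, 0.04 * global_annual_turnover)

def ofcom_max_fine(qualifying_worldwide_revenue: float) -> float:
    """Online Safety Act cap: the higher of £18m or 10% of qualifying
    worldwide revenue."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue)
```

For a hypothetical service with £1bn in turnover and revenue, the caps would be £40m (ICO) and £100m (Ofcom); for a small service, the fixed sums of £17.5m and £18m apply instead.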
Our view
Music streaming services established in or targeting the UK should assume that children are present unless they have strong evidence to the contrary. Identifying child users is therefore a prerequisite to complying with UK data protection law and online safety duties.
The appropriate level of age assurance depends on risk, content, and user demographics. However, given regulatory expectations and reputational risk, we generally recommend at least a standard level of age assurance for services offering personalised or potentially harmful audio content.
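One way to reason about this risk-based recommendation is as a mapping from service risk factors to an indicative assurance tier. The tier names ("basic", "standard", "enhanced") and the triggering criteria below are purely illustrative assumptions, not terms defined by the ICO or Ofcom:

```python
# Illustrative sketch only: the tiers and criteria are assumptions made
# for this example, not a regulatory standard. A real assessment would
# follow a documented risk assessment, not three boolean flags.

def recommended_assurance_level(personalised_content: bool,
                                user_interaction: bool,
                                potentially_harmful_content: bool) -> str:
    """Map service risk factors to an indicative age assurance tier."""
    if potentially_harmful_content and user_interaction:
        return "enhanced"
    if personalised_content or potentially_harmful_content:
        return "standard"  # the baseline level recommended above
    return "basic"
```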
PLEASE NOTE
This website does not constitute legal advice. You should always seek independent legal advice on compliance matters.