Music Streaming Sites

“We have identified that currently, some of the biggest risks come from social media platforms, video and music streaming sites and video gaming platforms. In these sectors, children’s personal data is being used and shared, to bombard them with content and personalised service features. This may include inappropriate adverts; unsolicited messages and friend requests; and privacy-eroding nudges urging children to stay online. We’re concerned with a number of harms that could be created as a consequence of this data use, which are physical, emotional and psychological, and financial.” 

Stephen Bonner, Executive Director (Regulatory Futures and Innovation), UK ICO

Music streaming sites have been deemed high-risk by the UK Information Commissioner’s Office. Unless the content on your site is guaranteed to be entirely harmless to listeners of any age, you may already be obliged to apply age verification mechanisms to everyone who uses your site, with or without an account.


If you rely on consent as a basis for processing some or all of the personal data you obtain from your users, you must be sure that those users are old enough to give that consent themselves under Article 8 of the GDPR. (Remember, personal data includes even just an IP address.) In the UK, this “age of digital consent” is 13, but it varies between EU member states, so if you have users in the EU you will also need to determine their location and apply the relevant age as part of this check. See the map of the digital age of consent provided by our member PRIVO.
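The location-dependent check described above can be sketched as a simple lookup. This is illustrative only: the country codes and function names are our own, the table covers just a handful of states, and the ages shown (correct at the time of writing for these countries) must be verified against current national law before being relied upon.

```python
# Illustrative sketch of an Article 8 GDPR "digital age of consent" check.
# The ages below are examples for a handful of countries -- verify them
# against current national law before relying on them in production.
DIGITAL_AGE_OF_CONSENT = {
    "GB": 13,  # United Kingdom
    "IE": 16,  # Ireland
    "FR": 15,  # France
    "DE": 16,  # Germany
    "NL": 16,  # Netherlands
}

# GDPR sets 16 as the default where a member state has not lowered it.
DEFAULT_AGE = 16


def needs_parental_consent(user_age: int, country_code: str) -> bool:
    """Return True if consent must come from a parent or guardian
    rather than the user themselves."""
    threshold = DIGITAL_AGE_OF_CONSENT.get(country_code, DEFAULT_AGE)
    return user_age < threshold
```

Note the conservative fallback: where the country is unknown, the sketch applies the GDPR default of 16 rather than assuming the lowest threshold.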

You may simply be tracking the music selections of a guest user and using these to serve them more music that may suit their taste, and potentially to select the adverts that fund your service. But if the user concerned is under the digital age of consent, you are operating in breach of the GDPR unless you have secured parental consent for processing data in this way. Regulators are particularly concerned about the use of children’s data to target advertising.

In addition, using children’s data to select potentially harmful audio content will also be a breach of the GDPR (see Age Appropriate Design Code below).

Enforcement and penalties

Tools at the disposal of the regulator, the Information Commissioner, include assessment notices, warnings, reprimands, enforcement notices and penalty notices (administrative fines). For serious breaches of the data protection principles, there is the power to issue fines of up to £17.5 million or 4% of your annual worldwide turnover, whichever is higher.
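The fine cap quoted above is a "whichever is higher" rule, which can be expressed as a one-line calculation (the function name is ours, for illustration only):

```python
# Sketch of the GDPR maximum-fine rule: the cap is the greater of
# 17.5 million GBP or 4% of annual worldwide turnover.
def gdpr_max_fine(annual_worldwide_turnover: float) -> float:
    """Return the maximum administrative fine in GBP for a serious
    breach of the data protection principles."""
    return max(17_500_000, 0.04 * annual_worldwide_turnover)
```

So a business turning over £100 million faces a cap of £17.5 million (the fixed floor dominates), while one turning over £1 billion faces a cap of £40 million.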

USA: Children’s Online Privacy Protection Act (COPPA)

The Children’s Online Privacy Protection Rule seeks to put parents in control of what information commercial websites collect from their children online. It applies globally to sites providing a service to users located in the USA.

You’re covered by COPPA if:

  1. Your website or online service is directed to children under 13 and collects personal information from them;
  2. Your website or online service is directed to a general audience, but you have “actual knowledge” you’re collecting personal information from a child under 13.  The FTC has said that an operator has actual knowledge of a user’s age if the site or service asks for – and receives – information from the user that allows it to determine the person’s age.  For example, an operator who asks for a date of birth on a site’s registration page has actual knowledge as defined by COPPA if a user responds with a year that suggests they’re under 13.  An operator also may have actual knowledge based on answers to “age identifying” questions like “What grade are you in?”; or
  3. You run a third-party service like an ad network or plug-in and you’re collecting information from users of a site or service directed to children under 13.  Third-party sites or services may have actual knowledge under COPPA, too.  For example, if the operator of a child-directed site directly communicates to an ad network or plug-in about the nature of its site, the ad network or plug-in will have actual knowledge under COPPA.  The same holds true if a representative of the ad network or plug-in recognizes the child-directed nature of the site’s content.  Another way an ad network or plug-in may have actual knowledge: if a concerned parent or someone else informs a representative of the ad network or plug-in that it’s collecting information from children or users of a child-directed site or service.
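The “actual knowledge” trigger in point 2 above can be sketched as a date-of-birth check at registration. This is a minimal illustration under our own naming, not an FTC-prescribed implementation; the FTC’s point is simply that once you receive a date of birth showing the user is under 13, you are deemed to know it.

```python
from datetime import date


def age_from_dob(dob: date, today: date) -> int:
    """Compute age in whole years from a date of birth."""
    # Subtract one if the birthday has not yet occurred this year.
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))


def gives_actual_knowledge_under_coppa(dob: date, today: date) -> bool:
    """True if a date of birth submitted at registration shows the user
    is under 13, giving the operator "actual knowledge" for COPPA
    purposes (and so triggering the verifiable-parental-consent duty)."""
    return age_from_dob(dob, today) < 13
```

A neutral age gate asks for the full date of birth rather than a yes/no “Are you over 13?” question, which children can simply answer untruthfully.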

Websites and online services covered by COPPA must post privacy policies, provide parents with direct notice of their information practices, and get verifiable consent from a parent or guardian before collecting personal information from children.

So you may permit children under 13 to listen to music on your service, but if they are located in the USA and you learn that they are under 13, you must comply with the provisions of COPPA and secure parental consent to retain any data about these users.

UK Only: Age Appropriate Design Code

This statutory guidance, also known as “the Children’s Code”, requires online services which process personal data (whether on the basis of consent or any other lawful basis permitted under the GDPR) to consider whether they could risk the moral, physical or mental well-being of children under 18, and if so, to put in place proportionate measures to safeguard children and young people.

The Code is relatively new: the 12-month grace period preceding enforcement action ended in September 2021, and the ICO is now actively dealing with complaints about breaches of the Code.

You need to consider the content of your site, and ask yourself how any current or future content might be harmful to children – for example:

  • Is the music available on your site potentially harmful to children – for example, by promoting violence, eating disorders, self-harm or suicide?
  • Where adults can interact with minors, there is a risk of grooming and the inappropriate exchange of photographs and conversations.  Indeed, research shows that this is increasingly a problem between minors as well.
  • If a music streaming site allows users to communicate with one another, there may also be a physical risk if children agree to meet other people through the service’s chat functions.

Our opinion is that music streaming sites established in the UK clearly require age verification to be in place to identify children using the site so they can be protected from harmful audio content.

The level of rigour required is a matter for the judgement of the sites concerned, giving consideration to the nature of the audio and other content on the site, the number of users under 18 found to be using it, and so on.  But given the reputational risk if a child is harmed by your service, we recommend at least a standard level of assurance.  See our page on levels of assurance for an explanation of the methods of age verification that achieve this degree of confidence in an age check.

Enforcement and penalties

As for GDPR above.

UK but with global effect: Online Safety Bill

This Bill, expected to become law in the autumn of 2023, imposes a range of legal duties on “user-to-user services”, which are defined broadly to include any functionality allowing one user to ‘encounter’ content from another user.  Predominantly, this affects social media platforms, although public search engines are also in scope.

Where these services are likely to be accessed by UK children under 18, there is a specific duty to protect them from mental or physical harm.  As music streaming sites often allow users to interact and share content such as playlists or comments, they are in scope for this new Online Safety legislation.

As many music streaming sites are also “likely to be accessed by children”, they must comply with the further duties applicable.  (Remember, children are defined as under 18 years old.)

The largest sites may be considered Category 1 services, with additional duties placed on them to offer adults the choice of additional protection from online harms as well.

(This will replace the Audio Visual Media Services Directive in the UK.  It expands the jurisdiction from video sharing platforms established in the UK to sites globally which are visited by users in the UK.)

Please read our briefing on the Online Safety Bill for further explanation of these new duties (which will be updated when the Bill is enacted as it has been considerably amended by Parliament and is not finalised).

Enforcement and penalties

The regulator, Ofcom, can issue access restriction orders, service restriction orders, or impose an appropriate and proportionate penalty of whichever is the greater of—

  • £18 million, and
  • 10% of the person’s qualifying worldwide revenue.

Posted on October 11, 2021
