Advertising platforms are increasingly regulated indirectly, through the legal obligations imposed on the advertisers and services they support. Where products or services are age-restricted, advertising platforms are expected to ensure that ads are not delivered to children, and that children’s personal data is not used inappropriately.
This is no longer a purely contractual or ethical issue. In the UK, it is now a matter of online safety law, advertising regulation, and data protection.
“Some of the biggest risks to children arise on platforms that use data-driven systems to target content, advertising, and features. These risks include exposure to inappropriate material, unwanted contact, and design practices that encourage excessive use, with physical, emotional, psychological, and financial consequences.”
Stephen Bonner, Executive Director, Regulatory Futures and Innovation
Information Commissioner’s Office
How advertising platforms become regulated
Advertising platforms are rarely regulated in isolation. Instead, obligations arise where:
• Advertisers are legally prohibited from marketing to children
• Products or services may only be advertised to adults or predominantly adult audiences
• Advertising relies on profiling, targeting, or optimisation using personal data
• Platform design or defaults risk exposing children to restricted advertising
Common examples include gambling, alcohol, foods high in fat, salt or sugar (HFSS), dating services, financial products, and adult services.
In these cases, platforms must be able to distinguish between adults and children with sufficient confidence to ensure compliance.
UK legal framework
Online Safety Act 2023
The Online Safety Act 2023 applies to online services that allow users to encounter content from others, including advertising content delivered algorithmically.
Where advertising forms part of a service likely to be accessed by children, platforms must:
• Assess the risk of harm to under-18s arising from advertising
• Prevent children from encountering harmful or age-restricted ads
• Use age assurance where necessary to apply protections effectively
This includes paid advertising, sponsored content, influencer marketing, and promoted posts delivered through platform systems.
The regulator, Ofcom, has powers to impose access restriction orders and fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher.
Advertising standards: CAP and ASA
Under the UK advertising self-regulatory system, advertisers must not target children with, or expose them to, age-restricted advertising such as gambling or alcohol.
In practice, this places expectations on advertising platforms to provide and correctly apply:
• Age-targeting and exclusion tools
• Controls to prevent proxy targeting of children
• Placement safeguards to avoid child-appealing contexts
Failure by platforms to support compliance can result in advertiser sanctions, platform scrutiny, and referral to statutory regulators.
UK GDPR and children’s data
Advertising platforms processing personal data are subject to UK GDPR. Children’s data attracts enhanced protection.
Key points include:
• Personal data includes IP addresses, device identifiers, and behavioural signals
• Profiling and targeted advertising involving children are high-risk processing
• Where consent is relied upon, the UK digital age of consent is 13
Even where consent is not the lawful basis relied upon, platforms must be able to demonstrate that their data use is fair and proportionate and, where children are present, consistent with their best interests.
Enforcement and penalties
The ICO can issue enforcement notices and administrative fines of up to £17.5 million or 4% of global annual turnover, whichever is higher.
Age Appropriate Design Code
The Age Appropriate Design Code, also known as the Children’s Code, applies to advertising platforms where children are likely to be users.
It requires platforms to:
• Assess how ad targeting, optimisation, and nudging may affect children
• Avoid using children’s data in ways that undermine their wellbeing
• Apply age-appropriate defaults and safeguards
This applies regardless of whether the platform’s primary purpose is advertising.
What this means in practice
Advertising platforms serving the UK market should assume that children are present unless they have strong evidence to the contrary.
To comply with UK law, platforms may need to:
• Know whether an ad recipient is a child or an adult
• Apply different ad policies based on age
• Prevent delivery of restricted advertising to under-18s
• Limit or disable profiling where children are involved
The required level of assurance depends on the risk associated with the advertised product or service. Where advertising is prohibited or tightly restricted for children, self-declaration or weak signals are unlikely to be sufficient.
Given regulatory expectations and reputational risk, we generally recommend at least a standard level of age assurance for platforms serving age-restricted advertising.
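The risk-tiered approach described above can be sketched in code. This is an illustrative sketch only: the category names, assurance levels, and the policy table mapping one to the other are invented for this example and are not drawn from any statute, code, or regulator's guidance. The key idea it demonstrates is that restricted categories fail closed, so a user is served a restricted ad only if they are established as an adult at or above the minimum assurance level for that category.

```python
# Hypothetical sketch of a risk-tiered ad-serving gate.
# All names and thresholds below are illustrative assumptions,
# not a real platform policy or regulatory requirement.

from enum import IntEnum

class AssuranceLevel(IntEnum):
    NONE = 0            # no signal about the user's age
    SELF_DECLARED = 1   # user-entered date of birth only
    STANDARD = 2        # e.g. age estimation or corroborating signals
    VERIFIED = 3        # e.g. document-based verification

# Hypothetical policy table: higher-risk categories demand stronger assurance.
MIN_ASSURANCE = {
    "gambling": AssuranceLevel.VERIFIED,
    "alcohol": AssuranceLevel.STANDARD,
    "hfss_food": AssuranceLevel.STANDARD,
    "general": AssuranceLevel.NONE,
}

def may_serve_ad(category: str, user_is_adult: bool, level: AssuranceLevel) -> bool:
    """Serve a restricted ad only to users established as adults
    at or above the category's minimum assurance level."""
    # Unknown categories fail closed to the strictest requirement.
    required = MIN_ASSURANCE.get(category, AssuranceLevel.VERIFIED)
    if required == AssuranceLevel.NONE:
        return True
    return user_is_adult and level >= required
```

Under this sketch, a self-declared adult would not receive a gambling ad (`may_serve_ad("gambling", True, AssuranceLevel.SELF_DECLARED)` is `False`), reflecting the point above that self-declaration is unlikely to be sufficient for tightly restricted advertising.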
PLEASE NOTE
This website does not constitute legal advice. You should always seek independent legal advice on compliance matters.