The “Parents Over Platforms Act” Has Gotten Its Name Backwards

April 24, 2026

The Parents Over Platforms Act was named backwards. It claims to put parents in control. The irony is that the platforms this proposed US federal bill targets need do almost nothing new to comply with it, leaving parents no better off than today — while the Big Tech app store operators (primarily Apple and Google) that control access to almost every phone and tablet quietly gain yet more data and power. It should have been called the Platforms Over Parents Act.

Recent multi-million-dollar judgments against large digital services have revealed that platforms have failed to protect children from an array of harms. Those cases succeeded because platforms could point to no statutory safe harbour — no prescribed compliance process that discharged their duty of care. Silicon Valley will be delighted that this bill would change that — not by making platforms more accountable, but by creating exactly that shield, built on weak, unverified self-declaration. As drafted, it requires almost nothing genuinely new from app stores or the developers and platforms that use them, shields them from accountability when things go wrong, and uses federal preemption to prevent states from doing better.

The Age Verification Providers Association — whose members collectively perform more than one billion age checks annually under strict regulatory obligations — believes good federal legislation on this issue is both necessary and achievable. But this bill, as written, is not it. We urge the Senate to look carefully at what POPA actually requires before concluding it delivers what its title promises.

The key problems in plain terms:

  • It does not require age verification — only age declaration. Asking a user their age is mandatory. Actually checking it is entirely optional. A child can inflate their age and nothing in this bill stops them. (Newsflash: kids lie.)
  • It gives app stores a liability shield. If an age signal is wrong and a child is harmed, the app store is protected provided it made a “good faith effort.” The same protection flows to developers – social media platforms, AI chatbots and others – who relied on that signal. Nobody is left accountable.
  • It requires app stores to do almost nothing new. They already ask users their age at account setup, already offer parental controls, already share some age data. This bill codifies what they do today and calls it child protection, creating a dangerous false sense of safety.
  • Most social media platforms may not be covered at all. Developer obligations only apply to apps that proactively offer different experiences by age. A platform that treats all users identically – or quickly changes how it operates to do so – faces zero obligations under this bill, regardless of the harm it causes to minors.
  • Websites without an associated app are entirely outside its scope. A standalone website is simply not covered, however harmful its content. A child blocked from an app can open a browser and access identical content without restriction. Protections should be placed as close as possible to high-risk services, not upstream in the app store where they can be bypassed with a single tap.
  • It preempts every state law. California, Colorado, Texas, Utah and others have enacted or are developing age assurance frameworks. This bill would sweep all of them away and replace them with a weaker federal standard that platforms can satisfy without changing anything they do today.
  • It creates a new hidden surveillance risk. For an age signal to carry legal weight and support the safe harbour the bill creates, app stores will have a strong incentive to anchor those signals in hardware attestation. That would entrench them as the critical gatekeepers for access to age-restricted services across the entire app ecosystem, and create new, unregulated opportunities to track and profile users’ behaviour across every age-restricted service they access.

We will know how weak this structure is only after it has replaced more effective laws and mechanisms. By then, the evidence will take the form of more children harmed.

Why this matters for enforcement

Under POPA, once a developer receives an app store age signal saying a user is an adult, the bill designates that as a commercially reasonable effort at age determination. The behavioural signals that have succeeded in recent enforcement actions would need to clear a much higher bar to override it. POPA does not strengthen accountability for harmful platforms. It gives them a federal shield to rival Section 230 at precisely the moment accountability matters most.

The hidden surveillance consequence

There is a further consequence that has received little attention. For an age signal to carry legal weight and to support the safe harbour this bill creates, app stores (primarily Apple and Google) will have a strong incentive to anchor those signals in hardware attestation: a cryptographic process that ties the age signal to a specific device and account. This makes app stores the critical infrastructure through which access to age-restricted services must flow.

The bill places no constraints on how long app stores may retain records of those signal requests, how they may be used, or what patterns of behaviour they reveal over time. Every covered app a user opens, every age-related transaction they complete, becomes a data point held by the app store operator with no regulatory limit on its use. Unlike a browser cookie, this cannot be cleared. The same hardware-bound credential that vouches for your age to one service can allow your activity to be linked across every age-restricted service on the platform – a form of persistent tracking that is a direct consequence of centralising age assurance inside a few global Big Tech companies.

The bill does not just give app stores a central role in age assurance; it potentially gives them a permanent, detailed record of their users’ activity across every age-restricted service on their platform, with no obligation to account for what they do with it.

What genuine child protection looks like

Age verification that actually works exists today. Independent, regulated providers perform more than a billion age checks annually under strict data protection obligations. They verify age accurately against real documents or trusted data sources and deliver a reusable token that proves a user’s eligibility without exposing their identity to the platform, and without routing every transaction through an app store account.

That model keeps platforms fully accountable for who they admit. It does not create a centralised app store infrastructure at the heart of every age-related transaction. It does not give app stores a persistent record of user activity across age-restricted services. And it does not give the largest gatekeepers in the digital economy a compliance shield built on unverified self-declaration.

The federal preemption problem

Perhaps the most significant issue is one that has received little attention in the debate around this bill. Section 203 of POPA preempts all state laws related to its provisions. If this bill passes, it removes the stronger obligations that state legislatures seek to impose and replaces them with a federal floor so low that the largest platforms in the world can clear it without changing their behaviour in any meaningful way.

Legislators who care about federalism and states’ rights should look carefully at this provision. A bill that uses federal authority to prevent states from protecting their own children more effectively than Washington is prepared to do is not a child safety bill. It is a platform protection bill with a child safety label.

Our position

The Age Verification Providers Association supports robust federal age assurance legislation, and thinks carefully before opposing any measure that may help protect children. We believe a national framework, done well, would be better than a patchwork of inconsistent state laws. But done well means requiring genuine age assurance certified to international standards, not just declaration. It means ensuring accountability aligns with control over risk. It means creating space for independent, privacy-preserving, interoperable solutions rather than consolidating every age-related transaction inside app stores. It means constraining what operating systems can retain and learn from the age signal infrastructure this bill would make them responsible for. And it means not using federal preemption to lower the bar that states have already set.

POPA as drafted does none of these things, and because it would delay or replace laws that are more effective at keeping children safe online, it puts them at greater risk. The AVPA therefore opposes this bill, but stands ready to work with sponsors and committee staff on a version that genuinely delivers what its title promises.