In the United States, a wave of federal bills is attempting to redefine how children, teens, and their parents interact with the digital world. Each proposal takes a different approach to online safety, privacy, and platform responsibility. What varies even more sharply is the standard of age assurance each bill relies upon. Some require verified knowledge, others accept parental attestation or platform metadata, and most continue to operate on self-declared age rather than evidence.
* This review is especially timely, as the House Committee on Energy and Commerce’s Subcommittee on Commerce, Manufacturing, and Trade is scheduled to hold a legislative hearing on several of these measures – including the SCREEN Act, the App Store Accountability Act, and the Kids Online Safety Act – on Tuesday, December 2, 2025.
The SCREEN Act*
The SCREEN Act stands apart from the bills that follow because it requires pornography websites to verify that users are 18 or older. This is a true age-verification obligation built on a verified-knowledge standard: pornographic sites may not rely on self-declared age. Unlike the other bills, the SCREEN Act places a clear legal duty on providers to determine age with evidence.
The App Store Accountability Act*
The App Store Accountability Act aims to tighten governance around how children download apps, moving responsibility upstream to Apple and Google. Its central requirement is that app stores create a parental account system with meaningful oversight of child accounts. It also ensures that terms and conditions are agreed to by adults, not by children who cannot legally enter into such contracts. Crucially, the Act insists that the parental account holder must be verified as at least 18. This makes it one of the few federal proposals that explicitly requires age verification (for the adult, not the child). Children’s ages may still be based on parental attestation and platform metadata, so the standard is a mixture of verified adult knowledge and implied knowledge for minors.
The Kids Online Safety Act*
The Kids Online Safety Act has become one of the most debated pieces of digital-safety legislation. It imposes a duty of care on platforms to mitigate risks to minors and sets defaults for safety and parental controls. But it sidesteps the question of how platforms should know who is a minor. In practice, platforms can rely on self-declared age and family-account metadata, with no requirement to verify age for either the parent or the child. The standard is therefore implied knowledge unless a service has actual knowledge that a user is under 17, and there is no penalty for willful disregard unless the platform has been clearly put on notice.
The Kids Off Social Media Act
The Kids Off Social Media Act takes a different route. It seeks to ban social media accounts for children under 13 and prohibit algorithmic feeds for anyone under 17. However, KOSMA does not introduce any duty to verify a young person’s age. Platforms may rely on self-declared dates of birth, family-account structures, and whatever parental attestation they already use. The underlying standard is therefore actual knowledge: only when a platform actually knows a user is underage must it act. There is no requirement for age assurance, and platforms are not placed under a more testing ‘willful disregard of age information’ standard.
KOSMA’s companion provision, which conditions telecommunications subsidies on schools and libraries blocking access to social media, avoids the age question entirely. These filtering obligations apply regardless of whether the person trying to access the platform is a child or an adult. No knowledge standard is required because no age determination is needed; the responsibility rests entirely on the institution’s network policies.
COPPA 2.0
COPPA 2.0, an update to the long-standing Children’s Online Privacy Protection Act, raises the protected age from under 13 to under 17 and expands privacy rights for teens (defined as ages 13 to under 17). Yet it preserves the fundamental structure of verifiable parental consent. Under COPPA, these consent methods seek to establish parental authority, not parental age. A parent may complete the verification steps without ever proving they are an adult. The child’s age is typically self-declared unless conflicting information emerges. As a result, the standard remains actual knowledge (which may be triggered when contradictory facts surface), with no requirement for age verification at any stage.
Key Takeaways
Taken together, these bills span the full spectrum of American legislative thinking on age assurance. Only the SCREEN Act and the App Store Accountability Act create verified-knowledge requirements. The others rely on a mix of self-declaration and parental attestation, with duties triggered only when a platform has actual knowledge that a user is underage – an approach that has to date proven wholly ineffective.
For policymakers, platforms, and parents, this distinction matters. Without a requirement for age assurance, the effectiveness of safety obligations will depend almost entirely on what users claim about their age rather than what platforms know or should know.
As of November 27, 2025, these bills remain active in the 119th Congress, with the December 2 hearing poised to advance discussions. For the latest status, visit Congress.gov.