
USA: A Look at Key Kids Online Safety Federal Bills

In the United States, a wave of federal bills is attempting to redefine how children, teens, and their parents interact with the digital world. Each proposal takes a different approach to online safety, privacy, and platform responsibility. What varies even more sharply is the standard of age assurance each bill relies upon. Some require verified knowledge, others accept parental attestation or platform metadata, and most continue to rely on self-declared age rather than evidence.

The debate has intensified following recent legislative activity in the House of Representatives, including committee mark-ups of several proposals that would significantly reshape the regulatory landscape for both adult content services and social media platforms.

Notably, these proposals introduce the concept of “technology verification measures” into federal legislation, explicitly recognising age assurance technologies as a legitimate regulatory tool and directing federal agencies to evaluate their effectiveness, privacy protections and security characteristics. This represents the first time such terminology has appeared in proposed federal law.

Together, the bills illustrate the emerging spectrum of policy approaches in the United States.

 

The SCREEN Act

The SCREEN Act stands apart from most other proposals because it requires pornography websites to verify that users are 18 or older before granting access. This is a true age-verification obligation based on a verified-knowledge standard: sites may not rely on self-declared age, and providers must determine a user's age using evidence.

Unlike several other proposals that focus on platform duties or parental oversight, the SCREEN Act places a direct legal obligation on adult-content providers to determine user age. In effect, it mirrors the regulatory model now emerging across multiple U.S. states that require age verification for commercial pornography websites.

 

Kids Internet and Digital Safety Act (KIDS Act)

The Kids Internet and Digital Safety Act (HR 7757) focuses on platforms where a substantial portion of content is harmful to minors. The bill defines “covered platforms” as publicly accessible websites or services where more than one-third of available material consists of sexual material harmful to minors, and where the operator knowingly makes that material available.

Key elements include:

  • Recognising and evaluating the use of “technology verification measures”, including age-verification technologies

  • Directing federal agencies to study the effectiveness, privacy and security implications of such technologies

  • Applying obligations primarily to commercial pornography platforms meeting the one-third threshold
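The one-third threshold above can be sketched as a simple test. This is a hypothetical illustration only: the bill does not specify how the proportion of material is to be measured, so comparing counts of items, as below, is just one possible reading.

```python
# Hypothetical sketch of the KIDS Act "covered platform" test.
# The measurement method (counting items) is an assumption; the bill
# does not define how the one-third proportion is assessed.
from fractions import Fraction


def is_covered_platform(harmful_items: int, total_items: int,
                        knowingly_made_available: bool) -> bool:
    """A platform is 'covered' if more than one-third of available
    material is sexual material harmful to minors AND the operator
    knowingly makes that material available."""
    if total_items == 0:
        return False
    exceeds_threshold = Fraction(harmful_items, total_items) > Fraction(1, 3)
    return exceeds_threshold and knowingly_made_available


# 40% of material harmful, knowingly made available -> covered
covered = is_covered_platform(40, 100, True)
# 30% of material harmful -> below the one-third threshold
not_covered = is_covered_platform(30, 100, True)
```

Note that both conditions must hold: a platform below the threshold, or one that does not knowingly make the material available, falls outside the definition.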

The bill would effectively reinforce the emerging regulatory framework already developing at the state level for adult-content providers, rather than replacing it. However, the proposal also introduces a significant federal pre-emption clause that would prevent states from requiring social media platforms to deploy technology-based age verification systems to block minors from accessing harmful material. If enacted in its current form, the legislation could create a two-track regulatory structure in the United States:

Adult Content Platforms:

State laws requiring age verification for commercial pornography websites would likely remain intact and operate alongside the federal framework.

Social Media Platforms:

States would likely be prevented from mandating technology-based age checks for social media platforms or other user-generated content services, shifting authority over such measures to the federal government.

For the age-assurance sector, the implications are mixed. The bill legitimises age-verification technology within federal law, but its pre-emption clause could halt the growing wave of state-level age-verification proposals targeting social media platforms.

Latest Action: Committee voted to approve the bill on March 5, 2026

 

The App Store Accountability Act

The App Store Accountability Act (HR 3149) aims to shift responsibility for child-safety governance upstream to mobile app stores, primarily Apple’s App Store and Google Play.

The proposal would require app stores to:

  • Determine the age category of users

  • Obtain parental consent before minors download apps

  • Provide developers with an age-category signal so apps know whether a user is a child, teenager, or adult.

Crucially, the Act requires that the parental account holder be verified as at least 18 years old, creating one of the few federal proposals that explicitly requires age verification for adults. However, the child’s age may still be based on parental attestation or platform metadata, meaning the system combines verified knowledge of the adult with implied knowledge for minors.
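The combination described above, verified knowledge of the adult plus attested knowledge of the minor, can be sketched as follows. The enum values, field names, and download logic are all assumptions for illustration; the bill describes the concept of an age-category signal, not a concrete API.

```python
# Hypothetical model of the age-category signal an app store might pass
# to developers under the App Store Accountability Act. All names here
# are illustrative assumptions, not a real Apple or Google interface.
from dataclasses import dataclass
from enum import Enum


class AgeCategory(Enum):
    CHILD = "child"          # under 13
    TEENAGER = "teenager"    # 13-17
    ADULT = "adult"          # 18+


@dataclass
class AccountSignal:
    category: AgeCategory    # may rest on parental attestation or metadata
    parent_verified: bool    # the adult account holder's age is verified
    parental_consent: bool   # consent obtained before a minor downloads


def may_download(signal: AccountSignal) -> bool:
    """Adults may download freely; minors require a verified adult
    account holder who has given parental consent."""
    if signal.category is AgeCategory.ADULT:
        return True
    return signal.parent_verified and signal.parental_consent
```

The sketch makes the asymmetry visible: only the `parent_verified` flag rests on a verified-knowledge standard, while the minor's `category` may still derive from attestation or platform metadata.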

The federal proposal largely mirrors the state-level App Store Accountability Acts first introduced in Utah, but would replace the emerging patchwork of state laws with a single federal framework enforced by the Federal Trade Commission. The bill is widely expected to face constitutional challenges, particularly on First Amendment grounds, although supporters argue that Congress may receive greater judicial deference than state legislatures.

 

The Kids Online Safety Act (KOSA)

The Kids Online Safety Act remains one of the most debated digital-safety proposals in the United States. KOSA imposes a duty of care on platforms to mitigate risks to minors and requires stronger safety settings, parental tools and transparency obligations. However, the bill sidesteps the question of how platforms determine whether a user is a minor.

Platforms may rely on:

  • Self-declared age

  • Family-account metadata

  • Parental controls

There is no requirement to verify age for either the parent or the child. As a result, the operative legal standard remains implied knowledge, unless the platform has actual knowledge that a user is under 17. There is no obligation to implement age-assurance technology.

 

The Kids Off Social Media Act (KOSMA)

The Kids Off Social Media Act (KOSMA) takes a different approach. The proposal would prohibit social media accounts for children under 13 and restrict algorithmic recommendation feeds for users under 17. However, KOSMA does not introduce a duty to verify age.

Platforms may rely on:

  • Self-declared dates of birth

  • Family-account structures

  • Parental attestations.

The legal standard is therefore actual knowledge that a user is underage. If a platform becomes aware that a user is under the permitted age, it must act. Otherwise, no age determination requirement exists.

KOSMA also contains a companion provision conditioning federal telecommunications subsidies on schools and libraries blocking access to social media services. These filtering obligations apply regardless of the user's age, so no age determination is required.

 

COPPA 2.0

COPPA 2.0 would update the long-standing Children’s Online Privacy Protection Act. The bill proposes to extend protections from children under 13 to minors under 17 and introduce stronger privacy rights for teen users aged 13–16. However, COPPA’s core mechanism — verifiable parental consent — remains unchanged. These consent mechanisms aim to confirm parental authority, not parental age. A parent can complete verification steps without proving they are an adult.

As a result:

  • The child’s age is usually self-declared

  • The knowledge standard remains actual knowledge

  • There is no requirement for age verification.

 

 

Sammy’s Law

Sammy’s Law (HR 2657) focuses on the impact of recommendation algorithms on minors. The bill would require large online platforms to limit or disable algorithmic recommendation systems for minors, unless the user or their parent actively enables them. Under the proposal, minors would see chronological or non-algorithmic feeds by default, and platforms could not automatically recommend content based on engagement signals.

Sammy’s Law does not introduce any age-verification requirement. It assumes platforms already know whether a user is a minor, typically through existing account information or parental controls.

 

AI Systems and Chatbots

The current federal proposals do not yet create a dedicated regulatory framework for AI systems or conversational chatbots. However, AI services could still fall within parts of the emerging framework. AI applications distributed through mobile apps would likely be affected by the App Store Accountability Act, because app stores would provide developers with the user’s age-category signal.

Conversely, the KIDS Act focuses narrowly on platforms where a substantial portion of material is harmful to minors. Most AI services would therefore fall outside its scope unless they were specifically designed to distribute adult content.

This leaves an emerging policy gap. As conversational AI becomes increasingly capable of producing personalised and interactive content, future legislation is likely to focus more directly on how age assurance should apply to AI chatbots and generative systems.

 

 

Key Takeaways

Taken together, these proposals reveal the full spectrum of federal thinking on age assurance.

Only a small number of bills currently introduce verified-knowledge obligations, notably:

  • the SCREEN Act

  • the App Store Accountability Act (for adult account holders).

Most other proposals rely on self-declaration, parental attestation or platform metadata, with duties triggered only when a service has actual knowledge that a user is underage.

At the same time, the introduction of the term “technology verification measures” marks an important development. It signals that federal lawmakers are beginning to formally recognise age-assurance technologies as a legitimate regulatory tool. For policymakers, platforms and parents, the distinction between verified knowledge and self-declared age is critical. Without reliable age assurance, many safety obligations depend almost entirely on what users claim about their age rather than what platforms actually know or should know.

PLEASE NOTE: This page summarises current law and proposals and does not constitute legal advice. Always consult independent legal advisers before making compliance decisions.