AVPA’s response to CDD, EPIC and Fairplay’s letter to the FTC about COPPA

March 19, 2026

A recent letter to the Federal Trade Commission from the Center for Digital Democracy (CDD), the Electronic Privacy Information Center (EPIC) and Fairplay raises a number of concerns about the Commission’s COPPA Enforcement Policy Statement on age assurance. It is a thoughtful intervention and one that reflects a shared objective – ensuring that children are protected online without creating unnecessary risks to privacy.

From the perspective of the Age Verification Providers Association, there is meaningful common ground with the letter. We agree that age assurance systems must be built to the highest privacy and security standards. This is not optional. It is foundational. International standards such as ISO/IEC 27566-1 and IEEE 2089.1 already embed principles of data minimisation, purpose limitation, security by design and strict controls on retention. Properly implemented, age assurance should not require the ongoing storage of personal data: systems are designed so that identifying information is deleted as soon as age has been established.

We also agree that the FTC policy statement would benefit from greater clarity and precision. Not all age assurance methods are the same. Age estimation, document-based verification and account-based inference each involve different levels of certainty, different data inputs and different privacy profiles. Treating them as a single category risks both over-regulating low-risk approaches and under-specifying requirements for higher-risk ones. A more explicit, risk-based framework, based on international standards, would improve both compliance and outcomes.

Similarly, the letter is right to call for stronger articulation of accuracy, auditability and governance. These are not peripheral issues. They are central to whether age assurance systems are effective and fair. International standards (ISO/IEC 27566-1, IEEE 2089.1) already provide a structured basis for testing, benchmarking and auditing performance, including across demographic groups, and these should be more clearly reflected in regulatory guidance.

However, there are also areas where we would take a different view.

The FTC’s policy statement is, at its core, an attempt to resolve a long-standing problem. For many years, operators have relied on self-declaration of age because of uncertainty about how COPPA applies to more robust methods. The result has been widespread under-enforcement and continued access by children to services that are not designed for them. The FTC is now signalling that operators will not be penalised for deploying more accurate age assurance, provided they do so responsibly. That direction of travel is correct.

It would be a mistake to characterise age assurance as inherently incompatible with privacy or as a departure from COPPA’s objectives. Modern age assurance systems are specifically designed to reduce data exposure. In many cases, the service relying on the check receives only a simple age result, not identity information or underlying evidence. Independent third-party providers can further strengthen this model by separating the age check from the content service, ensuring that no single party has full visibility of both identity and behaviour.
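To make the separation described above concrete, here is a minimal, hypothetical Python sketch of that pattern. It is an illustration only, not any provider’s actual implementation: the names (`AgeResult`, `verify_age`) and the use of a date of birth as the identity evidence are assumptions for the example. The point it demonstrates is that the relying service receives only a yes/no age signal, while the identifying input never leaves the verification step.

```python
# Hypothetical sketch of the "double-blind" age assurance pattern:
# the verification step sees identity evidence, the content service
# sees only a minimal age result, and neither party holds both.
from dataclasses import dataclass
from datetime import date


@dataclass
class AgeResult:
    """The only data shared with the relying service."""
    meets_threshold: bool  # e.g. "user is 13 or older"


def verify_age(date_of_birth: date, threshold_years: int, today: date) -> AgeResult:
    """Independent provider: derives a yes/no result and retains nothing."""
    had_birthday = (today.month, today.day) >= (date_of_birth.month, date_of_birth.day)
    age = today.year - date_of_birth.year - (0 if had_birthday else 1)
    # The identity evidence (here, the date of birth) is discarded once
    # this function returns; only the boolean result leaves the provider.
    return AgeResult(meets_threshold=age >= threshold_years)


# The content service learns whether the user meets the threshold,
# never the underlying date of birth.
result = verify_age(date(2010, 6, 1), threshold_years=13, today=date(2026, 3, 19))
print(result.meets_threshold)  # → True (user is 15)
```

In a real deployment the boolean would typically travel as a signed, single-use token so the relying service can trust its origin without ever contacting the identity source directly.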

We would also caution against framing third-party providers primarily as a risk. With appropriate standards, certification and oversight, they are a key part of the privacy solution. The issue is not whether third parties are used, but whether they operate within a clear, enforceable framework that guarantees data minimisation, security and accountability.

In short, we support the letter’s call for stronger safeguards, clearer definitions and more robust oversight. These are all necessary. But we do not accept any implication that the FTC should step back from encouraging the use of effective age assurance. The real risk lies in maintaining a status quo where weak self-declared age gates persist because of regulatory uncertainty.

The way forward is not to choose between privacy and protection. It is to insist on both. That means a standards-based approach, clear regulatory expectations and the continued development of privacy-preserving technologies that allow services to know whether a user is a child without needing to know who they are.

That is the model AVPA has consistently advocated and it is the one that can deliver meaningful protection for children while respecting the rights of all users.