The federal district court in Austin, Texas has granted an injunction preventing Texas House Bill 1181 from coming into force on September 1, 2023. The Bill required age verification for online pornography and the publication of defined health warnings.
This judgement provides clear indications of how legislation should be drafted to reduce the risk it is found to be substantively unconstitutional, and how to pre-empt a case for an interim injunction setting aside the law before it comes into force.
We set out below the lessons for lawmakers drafting age verification Bills, and for the teams in the offices of state Attorneys General tasked with defending such laws in federal courts.
(We will ignore the debate about health warnings as this was specific to the Texas Bill; but it should be noted that this aspect of the Bill proved hard to defend and is not recommended for inclusion in future AV legislation to avoid opening up a second line of attack from opponents).
In general, it would also help to be seen to be evidence-based when developing new legislation, and that may require a call for evidence, consultation and research before laws are presented to legislatures.
The Substantive Argument
The judge helpfully sets out the structure of the arguments on the merits of the challenge:
1. do Plaintiffs have standing to bring suit,
2. is the age verification requirement unconstitutional,
3. is the health warning unconstitutional, and
4. does Section 230 of the CDA preempt the law?
Only 1, 2 and 4 are generally relevant, so we set out the lessons under each heading below.
To have Article III standing, a plaintiff must “
(1) have suffered an injury in fact,
(2) that is fairly traceable to the challenged action of the defendant, and
(3) that will likely be redressed by a favorable decision.”
In general, arguing that the plaintiffs lack standing does not appear to be fruitful:
Injury in fact
If the law is challenged by an Association, at least one member of it must have standing. We learn here that this need only be demonstrated in the underlying complaint, not the request for an injunction.
“The First Amendment protects speech for the sake of both the speaker and the recipient.” Thunder Studios, Inc. v. Kazal, 13 F.4th 736, 743–44 (9th Cir. 2021), cert. denied 142 S. Ct. 1674 (2022).
Foreign websites are not deprived of constitutional rights in this case because (i) there is no attempt to enforce the law with extraterritorial effect and (ii) “their speech and conduct occurs in Texas”. The latter point is reinforced by the fact that free speech rights are given to the recipient as well as the speaker.
Lesson: Laws should not limit themselves to enforcement within the USA but should include mechanisms to deliver extraterritorial enforcement, as this may limit the standing of overseas plaintiffs.
This has been achieved in the UK’s Online Safety Act by giving regulators the power to block UK users from access to foreign sites and to require the withdrawal of business services such as hosting, payments and advertising by those providers who are subject to UK law.
However, in general, websites at home and abroad will not struggle to achieve the standing required to bring a challenge.
Traceability and Redressability
The judge hardly mentions this as it is accepted by both sides.
The judge, as an aside, confirms that because the state is involved in enforcement, the Eleventh Amendment does not give it immunity.
Lesson: Bills could remove the state from any enforcement role and rely on a private right of action – but this has significant practical disadvantages, so we do not recommend it.
A recent legal device to evade constitutional challenges, invented to allow for laws that restrict abortions, is to put enforcement in the hands of civilians not the state. But this removes any ability for the state to make Rules to facilitate effective implementation. Questions as to whether a platform has implemented age verification effectively will be determined by individual judges and juries who will generally lack the technical qualifications to do so. This will lead to uncertainty as to the legal requirement.
Is age verification unconstitutional?
This is the core of the argument:
Just like COPA, H.B. 1181 regulates beyond obscene materials. As a result, the regulation is based on whether content contains sexual material. Because the law restricts access to speech based on the material’s content, it is subject to strict scrutiny. Id.; Ent. Software Ass’n v. Blagojevich, 469 F.3d 641, 649–50 (7th Cir. 2006)
There is extensive discussion of the history of laws to control pornography. The judge concludes that the Age Verification Requirement is Subject to Strict Scrutiny.
The key issue is the definition of “obscene” content, which can be regulated as it falls outside the scope of First Amendment protection, according to Ginsberg v. New York (1968).
But the latest Texas judgement explains that in Reno v. ACLU (1997) the Supreme Court refined this view when applied to “a content-based regulation that extended far beyond obscene materials and into First Amendment protected speech, especially because the statute contained no exemption for socially important materials for minors”, referring back to the test in Miller v. California, 413 U.S. 15, 24 (1973).
Moreover, even if we were to abandon Miller, the law would still cover First Amendment protected speech. H.B. 1181 does not regulate obscene content, it regulates all content that is prurient, offensive, and without value to minors. Because most sexual content is offensive to young minors, the law covers virtually all salacious material. This includes sexual, but non-pornographic, content posted or created by Plaintiffs. See (Craveiro-Romão Decl., Dkt. # 28-6, at 2; Seifert Decl., Dkt. # 28-7, at 2; Andreou Decl., Dkt. # 28-8, at 2). And it includes Plaintiffs’ content that is sexually explicit and arousing, but that a jury would not consider “patently offensive” to adults, using community standards and in the context of online webpages. (Id.);
The Child Online Protection Act (COPA) of 1998, which “restricted the ability to post content online that was harmful to minors for commercial purposes”, was deemed subject to strict scrutiny because its “definition of harmful material is explicitly focused on minors, it automatically impacts non-obscene, sexually suggestive speech that is otherwise protected for adults” – and was struck down on appeal as a result.
Judge Ezra concludes that HB 1181 also “regulates beyond obscene materials” and that “[b]ecause the law restricts access to speech based on the material’s content, it is subject to strict scrutiny”.
Lesson: The scope of the law should be confined to requiring age verification to protect children of a particular age from what would be considered obscene for that age group
The Supreme Court held that a law that “applies broadly to the entire universe of cyberspace” and seeks to protect children from offensive speech “is a content-based blanket restriction on speech, and, as such, cannot be ‘properly analyzed as a form of time, place, and manner regulation.’” ACLU v. Reno, 521 U.S. at 868 (quoting Renton v. Playtime Theatres, 475 U.S. 41, 46 (1986))
Lesson: Laws should target specific pages of obscene content, not entire websites where age verification is completed on the landing page and all content is therefore restricted.
Because the law regulates speech based upon the content therein, including content deserving of First Amendment protection, it must survive strict scrutiny. To endure strict scrutiny, H.B. 1181 must:
(1) serve a compelling governmental interest
(2) be narrowly tailored to achieve it, and
(3) be the least restrictive means of advancing it.
Sable Commc’ns of Cal., Inc. v. Fed. Commc’ns Comm’n, 492 U.S. 115, 126 (1989).
Again, if the drafting can avoid the law being considered as regulating content beyond obscene content, these issues do not arise. But for completeness, we will look at the lessons from each element in turn:
serve a compelling governmental interest
It is uncontested that pornography is generally inappropriate for children, and the state may regulate a minor’s access to pornography. [Ginsberg, 390 U.S. at 63.]
The strength of that interest alone, however, is not enough for a law to survive strict scrutiny. The state must still show that H.B. 1181 is narrowly tailored and the least restrictive means of advancing that interest.
narrowly tailored to achieve it
the law is severely underinclusive. When a statute is dramatically underinclusive, that is a red flag that it pursues forbidden viewpoint discrimination under false auspices, or at a minimum simply does not serve its purported purpose. See City of Ladue v. Gilleo, 512 U.S. 43, 52 (1994).
- Defendant implicitly concedes this when they argue that the foreign Adult Video Company Plaintiffs are not subject to jurisdiction in the United States
- Search engines, for example, do not need to implement age verification, even when they are aware that someone is using their services to view pornography
- social media companies are de facto exempted, because they likely do not distribute at least one-third sexual material
- As the study points out, pop-up ads, not pornographic websites, are the most common forms of sexual material encountered by adolescents.
Paradoxically, the judge is concerned that the law is too narrow in scope and is therefore “underinclusive”. So as a general rule, laws need to address the whole extent of online harm from adult content in all its forms.
Lesson: Laws should address pornography wherever it is visible to children online. So that must include search engines.
Lesson: (repeated) the laws must reach foreign websites as well as domestic, with a “valid enforcement mechanism against those websites”
it is unclear whether “one-third” modifies “material” or “website.”
the law offers no guidance as to how to calculate the “one third”— whether it be the number of files, total length, or size.
Lesson: Laws should not use a percentage of content within the definition of scope (see above for advice to target only individual pages with obscene content)
“The type of material that might be considered harmful to a younger minor is vastly different—and encompasses a much greater universe of speech—than material that is harmful to a minor just shy of seventeen years old. . . .” ACLU v. Ashcroft, 322 F.3d at 268.
A website dedicated to sex education for high school seniors, for example, may have to implement age verification measures because that material is “patently offensive” to young minors and lacks educational value for them.
The judge is also concerned that HB 1181 “provides no guidance as to what age group should be considered” for “patently offensive” material, a point backed later by reference to the Ashcroft case:
The term “minor,” as Congress has drafted it, thus applies in a literal sense to an infant, a five-year old, or a person just shy of age seventeen. In abiding by this definition, Web publishers who seek to determine whether their Web sites will run afoul of COPA cannot tell which of these “minors” should be considered in deciding the particular content of their Internet postings. Instead, they must guess at which minor should be considered in determining whether the content of their Web site has “serious … value for [those] minors.”
47 U.S.C. § 231(e)(6)(C).
Lesson: Laws should be drafted to account for the difference in maturity between a 7-year-old and a 17-year-old.
Nor does the statute define when material may have educational, cultural, or scientific value “for minors,” which will likewise vary greatly between 5-year-olds and 17-year-olds.
Lesson: Laws should exclude content with educational, cultural, or scientific value for the age-group that is viewing it.
The law requires sites to use “any commercially reasonable method that relies on public or private transactional data” but fails to define what “commercially reasonable” means.
Although he does not rely on this for the conclusion, the judge is very concerned by the vagueness of the drafting.
Lesson: Be specific as to how age verification is defined. We would suggest reference to “methods of age assurance certified to international standards to be designated by the Attorney General / [appointed regulator]”
It may be prudent to list and define carefully a range of approved age verification and age estimation methods within the statute. This is not ideal as it does not allow for innovation, and could permit some methods which become defunct or become unreliable, but may be helpful in explaining to the judicial branch of government how the system will operate in practice.
Plaintiffs are likely to succeed on their overbreadth and narrow tailoring challenge because H.B. 1181 contains provisions largely identical to those twice deemed unconstitutional in COPA.
Overall the judgement concludes:
The law sweeps far beyond obscene material and includes all content offensive to minors, while failing to exempt material that has cultural, scientific, or educational value to adults only
Lesson: While this stretches the author’s abilities, the answer here appears to be that new laws should target individual pieces of content which are obscene to children given their age-group. So, the youngest children should be protected from most sexual material which is obscene for them, while older teenagers can view more adult content with only more extreme material prohibited.
It is hard to predict if a judge would respond to such a graduated law more favorably, but it is no longer such a black and white argument that such a law is too blunt an instrument to achieve reasonable objectives.
be the least restrictive means of advancing it.
Here the judge weighs the evidence about the ineffectiveness of filtering, and the distinction between a policy that gives parents discretion and control and a policy that allows the state to protect all children:
as determined by the facts on the record and presented at the hearing, age verification laws remain overly restrictive. Despite changes to the internet in the past two decades, the Court comes to the same conclusion regarding the efficacy and intrusiveness of age verification as the ACLU courts did in the early 2000s
It appears the judge was not convinced by expert testimony that technology has progressed in the last 20 years. That is hard to answer in the drafting of a new law, but it may be wise to attempt to do so.
Lesson: References to modern technology, such as cryptography, the emergence of facial age estimation and its latest accuracy figures, one-way blind checks of government data, and self-sovereign digital identities through verifiable credentials may help persuade the courts that there is now better technology available to verify age online.
“[The law] will likely deter many adults from accessing restricted content because they are unwilling to provide identification information in order to gain access to content, especially where the information they wish to access is sensitive or controversial. People may fear to transmit their personal information, and may also fear that their personal, identifying information will be collected and stored in the records of various Web sites or providers of adult identification numbers.”
The judgement states “adults must affirmatively identify themselves before accessing controversial material, chilling them from accessing that speech. Whatever changes have been made to the internet since 2004, these privacy concerns have not gone away, and indeed have amplified”
Judge Ezra has formed the impression that in using a government ID to verify age, “the state government can log and track that access”. This would only be the case if an adult site engaged directly with the state to confirm a user’s age when that user attempted to access the site.
This is not how the age verification sector operates. It has deliberately created independent third-party verifiers who not only prevent the original supplier of the proof of age from knowing the purpose of the enquiry, but also answer queries from many sites that need to confirm a user’s age-range for a variety of purposes, not only access to adult content. So state records may equally have been consulted ahead of the purchase of alcohol online, for example.
He also assumes that when state ID is used, the state is aware that this has happened, which, if it is by virtue of an image of a physical identity document such as a driver’s license, is not the case.
Lesson: Laws should prohibit regulated sites from conducting their own age verification and require the use of third party verifiers. Those third party verifiers should be required not to store personal data apart from a date of birth, and not to retain any record of which sites ask to confirm the age of a particular user.
Lesson: Where state records are directly accessed, such as Department of Motor Vehicles records, by third party verifiers, the law should require that no indication is given to the DMV as to the purpose of the enquiry and the DMV should retain no record of that enquiry being made.
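The double-blind flow described in the two lessons above can be sketched in outline. This is an illustrative sketch only: the verifier, the mock DMV records, and the function names are all invented for the example; real deployments use certified providers and secure channels.

```python
from datetime import date

# Mock DMV dataset (invented for illustration): the third-party verifier can
# look up a date of birth, but the DMV never learns which site triggered the
# enquiry, nor why, and no record of the enquiry is retained.
MOCK_DMV_RECORDS = {"TX-1234567": date(1990, 5, 17)}

def verify_age(licence_number: str, threshold: int, today: date) -> str:
    """Return only 'pass' or 'fail' to the requesting site.

    Nothing is logged: the site never sees identity data, and the verifier
    keeps no record of which site asked about which user.
    """
    dob = MOCK_DMV_RECORDS.get(licence_number)
    if dob is None:
        return "fail"  # unknown record: fail closed
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return "pass" if age >= threshold else "fail"

# The adult site receives only this single word, never the licence details.
print(verify_age("TX-1234567", 18, date(2023, 9, 1)))  # pass
```

The design choice the lessons call for is visible in the return type: a bare pass/fail answer means there is nothing worth retaining, and nothing linking a user to a particular site.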
Defendant contests this, arguing that the chilling effect will be limited by age verification’s ease and deletion of information. This argument, however, assumes that consumers will (1) know that their data is required to be deleted and (2) trust that companies will actually delete it. Both premises are dubious, and so the speech will be chilled whether or not the deletion occurs. In short, it is the deterrence that creates the injury, not the actual retention
There will be those, perhaps including Judge Ezra, who remain sceptical that records are not kept, but to accept this argument would be for judges to acknowledge that laws are ineffective. That is a slippery slope for the courts, but there may be a need to provide some further reassurance.
Lesson: Laws should make provision for independent audit of age verification systems, and their providers, to provide assurance that data security and privacy measures are effective. This can be undertaken directly by the state, or through conformity assessment bodies approved by the state.
Moreover, while the commercial entities (e.g., Plaintiffs) are required to delete the data, that is not true for the data in transmission. In short, any intermediary between the commercial websites and the third-party verifiers will not be required to delete the identifying data.
This is a curious element of the judgement, as it is not clear who the intermediaries would be. The connection is generally a direct one, across the Internet, sending only a request for age verification in one direction and a response of “pass” or “fail” back in return.
There may be some form of words that might offer reassurance but it is hard to be specific.
Lesson: Extend the prohibition on retention of personal data to any party which processes it.
the First Amendment injury is exacerbated by the risk of inadvertent disclosures, leaks, or hacks
This concern is based on a mistaken belief that personal data is stored. It is not, so it cannot be disclosed, leaked or hacked except in the moment it is being processed to confirm age.
The judge has overlooked the evidence provided by the supplier of the LA Wallet, Envoc, that it was not affected by the cyberattack referred to in the complaint. He refers to the Ashley Madison case, which perhaps more than anything else influenced the development of age verification with privacy-by-design and data minimisation principles at its heart.
Lesson: There is some validity in his point that the perceived risk of data being retained and leaked could have a chilling effect, (notwithstanding that this is a slippery legal slope) so it would be wise to address that as explicitly as possible in any drafting.
Plaintiffs offer several alternatives that would target minors’ access to pornography with fewer burdens on adults’ access to protected sexually explicit materials. First, the government could use internet service providers, or ISPs, to block adult content until the adults opt out of the block. This prevents the repeated submission of identifying information to a third party, and operating at a higher level, would not need to reveal the specific websites visited. If implemented on a device-level, sexual information would be allowed for adults’ devices but not for children when connected to home internet.
In addition, Plaintiffs propose adult controls on children’s devices, many of which already exist and can be readily set up. This “content filtering” is effectively the modern version of “blocking and filtering software” that the Supreme Court proposed as a viable alternative in Ashcroft v. ACLU. 542 U.S. at 666–73.
Blocking and filtering software is less restrictive because adults may access information without having to identify themselves
Lesson: Be explicit that the purpose of the law is not to give parents the control they need to exercise their discretion in what their children see online, but rather it is to apply restrictions that the state’s democratically elected representatives have determined should apply to all minors (graduated by age-range, see above).
At the hearing, Defendant’s expert repeatedly emphasized that parents often fail to implement parental controls on minors’ devices. But Defendant has not pointed to any measures Texas has taken to educate parents about content filtering
Parental controls are commonplace on devices. They require little effort to set up and are far less restrictive because they do not target adults’ devices
Defendant offers zero evidence that the legislature even considered the law’s tailoring or made any effort whatsoever to choose the least-restrictive measure. To satisfy strict scrutiny, Texas must provide evidence supporting the Legislature’s judgments.
“[W]hile such a less-restrictive-means analysis need not entail the government affirmatively proving that it tried less-restrictive means . . . it does entail the government giving serious consideration to such less-restrictive means before opting for a particular regulation.”
Brewer v. City of Albuquerque, 18 F.4th 1205, 1255 (10th Cir. 2021)
The judge is also critical that other measures have not been tested or even considered. Parental controls at the device, router and ISP level have all been available for some time, and have clearly not in practice been effective in preventing children from accessing adult content. This is because parents need knowledge, will, capability and in some cases the financial means to apply these controls. Most do not.
Lesson: When drafting legislation, research should be commissioned to demonstrate the failure of parental controls to meet the legislature’s objectives. Consider a localized trial of a campaign to persuade parents to implement controls and measure the impact on children’s devices from a baseline.
Tony Allen, a digital technology expert who submitted a declaration on behalf of Defendant, suggests several ways that age-verification can be less restrictive and costly than other measures. (Allen Decl., Dkt. # 26-6). For example, he notes that age verification can be easy because websites can track if someone is already verified, so that they do not have to constantly prove verification when someone visits the page. But H.B. 1181 contains no such exception, and on its face, appears to require age verification for each visit. H.B. 1181 § 129B.003
Allen identifies multiple ways that age verification can be less intrusive on users and websites. But H.B. 1181 does not allow these methods.
Lesson: laws should explicitly sanction the re-use of age checks and mechanisms for interoperability between age verification providers
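One way a statute could sanction re-use is for a verifier to issue a short-lived, signed token that any participating site accepts without repeating the check. The sketch below is purely illustrative: the shared-key HMAC scheme, field names and lifetimes are assumptions for the example; real interoperability schemes and their key management are considerably more sophisticated.

```python
import base64
import hashlib
import hmac
import json

# Illustrative shared key; a real scheme would use per-verifier keys,
# rotation, and most likely public-key signatures rather than HMAC.
SECRET = b"shared-demo-key"

def issue_token(over_age: int, ttl_seconds: int, now: float) -> str:
    """Verifier side: issue a signed claim that the holder is over `over_age`.

    The token carries no identity data, only an age assertion and an expiry.
    """
    claim = json.dumps({"over": over_age, "exp": now + ttl_seconds}).encode()
    sig = hmac.new(SECRET, claim, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(claim).decode() + "." + sig

def accept_token(token: str, required_age: int, now: float) -> bool:
    """Site side: accept a previously issued token instead of re-verifying."""
    body, sig = token.rsplit(".", 1)
    claim = base64.urlsafe_b64decode(body)
    expected = hmac.new(SECRET, claim, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # tampered or issued by an unknown verifier
    data = json.loads(claim)
    return data["exp"] > now and data["over"] >= required_age

token = issue_token(18, ttl_seconds=3600, now=0.0)
print(accept_token(token, 18, now=10.0))    # accepted within the hour
print(accept_token(token, 18, now=7200.0))  # rejected once expired
```

The short expiry limits the harm if a token leaks, while still sparing the user a fresh check on every visit – which is precisely the re-use the lesson asks legislation to permit.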
He is also concerned that HB 1181 “does not appear to allow for vouching because it is not based on transactional data.”
Lesson: In defining acceptable methods, laws should ensure that accessibility is addressed, in particular through provision for both age estimation techniques and vouching.
content filtering also comports with the notion that parents, not the government, should make key decisions on how to raise their children. See United States v. Playboy Ent. Grp., Inc., 529 U.S. 803, 824–25 (2000)
(“A court should not assume a plausible, less restrictive alternative would be ineffective; and a court should not presume parents, given full information, will fail to act.”). Likewise, even as it upheld obscenity laws, Ginsberg affirmed that “constitutional interpretation has consistently recognized that the parents’ claim to authority in their own household to direct the rearing of their children is basic in the structure of our society.”
This is perhaps the hardest point to counter in the drafting of revised legislative proposals. If a parent created a strip club, brothel, casino or bar in their own home, does that put it beyond the reach of the state to prevent their children frequenting these facilities? If the precedents quoted by the judge apply, perhaps it does.
Lesson: Frame any defense of the law or against an injunction in terms of the logical conclusion if parents are given carte blanche for what happens inside their own home.
“content filtering is likely to be more effective because it will place a more comprehensive ban on pornography compared to geography-based age restrictions, which can be circumvented through a virtual private network (“VPN”) or a browser using Tor.”
The laws passed to date do not include any exemption if a child uses a VPN or Tor browser to access adult content from a state which requires age verification.
content filtering blocks out pornography from foreign websites, while age verification is only effective as far as the state’s jurisdiction can reach
Does Section 230 of the Communications Decency Act pre-empt the law?
On this point, the judge did not fully agree with the plaintiffs.
Section 230 only covers content created by others and hosted by the website in question. So where an adult site produces its own content, it is not protected from liability.
To the extent that the domestic website Plaintiffs and foreign website Plaintiffs create or develop the content they themselves post, they are not entitled to immunity
There is some discussion as to whether this applies to foreign sites, referring back to the points made about extra-territorial enforcement above. The conclusion is that as these sites may be sued in the USA, they are also protected:
foreign website Plaintiffs may claim the protection of Section 230 when failing to do so would subject them to imminent liability for speech that occurs in the United States. Force, 934 F.3d at 74. Because the foreign website Plaintiffs host content provided by other parties, they receive protection under Section 230. MySpace
Lesson: The most prudent definition of scope would target only sites that host their own content, both foreign and domestic, not those which host content produced by others, which may be protected by Section 230.
This would at least address the “tube” sites which aggregate huge volumes of adult content produced by others, and are by far the most popular.
While this is not addressed in the Texas judgement, Section 230 may not provide immunity from rules against publishing obscene content so a Bill could, within the guidance set out already above, narrowly target this and survive constitutional scrutiny.
The judgement in Free Speech Coalition, Inc., et al. v. Colmenero provides clear guidance for lawmakers drafting Bills seeking to require age verification for pornography.
Applying these lessons will not guarantee a Bill survives a constitutional challenge, and revised Bills may give rise to new lines of objection, but should significantly increase the chances of successfully defending against such complaints.
Addendum – defending against an injunction
Senior District Judge David Ezra helpfully sets out the four tests for an injunction:
A preliminary injunction is an extraordinary remedy, and the decision to grant such relief is to be treated as the exception rather than the rule.
Valley v. Rapides Par. Sch. Bd., 118 F.3d 1047, 1050 (5th Cir. 1997)
A plaintiff seeking a preliminary injunction must establish “
- that he is likely to succeed on the merits,
- that he is likely to suffer irreparable harm in the absence of preliminary relief,
- that the balance of equities tips in his favor, and
- that an injunction is in the public interest.”
Winter v. Nat. Res. Def. Council, Inc., 555 U.S. 7, 20 (2008).
The party seeking injunctive relief carries the burden of persuasion on all four requirements.
PCI Transp. Inc. v. W. R.R. Co., 418 F.3d 535, 545 (5th Cir. 2005).
So, let us first look at the lessons from the judgement when drafting age verification laws that would have reduced the chances that an injunction is granted even before a case is heard on the constitutionality of the statute itself.
The first of these is dealt with above, in the discussion of the Act itself. If the underlying statute is constitutionally robust, a complaint seeking an injunction is far less likely to succeed.
The judge concludes that those making the complaint will suffer if the Bill comes into force but is later overturned:
Plaintiffs’ monetary injuries are nonrecoverable — Defendant does not contend otherwise. And they are more than de minimis, because Plaintiffs will have to find, contract with, and integrate age verification systems into their websites. These services come at substantial cost—at the cheapest around $40,000.00 per 100,000 visits.
(Compl., Dkt. # 1, at 18; Sonnier Decl., Dkt. # 5-2, at 54).
While the judge goes on to emphasize that “the key inquiry is ‘not so much the magnitude but the irreparability’”, he then repeats the excessive cost of an age check quoted in the complaint: 40 cents per visit.
Lesson: Be crystal clear when opposing an injunction that age checks can cost as little as 4 cents per check, and that the check is per user, not per visit.
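The arithmetic behind this lesson can be made explicit. The $40,000-per-100,000-visits figure is the one quoted in the complaint and the 4-cents-per-check figure is the author’s; the visits-per-user number below is purely an illustrative assumption.

```python
# Cost quoted in the complaint: $40,000 per 100,000 visits.
complaint_cost_per_visit = 40_000 / 100_000   # $0.40 per visit

# Author's figure: a single check per user, re-used across visits.
cost_per_check = 0.04                         # $0.04 per check

# Illustrative assumption: a returning user makes many visits.
visits_per_user = 10
cost_per_user_if_billed_per_visit = complaint_cost_per_visit * visits_per_user
cost_per_user_if_billed_per_check = cost_per_check  # one check, then re-used

print(f"per visit (complaint): ${complaint_cost_per_visit:.2f}")
print(f"per user, billed per visit: ${cost_per_user_if_billed_per_visit:.2f}")
print(f"per user, one re-usable check: ${cost_per_user_if_billed_per_check:.2f}")
```

On these assumptions the complaint’s framing overstates the per-user cost by two orders of magnitude, which is why the per-user-not-per-visit distinction matters when opposing an injunction.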
He then summarizes his overall impression of the situation:
A party cannot speak freely when they must first verify the age of each audience member, and this has a particular chilling effect when the identity of audience members is potentially stored by third parties or the government.
Lesson: Be crystal clear in the legislation itself and the defense against an injunction that (i) the identity of audience members need not be disclosed e.g. through facial age estimation and (ii) the identity is not stored by third parties or the government.
The implications of this are that laws should explicitly permit age estimation techniques based on voiceprints or facial images, and also allow behavioral techniques such as game-play analysis.
These methods do not require users to disclose their full identity. Nor does the sample required to conduct the analysis contain sufficient data to re-identify the user uniquely. And with some solutions, the analysis can be conducted on the user’s device, removing the need to transmit any personal data whatsoever to a third party.
The judgement also concludes that it is impossible to quantify the potential losses if the law comes into effect and is later overturned, because (i) “the loss of goodwill and visitors may endure for years beyond this litigation” and (ii) “the state is entitled to sovereign immunity from monetary claims”.
Lesson: Do not attempt to argue against the case that “monetary losses are significant and nonrecoverable” and “their imminent occurrence constitutes irreparable harm” as this is unanswerable.
Balance of equities
There is then a brief section which addresses the third test:
“[E]nforcement of an unconstitutional law is always contrary to the public interest.” Gordon v. Holder, 721 F.3d 638, 653 (D.C. Cir. 2013). “Injunctions protecting First Amendment freedoms are always in the public interest.” Opulent Life Church v. City of Holly Springs, Miss., 697 F.3d 279, 298 (5th Cir. 2012)(quoting Christian Legal Soc’y v. Walker, 453 F.3d 853, 859 (7th Cir. 2006)).
Lesson: If the judge has been persuaded that the underlying statute is unconstitutional, there is no point in arguing that the balance of harms favors an injunction.
And for completeness the fourth test is covered, but with a conclusion that logically flows from the answer to the third:
Because H.B. 1181 is likely unconstitutional, the state cannot claim an interest in its enforcement.