
NetChoice decision is already influencing how courts consider social media laws

It’s only been a couple of weeks since the Supreme Court issued its opinion on a pair of social media cases this term, but it’s already making a splash in arguments over other tech laws being challenged by the industry.

Lawyers and judges at the Ninth Circuit Court of Appeals in California repeatedly referenced the Supreme Court’s opinion in Moody v. NetChoice and NetChoice v. Paxton during oral arguments in two different cases on Wednesday. The cases before the Ninth Circuit, NetChoice v. Bonta and X v. Bonta, concern two different California laws regarding kids’ online safety and tech company disclosures (Rob Bonta is California’s attorney general, charged with enforcing the laws).

The arguments gave an early look into how the SCOTUS decision could impact how courts across the country consider what kinds of tech legislation are constitutional and what kinds might unduly impact speech. Though the Supreme Court did not rule on the merits of the Texas and Florida laws at issue in the NetChoice cases — which generally sought to prevent online platforms from discriminating against content of different viewpoints, stemming from many conservatives’ concerns about online censorship — the majority opinion did lay out a roadmap for how the justices view the First Amendment’s applicability to content moderation. In general, the justices said that compiling and curating content is an expressive act, one protected by the Constitution.

Experts predicted after the ruling that it would have a wide impact on the scope of tech regulations across the country, including on topics like kids’ online safety and transparency, which are the core themes at issue in the laws before the appeals court this week. In the first case, NetChoice v. Bonta, the panel of judges grappled with what the SCOTUS decision would mean for how closely they needed to parse the text of the statute at issue: the California Age-Appropriate Design Code. The law requires online platforms likely to be accessed by kids to install the highest level of privacy settings by default, assess how their features could harm young users, and develop a “timed plan” to mitigate those risks.

Fresh difficulties in litigating ‘facial challenges’

NetChoice, the tech industry group that brought both of the challenges that ended up before SCOTUS and also challenged the California law, had brought its cases as facial challenges. That means it was arguing the laws are unconstitutional in any scenario, rather than in limited applications of the statutes.

The justices in Moody said the lower courts in the Texas and Florida cases failed to perform the necessary analysis for that kind of challenge and made clear that there’s a relatively high bar for determining a law is facially unconstitutional in this way. Attorney Kristin Liska, arguing on behalf of the California attorney general before the Ninth Circuit, pointed out that NetChoice brought a facial challenge to the Age-Appropriate Design Code, too, “and Moody is clear that when analyzing a facial challenge, the question is, do the unconstitutional applications substantially outweigh the constitutional?”

Robert Corn-Revere, arguing on behalf of NetChoice, said that the Supreme Court’s decision doesn’t impact its facial challenge in the California case. “I think it confirms that facial challenges in the First Amendment context are allowable when a substantial number of the applications of the law are unconstitutional compared to its plainly legitimate sweep,” Corn-Revere said.

The judges seemed to wrestle with how to figure out how much of the law was constitutional or not and whether any pieces could be salvaged if some parts were struck down. In particular, the judges asked whether the law could survive without the provision requiring tech companies to provide a Data Protection Impact Assessment (DPIA), which would require platforms to create reports on how their product designs or features might harm kids. The district court, which granted a preliminary injunction blocking the law, said the DPIA would likely “trigger First Amendment scrutiny.”


The judges appeared to want to understand how to analyze the rest of the law if they agreed the DPIA requirement was likely unconstitutional. They asked about how to consider whether the DPIA could be severed from the other provisions in the statute, versus how to do a facial analysis of whether the law would always be unconstitutional.

Judge Milan Smith asked if the court could “just attack or deal with that one portion of it, and not deal with the other,” should it disagree with how the district court analyzed the law as a whole. Judge Anthony Johnstone seemed open to the idea that the part of the law requiring platforms to estimate the age of their users could be constitutional. “There’s no legitimate sweep to asking a company to estimate the age of its consumers for purposes of non-content-based safety regulations?” Johnstone asked NetChoice’s Corn-Revere.


“All of these regulations are tied to the content-based determination whether or not this is safe,” Corn-Revere answered. He said that the factors of the DPIA determine “why you impose the age determination” in the first place.

Smith said he thinks that leaves the panel “with the same problem the Supreme Court dealt with in Moody” because besides the DPIA, he said, the rest of the law needs to be analyzed on a case-by-case basis of how it would apply to different kinds of platforms.

Corn-Revere requested the chance for additional briefings “if the court is inclined to have doubts about whether or not this is subject to a facial overbreadth challenge,” since the SCOTUS decision came out after the California district court blocked the Age-Appropriate Design Code.

When the government compels commercial speech

In the second set of arguments, in X v. Bonta, lawyers argued over the validity of California’s AB 587, a bill that requires social media companies to submit reports to the state AG about their terms of service and content moderation policies. In that case, the judges asked how the Supreme Court’s discussion of a different precedent in the Moody decision — Zauderer v. Office of Disciplinary Counsel — would apply. Zauderer says that the government can compel commercial speech in the form of disclosures “as long as disclosure requirements are reasonably related to the State’s interest in preventing deception of consumers.”

When it comes to disclosure requirements, Johnstone asked, “Why would we welcome a circuit split on something where it seems like Florida, Texas, and California are all agreed on and the Supreme Court has left alone?”


X’s attorney, Joel Kurtzberg, said that ignores the “main distinction” between the California law and those in Texas and Florida, which he said involves specific controversial categories of content. Even if that part was cut out from the law, Kurtzberg said, “there’s very little left” to the requirements.

Kurtzberg also argued that “Zauderer does not apply if the speech is not purely factual and uncontroversial,” and in the case of AB 587, “the law is intended to require disclosures about the most controversial content topics, the decisions that raise the most controversy, and it is also clear that it is designed to pressure the companies to change their policies.”
