WASHINGTON — The Supreme Court sidestepped a ruling Thursday on the legal shield that protects internet companies from lawsuits relating to content posted by users in a case concerning allegations that YouTube was liable for suggesting videos promoting violent militant Islam.
In a brief unsigned opinion, the court did not decide the legal question of whether liability protections enshrined in Section 230 of the Communications Decency Act safeguard YouTube’s alleged conduct.
That is because, in a related case involving similar allegations against Twitter, the court ruled unanimously Thursday that such claims could not be brought in the first place under a federal law called the Anti-Terrorism Act. As a result, both the YouTube and the Twitter lawsuits are likely to be dismissed without the courts needing to address the Section 230 issue.
“This is a huge win for free speech on the internet. The court was asked to undermine Section 230 — and declined,” said Chris Marchese, a lawyer at NetChoice, a trade group for tech corporations.
The YouTube lawsuit accused the company of bearing some responsibility for the killing of Nohemi Gonzalez, an American college student, in the 2015 Paris attacks carried out by the Islamic State terrorist group.
In the Twitter case, the company was accused of aiding and abetting the spread of militant Islamist ideology in a way that contributed to the death of a Jordanian citizen in a terrorist attack.
The justices found in that case that relatives of Nawras Alassaf, who was killed in Istanbul in 2017, cannot pursue claims that Twitter, Google and Facebook were liable for aiding and abetting the attack under the Anti-Terrorism Act. Because of that decision, Gonzalez’s family is unlikely to be able to pursue its claim.
As a result, there is no need for courts to address the Section 230 immunity question.
The unsigned decision said the allegations were “materially identical to those at issue” in the Twitter case. As a result of that ruling, “it appears to follow that the complaint here likewise fails to state a claim,” the court said.
“We therefore decline to address the application of Section 230 to a complaint that appears to state little, if any, plausible claim for relief,” the court added.
Hannah DeLaine Prado, the general counsel for YouTube owner Google, said in a statement that the various entities that backed Section 230 would be “reassured by this result.”
Eric Schnapper, a lawyer for the plaintiffs in both cases, declined to comment.
The tech industry closely watched the YouTube case because recommendations are now the norm for online services in general, not just YouTube. Platforms such as Instagram, TikTok, Facebook and Twitter long ago began to rely on recommendation engines, or algorithms, to decide what people see most of the time, rather than emphasizing chronological feeds.
Potential reform of Section 230 is one area in which President Joe Biden and some of his most ardent Republican critics are in agreement, although they disagree on why and how it should be done.
Conservatives generally claim that companies are inappropriately censoring content, while liberals say social media companies are spreading dangerous right-wing rhetoric and not doing enough to stop it. Although the Supreme Court has a 6-3 conservative majority, it had not been clear how it would approach the issue.
Gonzalez, 23, was studying in France when she was killed while dining at a restaurant during the wave of terrorist attacks carried out by ISIS.
Her family alleges that YouTube helped ISIS spread its message. The lawsuit targets YouTube’s use of algorithms to suggest videos for users based on content they have previously viewed. YouTube’s active role goes beyond the kind of conduct Congress intended to protect with Section 230, the family’s lawyers allege.
The family filed the lawsuit in 2016 in federal court in Northern California, and it hopes to pursue claims that YouTube violated the Anti-Terrorism Act, which allows victims to sue people or entities who “aid and abet” terrorist acts.
Citing Section 230, a federal judge dismissed the lawsuit. The San Francisco-based 9th U.S. Circuit Court of Appeals upheld that ruling in a June 2021 decision that also resolved similar cases that families of other terrorist attack victims had brought against tech companies — including the Twitter dispute.
In the Twitter case, Alassaf was visiting Istanbul with his wife when he and 38 other people were killed by ISIS-affiliated Abdulkadir Masharipov in the Reina nightclub. Masharipov had created a “martyrdom” video saying he was inspired by ISIS and wished to die in a suicide attack. He evaded capture after the shootings but was later arrested and convicted.
Alassaf’s family asserted that without the active assistance of Twitter, Facebook and Google, ISIS’ message and related recruiting efforts would not have spread so widely. The family does not allege that Twitter actively sought to aid ISIS.
A federal judge had dismissed the lawsuit, but the 9th Circuit, in the same decision that addressed the YouTube case, said the aiding and abetting claim could move forward. The family adequately alleged that the companies had provided substantial assistance to ISIS, the court concluded.
Justice Clarence Thomas wrote in Thursday’s ruling that the family’s “allegations are insufficient to establish that these defendants aided and abetted ISIS in carrying out the relevant attack.”
Thomas said aiding and abetting must constitute “a conscious, voluntary, and culpable participation in another’s wrongdoing,” which the plaintiffs had failed to show.
Twitter’s lawyers argued that it provides the same generic services for all its users and actively tries to prevent terrorists from using them. A ruling against the company could have allowed lawsuits against many entities that provide widely available goods or services, including humanitarian groups, the lawyers said.
David Greene, the civil liberties director at the Electronic Frontier Foundation, a group that advocates for free speech online, welcomed the Twitter ruling, saying the court correctly found that “an online service cannot be liable for terrorist attacks merely because their services are generally used by terrorist organizations the same way they are used by millions of organizations around the globe.”
Twitter’s media office, as is customary, automatically replied to a request for comment with a poop emoji.