
SCOTUS Takes on Online Free Speech Law in Case Critical to California Tech Industry’s Future

Justices hear arguments in a case that could weaken online freedoms and change how tech does business.

PUBLISHED FEB 22, 2023 8:19 A.M.
Can YouTube be held liable for a deadly terrorist attack if its algorithm recommended ISIS videos? (PixieMe / Shutterstock)

The United States Supreme Court on Tuesday heard oral arguments in a case that could not only upend the way California’s internet technology corporations do business, but also restrict the wide range of viewpoints and opinions available on social media and other online platforms. Judging by the comments and questions offered by the nine SCOTUS justices, the court did not appear ready to take those drastic steps, but until it hands down its decision, likely in late spring or early summer, the fate of online speech remains up in the air.

The case, Gonzalez v. Google, hinges on the 27-year-old law known as Section 230, part of the sweeping 1996 Communications Decency Act. Its key provision, just 26 words long, protects free expression online by shielding internet platforms and users from lawsuits over statements and other content posted by third parties.

Section 230 has come under attack from both ends of the political spectrum. During the 2020 presidential election campaign, both Donald Trump and Joe Biden called for the law to be repealed (albeit for different reasons). House Speaker Nancy Pelosi derided Section 230 as a “gift” to the tech industry that “could be a question mark and in jeopardy.” Florida Republican Rep. Matt Gaetz, a staunch Trump supporter, pushed for legislation to remove the legal protections offered by Section 230.

Without Section 230, however, online platforms would be subject to such a relentless, costly and time-consuming onslaught of lawsuits that, even if most of the suits were ultimately dismissed, operating an online forum would become prohibitively expensive—unless those platforms engaged in broad moderation, screening out any view that anyone could consider controversial or offensive.

What Does Gonzalez v. Google Hope to Achieve?

In 2015, ISIS terrorists carried out multiple, coordinated attacks in Paris, France, ultimately killing 130 people and wounding hundreds more. Eighty-nine of those deaths came at the 1,500-seat Bataclan concert hall during a show by the Palm Desert-based alternative rock band Eagles of Death Metal. The ISIS attackers also set off bombs outside an international soccer game at the 80,000-seat Stade de France.

In other attacks that night—Friday, Nov. 13—ISIS gunmen opened fire on diners at several Paris restaurants and cafés. Among the victims was Nohemi Gonzalez, a 23-year-old Cal State Long Beach senior studying abroad in France, who was slain by the terrorists.

Not long after the attacks, a nonprofit Israeli legal group, Shurat HaDin (in English, “Letter of the Law”), contacted Gonzalez’s family members, who reside in Whittier, CA. The group, which specializes in suing tech companies over terrorist-related content, proposed a lawsuit against Google, claiming that the internet search giant’s online video subsidiary, YouTube, pushed ISIS-friendly videos via the algorithm that recommends videos to users based on what they have previously viewed. According to a Washington Post report, Shurat HaDin has lost most of those cases.

YouTube bans terrorist content and programs its algorithms to screen it out. But YouTube’s own data showed that, as of 2021, for every 10,000 views of videos on the site, about 18 (0.18 percent) were views of banned content that slipped through the cracks.

Most of that banned content violates policies on child safety, nudity or sexual content. Only about one percent of all violating videos removed by YouTube involve hate, harassment or violent extremism, according to data summarized by the tech news site CNET.

In the lawsuit, the Gonzalez family and their lawyers argue that YouTube, by recommending ISIS videos, aided ISIS by helping the extremist group to recruit members and by inciting violence. For that reason, the lawsuit says, Google should be held legally accountable for the death of Nohemi Gonzalez.

On June 22, 2021, the U.S. Court of Appeals for the Ninth Circuit—based in San Francisco, but responsible for hearing cases from throughout the western U.S.—dismissed the family’s claims. But SCOTUS agreed to take up the case.

Justices Seem Nervous About Weakening Section 230

At Tuesday’s hearing, the SCOTUS justices, including members of the court’s dominant conservative wing, appeared extremely cautious about the effects of rolling back Section 230 protections. Justice Brett Kavanaugh, a Trump appointee, noted that “hundreds of millions, billions of responses to inquiries on the internet are made every day,” and that, under the arguments presented by the lawyers for the Gonzalez family, “every one of those would be the possibility of a lawsuit.”

Justice Elena Kagan, one of the most liberal justices on the court—who was appointed in 2010 by Democratic President Barack Obama—echoed Kavanaugh’s concerns.

“You are creating a world of lawsuits,” said the 62-year-old justice, who is the fourth woman ever to serve on the Supreme Court. “Really, anytime you have content, you also have these presentational and prioritization choices that can be subject to suit.”

Conservative Justice Clarence Thomas, a George H.W. Bush appointee and, at 31 years and counting, the longest-tenured current justice, has been the most outspoken critic of Section 230 on the court. But even Thomas expressed skepticism that YouTube should be held liable for promoting terrorist activity.

Thomas appeared to doubt that an algorithm that recommends ISIS videos in the same way it recommends videos explaining how to make “pilaf from Uzbekistan” could be “aiding and abetting” terrorists.

“Are we talking about the neutral application of an algorithm that works generically for pilaf and it also works in a similar way for ISIS videos? Or is there something different?” Thomas asked Eric Schnapper, a lawyer for the Gonzalez family.

Schnapper conceded that the YouTube algorithm was, in fact, “neutral,” but argued that it didn’t matter, because simply by recommending terror videos the site “would be aiding and abetting ISIS.”

University of Michigan Law Professor Leah Litman, co-host of the legal podcast Strict Scrutiny, speculated that “this case could be 9-0 for Google.” Former Biden Administration official and Columbia University Law Professor Tim Wu declared that “Gonzalez will lose,” and marveled at what he characterized as the inept performance by the family’s lawyer.

“Schnapper… was way out of his league and threw away every lifeline thrown to him. Painful to watch such a nationally important issue be so badly argued,” Wu wrote on his Twitter account.

First of Two Challenges to Section 230

The Gonzalez case, however, is only one of two major cases before the court that could end up rolling back Section 230. The second, Taamneh v. Twitter, will get its oral arguments on Wednesday, Feb. 22. That case stems from a terrorist attack on partygoers at a popular nightclub in Istanbul, Turkey, on New Year’s Day of 2017, which killed 39 people, including Jordanian citizen Nawras Alassaf. ISIS claimed responsibility for that massacre as well.

Alassaf’s family members, who are United States nationals, sued the social media platforms Twitter and Facebook, along with Google, under the federal Anti-Terrorism Act of 1990, which allows civil suits against any person or organization that “aids and abets, by knowingly providing substantial assistance, or who conspires with the person who committed such an act of international terrorism.”

In the Taamneh case, however, the family members never alleged that ISIS, or the attacker who carried out the New Year’s atrocity, relied on the online platforms to plan or carry out the attack. Instead, they claim that the internet companies failed to remove ISIS content unless someone directly complained about it. 

By this failure, the lawsuit claims, Twitter and the other platforms were instrumental in the growth and development of ISIS—hosting content promoting the terror group as far back as 2010. The lawsuit also claims that Google actually made money off of the ISIS content by allowing the terror group’s videos to take part in YouTube’s monetization program.
