Did YouTube algorithm spur ISIS killing? Supreme Court weighs Google’s liability in terror case over key internet law Section 230

ISIS victim Nohemi Gonzalez’s family says Google-owned YouTube aided and abetted the Islamic State group by recommending its videos to the viewers most likely to be interested in them.

Beatriz Gonzalez (right), the mother of 23-year-old Nohemi Gonzalez, a student killed in the Paris terrorist attacks, and stepfather Jose Hernandez, speak outside the Supreme Court on Tuesday.

Alex Brandon / AP

WASHINGTON — In its first case testing the federal law that’s credited with helping create the modern internet, the Supreme Court seemed unlikely Tuesday to side with a family wanting to hold Google liable for the death of their daughter in an ISIS terrorist attack.

But the justices also signaled during arguments lasting two and a half hours that they are wary of Google’s assertions that a 1996 law — Section 230 of the Communications Decency Act — affords the tech giant, Twitter, Facebook and other companies far-reaching immunity from lawsuits over their targeted recommendations of videos, documents and other content.

The case highlighted the tension between technology policy fashioned a generation ago and the reach of social media today, with billions of posts each day.

“We really don’t know about these things,” Justice Elena Kagan said of herself and her colleagues, several of whom smiled at the description. “You know, these are not like the nine greatest experts on the internet.”

Congress, not the court, should make any changes to a law passed early in the internet age, Kagan said.

Justice Brett Kavanaugh, one of six conservatives on the court, agreed with his liberal colleague in a case that seemed to cut across ideological lines.

“Isn’t it better,” Kavanaugh asked, to keep things the way they are and “put the burden on Congress to change that?”

The case stems from the death of Nohemi Gonzalez, a college student from California, in a terrorist attack in Paris in 2015.

Members of her family were in the courtroom to listen to arguments about whether they can sue Google-owned YouTube for helping the Islamic State spread its message and attract new recruits in violation of the Anti-Terrorism Act. Lower courts sided with Google.

The justices used a variety of examples to probe what YouTube does when it uses computer algorithms to recommend videos to viewers, whether it’s for content produced by terrorists or by cat lovers. Chief Justice John Roberts suggested what YouTube is doing isn’t “pitching something in particular to the person who’s made the request” but a “21st century version” of what has been taking place for a long time, putting together a group of things the person might want to look at.

Justice Clarence Thomas asked whether YouTube uses the same algorithm to recommend rice pilaf recipes and terrorist content. Yes, he was told.

Kagan noted that “every time anybody looks at anything on the internet, there is an algorithm involved,” whether it’s on Google, YouTube or Twitter. She asked Gonzalez family lawyer Eric Schnapper whether agreeing with him would make Section 230 meaningless.

Lower courts have broadly interpreted Section 230 to protect the industry, which the companies and their allies say has fueled the meteoric growth of the internet by protecting businesses from lawsuits over posts by users and encouraging the removal of harmful content.

But critics argue that the companies haven’t done nearly enough to police and moderate content and that the law should not block lawsuits over the recommendations that point viewers to more material that interests them and keeps them online longer.

Any narrowing of the companies’ immunity could have dramatic consequences across every corner of the internet, because websites use algorithms to sort and filter a mountain of data.

Lisa Blatt, representing Google, told the court that recommendations are just a way of organizing all that information. YouTube users watch a billion hours of videos daily and upload 500 hours of videos every minute, Blatt said.

Roberts, though, was among several justices who questioned Blatt about whether YouTube should have the same legal protection for its recommendations as for hosting videos.

“They appear pursuant to the algorithms that your clients have,” Roberts said. “And those algorithms must be targeted to something. And that targeting, I think, is fairly called a recommendation, and that is Google’s. That’s not the provider of the underlying information.”

Reflecting the complexity of the issue and the court’s seeming caution, Justice Neil Gorsuch suggested another factor in recommendations made by YouTube and others, noting that “most algorithms are designed these days to maximize profits.”

Gorsuch suggested that the court could send the case back to a lower court without weighing in on the extent of Google’s legal protections. He participated in arguments by phone because he was “a little under the weather,” Roberts said.

Several other justices indicated that arguments in a related case Wednesday might provide an avenue for avoiding the difficult questions raised Tuesday.

In that case, the court will hear about another terrorist attack: a 2017 shooting at an Istanbul nightclub that killed 39 people and prompted a lawsuit against Twitter, Facebook and Google.

Separate challenges to social media laws enacted by Republicans in Florida and Texas are pending before the high court, but they would not be argued before the fall or decided until the first half of 2024.
