Holding YouTube liable for promoting terrorism is a bad idea

The Supreme Court next month will consider whether Google, which owns YouTube, can be sued for helping the terrorist group ISIS promote its message and attract followers.


Every day, people around the world post about 720,000 hours of new content on YouTube — 500 hours of video every minute. That enormous volume of material poses challenges for the platform, which aspires to enforce rules against certain kinds of content, and for its users, who cannot hope to navigate the site without help from YouTube’s algorithms, which facilitate searches and recommend videos based on personal viewing patterns.

Those challenges underlie a case that the Supreme Court will hear next month, when it will consider whether Google, which owns YouTube, can be sued for helping the terrorist group ISIS promote its message and attract followers. The case illustrates the hazards of increased civil liability for social media companies, which critics on the right and the left wrongly see as the key to better moderation practices.

Since 1996, federal law has shielded websites from most kinds of civil liability for content posted by users. Under 47 USC 230, “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Section 230 also protects “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” These two kinds of immunity aim to avoid potentially crippling litigation that would impede the availability of user-generated information and deter content moderation, making the internet as we know it impossible.

In 2021, the U.S. Court of Appeals for the 9th Circuit ruled that Section 230 barred a lawsuit against Google by the family of Nohemi Gonzalez, a 23-year-old U.S. citizen who was killed in a 2015 ISIS attack while studying in Paris. The plaintiffs originally argued that Google was liable under the Anti-Terrorism Act for allowing ISIS videos to remain on YouTube and for increasing exposure to them through its “up next” feature, which suggests videos similar to ones users have watched.

On appeal to the Supreme Court, Gonzalez’s family concedes that Section 230 means Google, which bans YouTube videos “intended to praise, promote, or aid violent extremist or criminal organizations,” cannot be sued for failing to fully enforce that policy. But the plaintiffs argue that the company can be sued for pointing users to such videos when they view similar content, and the Biden administration agrees.

In response, Google notes that “algorithmic tools — from search rankings and content recommendations to email spam-filtering — are indispensable to a functional internet.” Google argues that there is no defensible distinction between YouTube’s “up next” feature and other algorithms that enable internet users to sort through an “unimaginably vast” amount of material to find relevant and useful information.

If YouTube’s “algorithmic tools” expose Google to liability for content it did not create, in other words, every provider of an “interactive computer service” will have to worry about the legal risk of guiding users through a massive morass of material that would otherwise be unmanageable. This is just one facet of a broader problem with making it easier to sue websites over third-party content.

President Joe Biden thinks repealing Section 230 would make it possible to “hold social media platforms accountable for spreading hate and fueling violence.” Republican politicians like Sen. Roger Wicker, R-Miss., meanwhile, complain that Section 230 allows those platforms to discriminate against conservatives with impunity.

The fact that two sets of critics blame Section 230 for either too little or too much content moderation suggests something is wrong with their reasoning. In reality, the First Amendment protects both “hate speech” and editorial discretion.

Repealing Section 230 would not change that. But the resulting litigation would force platforms, especially those without the resources to battle a flood of lawsuits, to choose between much more heavy-handed content moderation and none at all — a situation that neither Biden nor Wicker would welcome.

Jacob Sullum is a senior editor at Reason magazine.
