Biden administration went too far against Facebook over 'health misinformation'

If the Supreme Court rules for the government in Murthy v. Missouri, the ruling would empower the government to define “misinformation” and require its removal from social media. The First Amendment plainly forbids that.

After Biden administration officials persuaded Facebook to take down posts containing health misinformation, the states of Missouri and Louisiana sued. The case, Murthy v. Missouri, was argued before the Supreme Court this week.

When federal officials persistently pressured social media platforms to delete or downgrade posts those officials did not like, a government lawyer told the Supreme Court on Monday, they were merely offering “information” and “advice” to their “partners” in fighting “misinformation.” If the justices accept that characterization, they will be blessing clandestine government censorship of online speech.

The case, Murthy v. Missouri, pits two states and five social media users against federal officials who strongly, repeatedly and angrily demanded that Facebook et al. crack down on speech the government viewed as dangerous to public health, democracy or national security. Some of this “exhortation,” as U.S. Deputy Solicitor General Brian Fletcher described it, happened in public, as when President Joe Biden accused the platforms of “killing people” by allowing users to say things he believed would discourage Americans from being vaccinated against COVID-19.

Surgeon General Vivek Murthy, who echoed that charge in more polite terms, urged a “whole-of-society” effort to combat the “urgent threat to public health” posed by “health misinformation,” which he said might include “legal and regulatory measures.” Other federal officials said holding social media platforms “accountable” could entail antitrust action, new regulations or expansion of their civil liability for user-posted content.

Those public threats were coupled with private communications that came to light only thanks to discovery in this case. As Louisiana Solicitor General J. Benjamin Aguiñaga noted Monday, officials such as Deputy Assistant to the President Rob Flaherty “badger[ed] the platforms 24/7,” demanding that they broaden their content restrictions and enforce them more aggressively.

Those emails alluded to presidential displeasure and warned that White House officials were “considering our options on what to do” if the platforms failed to fall in line. The platforms responded by changing their policies and practices.

Appeasing the president

Facebook executive Nick Clegg was eager to appease the president. In emails to Murthy, he noted that Facebook had “adjust[ed] policies on what we’re removing”; had deleted pages, groups and accounts that offended the White House; and would “shortly be expanding our COVID policies to further reduce the spread of potentially harmful content.”

Facebook took those steps, Clegg said in another internal email that Aguiñaga quoted, “because we were under pressure by the administration.” Clegg expressed regret about caving to that pressure, saying, “We shouldn’t have done it.”

According to Fletcher, none of this implicated the First Amendment, because “no threats happened.” He meant that federal officials never explicitly threatened platforms with “adverse government action” while urging suppression of constitutionally protected speech.

That position is hard to reconcile with the Supreme Court’s 1963 decision in Bantam Books v. Sullivan. In that case, the Court held that Rhode Island’s Commission to Encourage Morality in Youth had violated the First Amendment by pressuring book distributors to drop titles it deemed objectionable.

Notably, the commission itself had no enforcement authority, and at least some of the books it flagged did not meet the Supreme Court’s test for obscenity, meaning the distributors were not violating any law by selling them. The Court nevertheless concluded that the commission’s communications, which ostensibly sought voluntary “cooperation” but were “phrased virtually as orders,” were unconstitutional because they aimed to suppress disfavored speech and had that predictable result.

The Biden administration’s social media meddling bears a strong resemblance to that situation. But Fletcher argued that federal officials were simply using “the bully pulpit” to persuade platforms that they had a “responsibility” to curtail dangerous speech.

“Pressuring platforms in back rooms shielded from public view is not using the bully pulpit at all,” Aguiñaga noted. “That’s just being a bully.”

Free Press, an inaptly named organization that aims to promote “positive social change, racial justice and meaningful engagement in public life,” warns that a ruling against the government “could allow social-media platforms to leave up misinformation.” In other words, a ruling for the government would empower it to define “misinformation” and require its removal — something the First Amendment plainly forbids.

Jacob Sullum is a senior editor at Reason magazine.

Send letters to letters@suntimes.com.
