Monday, March 18, 2024

Free speech or free rein? How Murthy v. Missouri became a soapbox for misinformation advocacy

  Today the U.S. Supreme Court will hear oral arguments in Murthy v. Missouri, originally filed as Missouri v. Biden. The case is emblematic of broader debates over the government’s role in regulating online platforms and the First Amendment’s protections for online speech. The plaintiffs—the states of Missouri and Louisiana, along with five social media users—allege that governmental communication with social media platforms regarding concerns about COVID-19 misinformation and election interference amounted to coercion, violating the First Amendment.

  The First Amendment safeguards free speech by prohibiting government censorship and undue influence on individuals and private sector entities, including social media. Concerns arise when government actions, such as threats or pressure, coercively sway social media companies to remove or censor content. However, far from being coerced into censorship, social media companies have actively sought collaborations with government entities and have organized themselves to share critical information in combating foreign interference in U.S. elections and addressing misinformation. This proactive stance by social media platforms signals a clear demand for information sharing and underscores a collaborative effort to navigate the complexities of moderating content that could harm public welfare.

  Murthy v. Missouri makes its way to the Supreme Court after an extreme decision by Judge Terry Doughty in the U.S. District Court for the Western District of Louisiana. His ruling initially barred a wide range of federal government entities from engaging in any form of communication with social media platforms, particularly targeting issues such as the COVID-19 pandemic, election misinformation, and foreign malign influence threats—any action taken by or at the direction of foreign actors or their proxies. The U.S. Court of Appeals for the 5th Circuit affirmed in part Doughty’s injunction on government communication with the platforms, narrowing its application to the White House, the FBI, the Centers for Disease Control and Prevention (CDC), the Office of the Surgeon General, and the Cybersecurity and Infrastructure Security Agency (CISA). The Supreme Court has paused the injunction until the justices hear arguments on the matter.

  The underlying case was filed before Doughty, a Trump appointee who was assigned 90 percent of the cases in his division and has been a favorite jurist of extreme right-wing interests seeking to strike down progressive policies, such as COVID-19 mandates for Head Start child care workers and the Biden administration’s pause on oil and gas leases on federal land; the 5th Circuit later reversed the latter ruling. Though the government never threatened any official action against the platforms, Doughty blocked all communication and information sharing between the platforms and federal agencies, labeling the government’s efforts to stem mis- and disinformation an “Orwellian ‘Ministry of Truth’.”

  Though the injunction is on hold, it has already created confusion on both the government’s and companies’ sides regarding content moderation collaboration. For example, the U.S. Department of State canceled its monthly meeting with Facebook officials to discuss 2024 election preparations and hacking threats, citing the need for further guidance. The chair of the Senate Select Committee on Intelligence, Sen. Mark Warner (D-VA), highlighted a number of dangers this injunction could pose from a national security standpoint in the amicus brief he filed with the Supreme Court and in his recent interview with the Center for American Progress. Specifically, Warner cautioned that government-platform collaboration is a key tool in combating foreign malign influence operations—whether related to elections or other issues—that would be compromised if the court upholds the bar on communication. This is especially concerning as the swift rise of artificial intelligence is posing an “unprecedented threat” of mis- and disinformation to American democracy in this presidential election year.

The who, what, and how of Murthy v. Missouri


  The state attorneys general of Missouri and Louisiana and five social media users filed the lawsuit. Two of the plaintiffs are notable for their seemingly broader roles in disseminating mis- and disinformation through social media. Jim Hoft is the founder of the Gateway Pundit, a far-right website known for spreading debunked conspiracy theories; Facebook flagged the site in 2019 as a “common misinfo offender,” and it has been at the center of multiple lawsuits for allegedly engaging in “the deliberate spread of dangerous and inflammatory political disinformation designed to sow distrust in democratic institutions.” Another plaintiff, Jill Hines, is an anti-vaccine advocate who was featured in the debunked film “Vaxxed” and is part of the anti-vaccine movement that social media platforms have been combating for years. Notably, Hoft and Hines have filed a separate lawsuit against Stanford University and internet researchers seeking to bar them from communicating with the government and social media companies about misinformation on the internet.


  As noted above, the 5th Circuit revised the U.S. District Court injunction but kept in place significant prohibitions on communication between social media companies and the White House, the Office of the Surgeon General, the CDC, CISA, and the FBI. Specifically, the injunction barred the federal agencies, in broad and undefined terms, from two key actions:

  1. They may not “coerce” or “significantly encourage” social media platforms to make content moderation decisions.
  2. They may not “meaningfully control” social media platforms’ content moderation processes.

  By imposing such restrictions, the 5th Circuit’s decision could hamper the government’s ability to collaborate with platforms in identifying and mitigating harmful content that poses a threat to public safety and democracy.


  If the Supreme Court upholds this broad and ill-defined injunction, it could create confusion on both the government’s and companies’ sides regarding content moderation collaboration. This confusion might deter the dialogue necessary to identify and mitigate harmful content, affecting how social media platforms manage content; this may be especially true for disinformation from foreign malign actors who seek to sow division in American society and undermine U.S. democracy, as detailed in Warner’s amicus brief.

  For example, the injunction could lead to a scenario where federal agencies cannot share critical information with platforms regarding national security threats and election integrity. The injunction’s failure to clearly define what constitutes permissible and impermissible government communications with social media platforms only exacerbates this problem. This ambiguity leaves both government officials and platforms uncertain of the legal boundaries for their interactions.


  Without concrete guidelines, the risk of unintentionally crossing the boundary from lawful information sharing or persuasion to unlawful coercion or encouragement becomes a tangible concern for government officials. This lack of clarity may lead to an overly cautious approach, where government agencies refrain from engaging in any communication that could potentially be misconstrued as coercive or overly influential—even when such communication is intended to serve the public interest—increasing vulnerability to mis- and disinformation and national security threats. Furthermore, if the injunction is upheld, it would set a precedent that may prompt similar constraints on the government’s voluntary interactions with private entities in other sectors, such as financial institutions or critical infrastructure like utilities.

  About the authors: Nicole Alvarez is a senior policy analyst for technology policy at the Center for American Progress. Devon Ombres is the senior director for courts and legal policy at American Progress.

  This article was published by the Center for American Progress.
