The Biden Administration Is Pushing Social Media Platforms To Expand Their Definition of Intolerable COVID-19 ‘Misinformation’
A New York Times story about the "rift" between Facebook and the Biden administration regarding COVID-19 "misinformation" illustrates the fuzziness of that category and the perils of suppressing it at the government's behest. While administration officials often claim they are merely encouraging the social media platform to enforce its own rules, their idea of misinformation is not necessarily the same as Facebook's. That gap shows the government is imposing online censorship by proxy, pushing to expand the definition of intolerable speech.
“We’ve engaged with Facebook since the transition on this issue,” White House spokesman Mike Gwin tells the Times, “and we’ve made clear to them when they haven’t lived up to our, or their own, standards and have actively elevated content on their platforms that misleads the American people.” Since the Biden administration has the power to make life difficult for social media companies by pursuing litigation, writing regulations, and supporting new legislation, Facebook et al. have a strong incentive to follow the government’s “standards” rather than its own.
“Facebook told White House officials that it grappled with content that wasn’t explicitly false, such as posts that cast doubt about vaccines but don’t clearly violate the social network’s rules on health misinformation,” the Times says. “Facebook allows people to express their experiences with vaccines, such as pain or side effects after receiving a shot, as long as they don’t explicitly endorse falsehoods.”
D.J. Patil, the chief technology officer for Biden’s transition team, had no patience with that distinction. “Seriously?” Patil texted “the Biden team” during one video call. “We have to get past the talking points. People are literally dying.” That conviction culminated in Biden’s July 16 charge that Facebook et al. are “killing people” by failing to police speech the way he thinks they should.
Surgeon General Vivek Murthy, in his July 15 advisory calling for a “whole-of-society” effort to combat the “urgent threat to public health” posed by “health misinformation,” made it clear that the administration expects social media platforms to suppress statements that it deems “misleading,” even when they might be true. “Claims can be highly misleading and harmful even if the science on an issue isn’t yet settled,” he said.
If a Facebook user says we don’t yet have data on the long-term side effects of COVID-19 vaccines, for example, that would be true but unhelpful and therefore probably would count as “misleading” in Murthy’s book. Likewise if someone emphasizes that the vaccines have not yet been fully approved by the Food and Drug Administration.
A Times story about "virus misinformation" further illustrates the point. According to Zignal Labs, which provided the data underlying the paper's report that "coronavirus misinformation has spiked online in recent weeks," that category includes claims that "vaccines don't work" and that "they con
Article from Latest – Reason.com