Dr. Vivek Murthy, President Joe Biden’s surgeon general, renewed the administration’s attack on coronavirus misinformation Sunday, two days after The New York Times reported that Facebook had shelved a study showing that its most-viewed link during the first three months of the year was to an article that suggested a link between a COVID-19 vaccine and a Florida doctor’s death.
“The speed, scale and sophistication with which it is spreading and impacting our health is really unprecedented,” Murthy said of coronavirus misinformation during an appearance on CNN on Sunday. “And it’s happening largely, in part, aided and abetted by social media platforms.”
The Biden administration has aggressively and publicly pressured social media companies such as Facebook to share more data about false and misleading information on their platforms, and to tamp down its spread. Biden at one point accused Facebook of “killing people” by allowing false information to circulate widely, before later softening his position.
For his part, Murthy has issued a formal advisory in which he declared misinformation “an urgent threat” to public health.
Facebook — which has pushed back by publicly accusing the White House of scapegoating the company — this week released its first quarterly report about the most viewed posts in the United States for the quarter that includes April, May and June.
But the company released a similar report covering the first three months of the year only after The Times reported Friday that the report had been prepared and then shelved.
The report showed that the most viewed link on the platform was a news story with a headline suggesting that a coronavirus vaccine was at fault for the death of a Florida doctor. Misinformation peddlers used the article to question the safety of the COVID-19 vaccines on Facebook. It also revealed that a Facebook page for The Epoch Times, which routinely spreads misinformation, was among the 20 most popular pages on the social network.
Murthy’s remarks on the issue of misinformation and its spread came after he was asked about reports of people taking an anti-parasite drug in order to treat COVID-19. “It is costing us in terms of people’s health,” he said.
Asked specifically about Facebook’s disclosure of the popularity of the news article that was used to undermine confidence in the coronavirus vaccines, Murthy said it reinforced the fact that “there is a lot of misinformation circulating on these sites.”
“I will readily say that the sites have recognized that this is a challenge, and they’ve stepped up to do some things to reduce the spread of misinformation. And I credit them for that,” he said. “But it’s not nearly enough.”
“There are people who are superspreaders of misinformation,” he added. “And there are algorithms, still, which continue to serve up more and more misinformation to people who encounter it the first time. These are things that companies can and must change. And I think they have a moral responsibility to do so quickly and transparently.”
Executives at Facebook, including Mark Zuckerberg, its chief executive, have said the platform has been aggressively removing COVID-19 misinformation since the start of the pandemic.
This article originally appeared in The New York Times.