WASHINGTON – In March, as claims about the dangers and ineffectiveness of coronavirus vaccines circulated on social media and undermined efforts to stop the virus from spreading, some Facebook employees thought they had found a way to help.
By changing how vaccine-related posts were ranked in people’s news feeds, company researchers realized they could reduce the misleading COVID-19 vaccine information users saw and instead offer them posts from legitimate sources such as the World Health Organization.
“Given these results, I guess we hope to get it up and running as soon as possible,” wrote one Facebook employee in response to an internal research note.
Instead, Facebook shelved some of the study’s suggestions. Other changes weren’t made until April.
When another Facebook researcher suggested disabling comments on vaccine posts in March until the platform could do a better job of tackling the anti-vaccine messages lurking in them, that proposal was ignored.
Critics say the reason Facebook was slow to act on these ideas is simple: the tech giant worried the changes might hurt its bottom line.
“Why don’t you delete the comments? Because engagement is the only thing that matters,” said Imran Ahmed, CEO of the Center for Countering Digital Hate, an internet watchdog group. “It gets attention, and attention equals ad revenue.”
In a statement sent by email, Facebook said it has made “significant progress” this year in reducing the level of misinformation about vaccines in user feeds.
The internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by the legal counsel of former Facebook employee turned whistleblower Frances Haugen. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.
The trove of documents reveals that in the midst of the COVID-19 pandemic, Facebook carefully investigated how its platforms spread misinformation about life-saving vaccines. It also shows rank-and-file employees regularly suggesting solutions for countering anti-vaccine content, to no avail. The Wall Street Journal reported on some of Facebook’s efforts to tackle anti-vaccine comments last month.
Facebook’s response raises questions about whether the company prioritized controversy and division over the health of its users.
“These people are selling fear and resentment,” said Roger McNamee, a Silicon Valley venture capitalist and early Facebook investor who is now a vocal critic. “This is not an accident. This is a business model.”
Typically, Facebook ranks posts by engagement: the total number of likes, dislikes, comments and reshares. That ranking scheme can work well for innocuous subjects like recipes, dog photos or the latest viral song. But Facebook’s own documents show that when it comes to divisive public health issues like vaccines, engagement-based ranking only emphasizes polarization, division and doubt.
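To make the difference concrete, here is a minimal sketch of the two ranking approaches in Python; the Post fields, the 0-to-1 credibility score, and the weighting are illustrative assumptions for this article, not Facebook’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    comments: int
    reshares: int
    credibility: float  # hypothetical 0-1 trustworthiness signal, e.g. from fact-check ratings

def engagement_score(post: Post) -> float:
    # Engagement-based ranking: raw popularity decides placement,
    # regardless of whether the content is accurate.
    return post.likes + post.comments + post.reshares

def trust_weighted_score(post: Post) -> float:
    # Reliability-based ranking, in the spirit of the experiment described
    # below: engagement is discounted by source credibility, so posts from
    # established sources rise and debunked claims sink.
    return engagement_score(post) * post.credibility

feed = [
    Post("Viral anti-vaccine claim", likes=900, comments=400, reshares=300, credibility=0.1),
    Post("WHO vaccine guidance", likes=200, comments=50, reshares=80, credibility=0.95),
]

print([p.title for p in sorted(feed, key=engagement_score, reverse=True)])
# ['Viral anti-vaccine claim', 'WHO vaccine guidance']
print([p.title for p in sorted(feed, key=trust_weighted_score, reverse=True)])
# ['WHO vaccine guidance', 'Viral anti-vaccine claim']
```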
To explore ways to reduce vaccine misinformation, Facebook researchers changed how posts were ranked for more than 6,000 users in the United States, Mexico, Brazil and the Philippines. Instead of seeing posts about vaccines chosen based on their popularity, these users saw posts selected for their trustworthiness.
The results were striking: a nearly 12% decrease in content that made claims debunked by fact-checkers, and an 8% increase in content from authoritative public health organizations such as the WHO or the US Centers for Disease Control and Prevention. Those users also had a 7% decrease in negative interactions on the site.
Company employees responded to the study with enthusiasm, according to internal reports included in the whistleblower’s documents.
“Is there a reason why we won’t do this?” one Facebook employee wrote in response to an internal memo describing how the platform could curb anti-vaccine content.
Facebook said it did implement many of the study’s findings, but not for another month, a delay that came at a pivotal stage of the global vaccine rollout.
In a statement, company spokeswoman Dani Lever said the internal documents “do not reflect the significant progress we have made since then in promoting reliable information about COVID-19 and expanding our policies to remove the more dangerous COVID and vaccine misinformation.”
The company also said it took a while to review and implement the changes.
Yet the need for urgent action could not have been more obvious: at the time, US states were rolling out vaccines to their most vulnerable residents, the elderly and the sick. And health officials were worried. Only 10% of the population had received a first dose of a COVID-19 vaccine, and a third of Americans were leaning toward skipping the shot entirely, according to a poll by The Associated Press-NORC Center for Public Affairs Research.
Still, Facebook employees acknowledged they had “no idea” just how bad anti-vaccine sentiment was in the comment sections of Facebook posts. Company research in February found that as much as 60% of the comments on vaccine posts were anti-vaccine or vaccine hesitant.
“This is a huge problem and we need to fix it,” the March 9 presentation said.
Worse, company employees admitted they didn’t have a good way of catching those comments, and even if they did, Facebook had no policy in place for taking them down. The free-for-all allowed users to swarm vaccine posts from news outlets or humanitarian organizations with negative comments about vaccines.
“Our ability to detect (vaccine hesitancy) in English-language comments is poor, and virtually nonexistent elsewhere,” another internal memo, posted March 2, said.
Derek Beres, a Los Angeles-based writer and fitness instructor, sees anti-vaccine content thrive in the comments every time he promotes vaccinations on his accounts on Facebook-owned Instagram. Last year, Beres started a podcast with friends after they noticed conspiracy theories about COVID-19 and vaccines circulating on the social media feeds of popular health and wellness influencers.
Earlier this year, when Beres posted a photo of himself receiving the COVID-19 vaccine, some on social media told him he would likely die within six months.
“The comments section is a dumpster fire for so many people,” Beres said.
The anti-vaccine comments on Facebook got so bad that even prominent public health agencies like UNICEF and the World Health Organization, as they urged people to take the vaccine, declined to use the free advertising Facebook had given them to promote vaccination, according to the documents.
Some Facebook employees had another idea: while the company worked out a plan to curb anti-vaccine sentiment in comments, why not turn off commenting on vaccine posts altogether?
“Very interested in your proposal to remove ALL in-line comments on vaccine posts as a stopgap until we can sufficiently detect vaccine hesitancy in comments to refine our removal targeting,” a Facebook employee wrote on March 2.
The suggestion went nowhere.
Instead, Facebook CEO Mark Zuckerberg announced on March 15 that the company would start labeling posts about vaccines that described them as safe.
The move allowed Facebook to continue reaping high engagement, and ultimately profit, from anti-vaccine comments, according to Ahmed of the Center for Countering Digital Hate.
“They tried to find ways not to reduce engagement while creating the impression that they were taking steps to fix the problems they had caused,” he said.
“It’s unrealistic to expect a multibillion-dollar company like Facebook to voluntarily change a system that has proven to be so profitable,” said Dan Brahmy, CEO of Cyabra, an Israeli tech firm that analyzes social media and disinformation. Brahmy said government regulation may be the only thing that could force Facebook to act.
“The reason they didn’t do it is because they didn’t have to,” Brahmy said. “If it hurts the bottom line, it’s problematic.”
Bipartisan legislation in the US Senate would require social media platforms to give users the option of turning off the algorithms tech companies use to organize individuals’ news feeds.
Sen. John Thune of South Dakota, a sponsor of the bill, asked Facebook whistleblower Haugen to describe the dangers of engagement-based ranking during her appearance before Congress earlier this month.
She said there are other ways to rank content, such as by the quality of the source or in chronological order, that would serve users better. Facebook won’t consider them, she said, because they would reduce engagement.
“Facebook knows that when they choose content … we spend more time on their platform, they make more money,” Haugen said.
The documents Haugen leaked also show that a relatively small number of Facebook’s anti-vaccine users were rewarded with huge page views under the platform’s current ranking system.
Internal Facebook research presented on March 24 warned that most of the “problematic vaccine content” was coming from a handful of areas on the platform. In the Facebook communities where vaccine distrust was highest, the report pegged half of all anti-vaccine page views on just 111 accounts, or 0.016% of accounts.
“Top producers are mostly users serially posting (vaccine hesitancy) content to feeds,” the study found.
That same day, the Center for Countering Digital Hate published an analysis of social media posts which found that just a dozen Facebook users were responsible for 73% of anti-vaccine posts on the platform between February and March. It was a study Facebook’s leaders publicly dismissed in August as “flawed,” even though internal research completed months earlier had confirmed that a small number of accounts was fueling anti-vaccine sentiment.
Earlier this month, an AP-NORC poll found that most Americans blame social media sites like Facebook and their users for disinformation.
But Ahmed said Facebook shouldn’t get off with just a share of the blame for the problem.
“Facebook made decisions that led people to receive disinformation that caused them to die,” Ahmed said. “At this stage there should be a murder investigation.”
Seitz reported from Columbus, Ohio.