Facebook is in the crosshairs of the US government again. The White House has criticized the social network's role in the spread of COVID-19 vaccine misinformation, which it sees as a key barrier to the nation's recovery from the pandemic.
US President Joe Biden told a reporter on Friday that Facebook has allowed vaccine conspiracy theories to spread.
Reporter: “On Covid misinformation, what’s your message to platforms like Facebook?”
Biden: “They’re killing people” pic.twitter.com/SsSksFzytZ
— Bloomberg Quicktake (@Quicktake) July 16, 2021
A day before the interview, the White House noted its regular communication with social media platforms to inform them of the latest narratives that pose a public health danger.
“We work to engage with them to better understand the enforcement of social media platform policy,” says White House press secretary Jen Psaki.
Facebook responded quickly to Biden's remarks. A spokesperson told ABC News that the company will not bend to unfounded accusations.
“At a time when COVID-19 cases are rising in America, the Biden administration has chosen to blame a handful of American social media companies. While social media plays an important role in society, it is clear that we need a whole of society approach to end this pandemic. And facts – not allegations – should help inform that effort. The fact is that vaccine acceptance among Facebook users in the US has increased. These and other facts tell a very different story to the one promoted by the administration in recent days,” adds Facebook in an official response titled Moving Past the Finger Pointing.
The post highlights studies showing how Facebook has addressed vaccine hesitancy. The social network says its users are less resistant to the vaccine because of its initiatives, which contradicts Biden's remarks.
Despite the administration's ongoing claims, academic research shows no conclusive link between sharing on Facebook and the rise in vaccine hesitancy.
Facebook has been active in dismissing its role in COVID-19 vaccine misinformation, arguing that polarizing, extremist content is bad for its business.
“All social media platforms, including but not limited to ours, reflect what is happening in society and what’s on people’s minds at any given moment. This includes the good, the bad, and the ugly. For example, in the weeks leading up to the World Cup, posts about soccer will naturally increase – not because we have programmed our algorithms to show people content about soccer but because that’s what people are thinking about. And just like politics, soccer strikes a deep emotional chord with people. How they react – the good, the bad, and the ugly – will be reflected on social media,” explains Facebook.
Facebook VP of Global Affairs Nick Clegg made a similar argument back in March:
“The goal is to make sure you see what you find most meaningful – not to keep you glued to your smartphone for hours on end. You can think about this sort of like a spam filter in your inbox: it helps filter out content you won’t find meaningful or relevant, and prioritizes content you will.”
“For example, Facebook demotes clickbait (headlines that are misleading or exaggerated), highly sensational health claims (like those promoting “miracle cures”), and engagement bait (posts that explicitly seek to get users to engage with them).”
Clegg said Facebook changed its News Feed algorithm back in 2018, giving more weight to updates from your close connections and groups than to content from the Pages you follow.
Facebook claims it has nothing to gain from sensationalized content and far-fetched conspiracy theories, and that it penalizes such sensationalism.
But the broader evidence undermines Facebook's stance.
Facebook has a role
The New York Times reported last week that Facebook is changing how its data analytics platform works and has restricted public access to its insights. The data that is available shows far more engagement with far-right content and misinformation than with more balanced coverage and reporting.
A 2019 study by MIT found that false news stories on Twitter are 70% more likely to be retweeted than fact-based content.
Further research has found that, as a psychological response, belongingness and a sense of community solidify groups built on lies and misinformation.
“When you post things [on social media], you’re highly aware of the feedback that you get, the social feedback in terms of likes and shares. So when misinformation appeals to social impulses more than the truth does, it gets more attention online, which means people feel rewarded and encouraged for spreading it,” explains Yale University social psychologist William J. Brady.
In this regard, Facebook is right to blame human nature as the culprit. But platforms like Facebook fail to mention that they give these people the medium to share and the incentive to keep posting.
The more time Facebook users spend on the platform, the more money the social network generates.
The US government is right to investigate further.