Facebook says it is helping reduce Covid vaccine 'hesitancy'
Facebook said Wednesday that vaccine "hesitancy" is declining in the United States and other countries, crediting its efforts to filter out misinformation and promote authoritative information for helping drive the trend.
In releasing its quarterly transparency report, Facebook said the latest data showed vaccine hesitancy is down by 50 percent among US users of the social network, with significant declines in other countries.
The news comes a month after a public spat between Facebook and the White House, sparked when President Joe Biden claimed the platform was "killing people" by allowing vaccine misinformation to spread.
Facebook said it removed some 20 million pieces of content, issued warnings for millions more, and blocked 3,000 accounts for violating its policies on Covid-19 misinformation, while at the same time connecting users with reliable sources of health information.
"We are focused on outcomes, which we believe are the right way to evaluate the end result," Facebook vice president for integrity Guy Rosen told reporters.
"For example, for people in the US on Facebook, vaccine hesitancy has declined by 50 percent and we've similarly seen vaccine assessments rising globally."
The data comes from a long-running survey of Facebook users conducted with Carnegie Mellon University and the University of Maryland.
"This is all movement in the right direction," he said.
Rosen noted that 18 million people have used Facebook's profile frames supporting vaccines.
"It's most important for people to see their friends and family supporting vaccinations," he said.
Facebook said its machine learning tools continue to make progress in filtering inappropriate content such as hate speech.
Rosen said the prevalence of violating content -- which Facebook claims is the best way to measure its effectiveness in filtering -- was just 0.05 percent in the second quarter of the year.
That translates to five pieces of inappropriate content per 10,000 views, he noted.
"Prevalence is our primary metric... it matters because it captures not what we took down but what we missed and what was ultimately seen by people," Rosen said.
Facebook did not include data on the prevalence of Covid-19 misinformation, saying this is a fast-evolving landscape.
"We now have more than 65 specific claims that we removed from our platforms around Covid-19 and vaccines because they are false and may contribute to the risk of imminent physical harm during the pandemic," said vice president for content policy Monika Bickert.
"We're continuing to add to this list as new trends emerge. For instance, in the past month, we added to our list claims that Covid-19 vaccines cause Alzheimer's, that the vaccines cause magnetism and that being around vaccinated people could cause secondary side effects to others."
Copyright AFP. All rights reserved.