Facebook continues its war on fake news with new measures
The social networking giant keeps improving the health of its platforms.
In the latest tacit admission that it needs to be an "arbiter of truth," Facebook (NASDAQ:FB) just unveiled more steps it is taking to curb the spread of misinformation and fake news on its platform. The company hosted a small group of journalists at its headquarters, where it further clarified its "Remove, Reduce, Inform" strategy, which it has been using since 2016. The world's largest social networking company has grappled with ongoing controversies and criticism that it inadvertently facilitates the spread of misinformation, and it has started cracking down on malicious actors.
Here's what investors and Facebook users need to know.
Removing content that violates Facebook's policies
Facebook's policies have long outlined certain types of content that are prohibited, such as harassment, hate speech, and bullying. The challenge has always been enforcing these policies at scale, considering that 2.7 billion users access at least one of the company's core services every month. Facebook says it will use a "combination of the latest technology, human review and user reports" to step up its enforcement and remove harmful groups, regardless of whether those groups are public or private.
The company notes that it is improving its ability to proactively detect harmful content, echoing a point CEO Mark Zuckerberg has made numerous times. "The most important work here is to keep executing our road map to build systems that can proactively identify harmful content so we can act on it sooner," Zuckerberg said in January.
Facebook will roll out a new section of its Community Standards site that allows users to see updates and changes more easily. It will also hold administrators of offending groups more accountable for violations: a new feature called Group Quality will summarize violating content that has been removed or flagged, as well as call out misinformation spread within the group.
Reducing the reach of fake news
Facebook has been partnering with third-party professional fact-checkers but notes that its partners "face challenges of scale." One idea the company is exploring is having users identify journalistic sources that either "corroborate or contradict" claims. This system has clear potential for abuse, which Facebook says it will try to combat with safeguards it has not yet detailed.
The Associated Press is a renowned news organization and fact-checker that Facebook partnered with back in 2016, and the two are expanding that partnership by having AP debunk false or misleading information on the platform. That work will now include Spanish-language content.
Facebook is also tweaking its News Feed algorithm to reduce the reach of any content that has been flagged as false by independent fact-checkers. Some sites receive far more traffic from Facebook than the rest of the web links to them, a gap that is often indicative of low-quality content. Facebook will begin measuring this discrepancy, which it calls Click-Gap, and incorporating it into the News Feed algorithm as well.
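To make that signal concrete, here is a minimal, purely illustrative Python sketch of how a Click-Gap-style metric could be computed. Facebook has not published its actual implementation; the function names, data fields, and threshold below are assumptions made for illustration only.

```python
# Purely illustrative sketch of a Click-Gap-style signal; Facebook has not
# published its implementation. Field names and the threshold are assumptions.

def click_gap_score(facebook_clicks: int, web_inbound_links: int) -> float:
    """Ratio of Facebook-driven clicks to inbound links from the wider web.

    A high score means a domain draws far more attention on Facebook than
    the rest of the web gives it -- a possible low-quality signal.
    """
    # Add 1 to the denominator so domains with zero inbound links
    # still yield a finite score.
    return facebook_clicks / (web_inbound_links + 1)


def should_demote(domain_stats: dict, threshold: float = 100.0) -> bool:
    """Flag a domain for reduced News Feed distribution when the gap is large."""
    score = click_gap_score(
        domain_stats["facebook_clicks"],
        domain_stats["web_inbound_links"],
    )
    return score > threshold


# Example: heavy Facebook traffic but almost no inbound links elsewhere.
stats = {"facebook_clicks": 50_000, "web_inbound_links": 40}
print(should_demote(stats))  # True -- a candidate for demotion
```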
Informing users
Facebook recently added a Context Button to text-based posts and will soon add a similar feature to images, since misinformation often goes viral in the form of visual memes. This feature is still being tested in the U.S. Facebook will also add new Trust Indicators, which provide more information about a particular news organization, to Context Buttons. It will also start adding more information to Page Quality tabs and will let users who have left a group remove the content they posted to it.
Messenger will get Verified Badges in an effort to reduce fraudulent activity, and users will gain more control over blocking unwanted contacts.
Facebook may not be the entity ultimately deciding what qualifies as misinformation and fake news, but it is now taking greater responsibility by providing tools to help its massive user base better discern truth from lies.
This article originally appeared in The Motley Fool.
Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to its CEO, Mark Zuckerberg, is a member of The Motley Fool's board of directors. Evan Niu, CFA owns shares of Facebook. The Motley Fool owns shares of and recommends Facebook. The Motley Fool has a disclosure policy.