Meta Drops Fact Checkers: What It Means for Trump's Presidency, Misogyny and Gender Bias
The move has gained the approval of Republicans and Elon Musk, but the Full Fact CEO warns about "chilling" consequences.
Meta is shifting away from independent fact-checkers on Facebook and Instagram in favour of a "community notes" system, where users themselves can assess the accuracy of posts. This controversial change has sparked debates about its potential impact on misinformation and the democratic process.
What Is Changing?
Meta's fact-checking programme, launched in 2016, relied on independent organisations to evaluate posts flagged as false or misleading. Such posts were labelled to provide clarity and often deprioritised in users' feeds. However, Meta is now transitioning to a community-driven system in the U.S., similar to the approach adopted by X (formerly Twitter) under Elon Musk.
This new system allows individuals with diverse perspectives to add context or clarifications to contentious posts. While Meta has confirmed that it has "no immediate plans" to phase out third-party fact-checkers in the UK or EU, the shift marks a significant change in its global strategy.
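The "diverse perspectives" requirement is the core of how community-notes systems work on X: a note is published only when raters who normally disagree both find it helpful. Meta has not published the details of its version, so the sketch below is purely illustrative; the cluster labels, thresholds, and function names are hypothetical, not Meta's or X's actual algorithm.

```python
# Toy sketch (hypothetical, NOT Meta's or X's real ranking code): publish a
# community note only when raters from at least two differing viewpoint
# clusters independently rate it helpful.
from dataclasses import dataclass

@dataclass
class Rating:
    rater_cluster: str   # e.g. "left" / "right" -- illustrative labels only
    helpful: bool

def note_is_shown(ratings, min_per_cluster=2):
    """Require 'helpful' votes from at least two distinct rater clusters."""
    helpful_by_cluster = {}
    for r in ratings:
        if r.helpful:
            helpful_by_cluster[r.rater_cluster] = (
                helpful_by_cluster.get(r.rater_cluster, 0) + 1
            )
    # Count clusters that cleared the per-cluster threshold.
    qualifying = [c for c, n in helpful_by_cluster.items() if n >= min_per_cluster]
    return len(qualifying) >= 2

# A note backed only by one cluster stays hidden; cross-cluster support shows it.
one_sided = [Rating("left", True), Rating("left", True), Rating("left", True)]
bridging = [Rating("left", True), Rating("left", True),
            Rating("right", True), Rating("right", True)]
print(note_is_shown(one_sided))  # False
print(note_is_shown(bridging))   # True
```

The real systems use far more sophisticated scoring (X's published approach models rater viewpoints from past rating patterns rather than fixed labels), but the bridging principle is the same: agreement across divides, not raw vote counts, decides what is shown.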
In response to concerns regarding sensitive topics such as self-harm and mental health, Meta reassured users that there would be "no change to how we treat content that encourages suicide, self-injury, and eating disorders."
The move has already drawn criticism. Chris Morris, CEO of fact-checking organisation Full Fact, described the change as "a disappointing and backwards step that risks a chilling effect around the world."
Support from Trump and Republicans
The changes have been met with approval from President-elect Donald Trump, who has often accused Meta of silencing right-wing voices. Meta CEO Mark Zuckerberg, in a recent video, explained that independent moderators were perceived as "too politically biased" and emphasised the company's commitment to free expression.
Zuckerberg stated that the updated approach would create a more "personalised" experience and reduce mistaken censorship, noting that roughly one in ten of the posts Meta currently removes is taken down in error.
Joel Kaplan, Meta's global affairs chief and a prominent Republican, acknowledged the good intentions behind independent moderation but admitted it had resulted in unintended censorship.
Trump praised Zuckerberg's presentation and commended Meta for its progress, stating that the platform had "come a long way."
Backlash Against Meta's Decision
Critics argue that this move aligns Meta with Trump's agenda. Ava Lee of Global Witness called the decision "a blatant attempt to cosy up to the incoming Trump administration," warning that it could allow platforms to avoid accountability for hate speech and misinformation.
Concerns have also been raised about the rampant spread of misinformation on Facebook during the 2024 U.S. elections. Trump's campaign used AI-generated content to sway voters, including images falsely portraying him surrounded by African American supporters.
AI and Misinformation in Elections
AI-generated misinformation surged during the 2024 elections, confusing voters, particularly older users who are less accustomed to spotting manipulated images or videos. One widely circulated image falsely showed Taylor Swift fans wearing "Swifties for Trump" T-shirts, suggesting her endorsement of Trump's campaign. The image, created by the John Milton Freedom Foundation, was debunked, but not before it went viral.
Trump also shared fabricated AI images of Swift endorsing him, despite her public support for Democratic nominee Kamala Harris. These incidents underscore how unchecked misinformation can shape public perception and undermine democratic integrity.
Musk's Influence and X's Misinformation Woes
Meta's pivot towards community notes mirrors Musk's changes on X, which have also been criticised for enabling misinformation. X's lax policies have allowed misleading content about women and politics to proliferate.
For example, during the 2024 Paris Olympics, Algerian boxer Imane Khelif became the target of baseless claims about her gender on X. High-profile figures like Musk and J.K. Rowling amplified these falsehoods, leading to cyberbullying and harassment.
Despite evidence affirming Khelif's biological sex, the platform failed to address the misinformation, prompting Khelif to file a legal complaint. This case highlights the dangers of unchecked misinformation and the potential harm it can cause to individuals.
Implications of Meta's New Approach
Meta's shift to a community-driven fact-checking model raises significant concerns about its ability to combat misinformation while promoting free speech.
Critics worry that without rigorous oversight, platforms could become breeding grounds for false information, further polarising society and eroding trust in democratic processes.
As the world grapples with the challenges of misinformation, Meta's decision to abandon independent fact-checkers may set a worrying precedent, leaving questions about the balance between free expression and the need for factual accuracy.
© Copyright IBTimes 2024. All rights reserved.