Sorry Facebook, but your AI news algorithms are clueless and need fixing
Facebook Trending has hit a new low, promoting content that isn't relevant and is sometimes actually fake.
Facebook recently replaced all the human editors who selected articles for its Trending section with an artificially intelligent algorithm that chooses news topics by looking at which subjects are being most commonly discussed and shared on the social network – and it already looks like a big mistake.
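Facebook has not said exactly how the new system works, so purely for illustration, here is a minimal sketch – with invented data, a made-up pick_trending function and the assumption that raw mention and share counts are the only ranking signal – of why a selector driven by engagement volume alone will happily surface a viral hoax above genuine news.

```python
from collections import Counter

# Hypothetical, simplified sketch of an engagement-only trending selector.
# This is NOT Facebook's actual algorithm; it only illustrates the failure
# mode described in this article: ranking topics purely by how often they
# are discussed and shared, with no editorial or accuracy signal.

def pick_trending(posts, top_n=10):
    """Return the top_n topics ranked by raw engagement across recent posts.

    `posts` is a list of dicts like {"topic": "Celebrity hoax", "shares": 1200}.
    """
    scores = Counter()
    for post in posts:
        # Engagement volume is the only signal: one mention plus its shares.
        scores[post["topic"]] += 1 + post.get("shares", 0)
    return [topic for topic, _ in scores.most_common(top_n)]

# A fabricated story that goes viral will outrank a real but less-shared one.
sample_posts = [
    {"topic": "Celebrity hoax story", "shares": 50_000},
    {"topic": "Middle East news", "shares": 8_000},
    {"topic": "Viral fast-food video", "shares": 30_000},
]
print(pick_trending(sample_posts, top_n=3))
# -> ['Celebrity hoax story', 'Viral fast-food video', 'Middle East news']
```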
In a lengthy blog post on Friday 26 August, Facebook announced that it would change how its Trending section is run for the first time since the feature was introduced in January 2014, making it "more automated" and removing the need for "people to write descriptions for trending topics".
However, as we all know (and as Facebook itself has admitted), the real reason Facebook got rid of the journalists who chose stories for the Trending section is the continued accusations of political bias affecting what sorts of stories are shown to users.
But putting a robot in charge of selecting the news, while it might seem cool and hip, is a bad idea. Already, during the algorithm's first three days in operation over the weekend, numerous news reports highlighted that Facebook's computers were selecting fake stories, such as false news about Fox News host Megyn Kelly being fired from the TV network for backing Hillary Clinton.
The robot also made highly indelicate news selections, such as choosing to highlight an article about a viral video showing a man masturbating with a McChicken burger from McDonald's, as well as news about a Saturday Night Live comedian using a four-letter profanity live on TV to diss a right-wing US political commentator called Ann Coulter.
Facebook, what are you doing?
No offence, Facebook, but if you've been listening to all that talk about robots allegedly taking over our jobs, then you should really know better. As the Darpa Robotics Challenge last year clearly showed, if robots can't even open doors and walk through doorways without falling down, then they won't be replacing humans anytime soon.
Similarly, the robot is clearly clueless if it thinks that the most important topics on people's minds worldwide (judging by Facebook at 12pm BST on 30 August) are the fact that the Pokemon Eevee is now available as a plushie from Build-a-Bear Workshop, or that some TV cook called Joe Wicks recommends his own recipe for grilled salmon, when so much is going on every week, from natural disasters to terrorism in the Middle East to political machinations across the world.
A flick back through the algorithm's trending story choices over the weekend, captured by irate Twitter users, shows a similar lack of awareness of anything remotely news-related. A trending topic about actress Mila Kunis links through to a completely un-timely article entitled "13 rare facts about That 70s Show you'll hear for the first time", while the topic "Watch Dogs 2" links through to an absolute gem from a clickbait site full of native advertising, entitled "Watch this dog's reaction when this man gets home after two years abroad!"
Seriously, Facebook, what are you doing? Hilariously, Facebook maintains in its blog post that "there are still people involved in this process to ensure that the topics that appear in Trending remain high-quality", but whoever these people are, they must be sleeping on the job – if they exist at all.
Trending could eventually be scrapped altogether
Facebook is failing its users by taking away the Trending feature's most useful aspect – the editor-written summary line that told you what the news was at a glance, without needing to trawl through multiple news sources. The new simplified topics don't tell you what the news actually is, just that a topic is apparently worth your attention – sometimes when there is no real news behind it at all.
Add to that the rumours that this might just be a last-ditch attempt by Facebook to demonstrate a lack of political bias before it pulls the feature from the website completely, according to ex-members of the Trending team who spoke anonymously to media industry news site Digiday.
"The topics are just wrong – they have bad articles and insufficient sources. I think they are just going to get rid of the product altogether, because there is going to be backlash when people who do use the tool realise that the quality has gone down – unless there are severe algorithmic changes that improve the quality of the topics," the former Facebook trending editor said.
"I expected that we were going to get laid off and had already started applying elsewhere about a month and a half ago. You know how it looks like now? With just a simplified topic and the number of people talking about it? We saw that before anybody else did, and a few of us put two and two together and figured that it was probably how it was going to look like; otherwise, they wouldn't be testing it on Facebook employees."