Social media giants face political backlash from UK and France for not policing terror content
New counter-terrorism drive comes on the eve of talks between Theresa May and Emmanuel Macron.
UK Prime Minister Theresa May and French President Emmanuel Macron have teamed up in a bid to tackle online radicalisation, Downing Street announced on Monday 12 June.
News of the plan, which could see the likes of Facebook, Twitter and YouTube face fines if they fail to remove "unacceptable content", comes on the eve of May's visit to Paris on Tuesday.
"The counter-terrorism cooperation between British and French intelligence agencies is already strong, but President Macron and I agree that more should be done to tackle the terrorist threat online," the Conservative premier said.
"In the UK we are already working with social media companies to halt the spread of extremist material and poisonous propaganda that is warping young minds."
The initiative follows three terror attacks in the UK – the Westminster Bridge knife and car-ramming attack in March, the Manchester suicide bombing in May and the London Bridge van and knife rampage in June – which left 35 people dead in total and dozens badly injured.
France faced its most recent suspected terror attack just over a week ago, when suspect Farid Ikken, 40, was shot as he allegedly attempted to attack a policeman with a hammer near Notre-Dame Cathedral.
Ikken, who has been charged with attempted murder, reportedly radicalised himself online. London Bridge attack ringleader Khuram Butt, 27, who was shot dead within eight minutes of launching his attack, was inspired by watching pro-jihadi videos on YouTube, a cousin of Butt's wife, Fahad Khan, has claimed.
Number 10 said the plans between the French and UK governments include exploring the possibility of creating a new legal liability for tech companies if they fail to remove content.
The move could include penalties such as fines for companies that fail to take action. The politicians also want to work with technology firms to develop tools to identify and remove harmful material automatically.
'Failure to act is a disgrace'
The Home Secretary Amber Rudd and the French Interior Minister Gérard Collomb will meet in the coming days to drive forward the plan, Downing Street said.
Bosses from Facebook, Twitter and Google faced the cross-party Home Affairs Select Committee in March.
The group of MPs, led by Labour's Yvette Cooper, concluded that the companies are "shamefully far" from tackling illegal and dangerous content.
"Social media companies' failure to deal with illegal and dangerous material online is a disgrace. They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse," Cooper said.
"Yet repeatedly they have failed to do so. It is shameful. These are among the biggest, richest and cleverest companies in the world, and their services have become a crucial part of people's lives. This isn't beyond them to solve, yet they are failing to do so.
"They continue to operate as platforms for hatred and extremism without even taking basic steps to make sure they can quickly stop illegal material, properly enforce their own community standards, or keep people safe."
A German minister announced in April that social media firms could face large fines if they fail to remove illegal content, hate speech or fake news. The draft legislation, if passed, could see the companies have to pay up to €50m ($63m) if they do not remove the offensive content within a week.
© Copyright IBTimes 2024. All rights reserved.