UK may regulate social media to limit harmful content
The British government is warning tech firms to better police their sites or face legal penalties.
A proposed set of laws to be brought before the British Parliament would impose criminal penalties on the bosses of social media sites and tech firms responsible for spreading violent or harmful content.
The laws are proposed in the "Online Harms White Paper," a joint proposal by the Department for Digital, Culture, Media and Sport and the Home Office unveiled Monday. It aims to reduce the spread of harmful content on the Internet, covering content that incites violence or encourages suicide, disinformation, cyberbullying, and children's access to inappropriate material.
Supported by Prime Minister Theresa May, the laws will allow the British government to fine or otherwise penalize these firms if they don't act immediately to remove harmful content such as terrorist material, child pornography and graphic violence. They will also make tech firms liable for publishing fake news.
British government officials say the laws, when approved, will become "world leading laws to make the U.K. the safest place in the world to be online."
"The era of self-regulation for online companies is over," said Jeremy Wright, Secretary of State for Digital, Culture, Media and Sport.
The U.K. will also establish a new regulatory office to enforce the laws. The regulator will be empowered to fine social media companies that fail to protect people from harmful content, block access to offending websites, and hold tech company executives liable for failing to stop the distribution of such content.
Prime Minister Theresa May said that while the Internet could be brilliant at connecting people, it had not done enough to protect users, especially children and young people.
"That is not good enough, and it is time to do things differently," said May. "We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe."
This duty of care will require companies to take more responsibility for their users' safety and to tackle the harm caused by content or activity on their services. The new regulator will set clear standards for online safety.
The proposed laws, however, raise difficult and unprecedented legal questions. They will ask lawmakers to determine whether tech CEOs should be held directly liable for infractions and whether regulators should have the power to restrict access to egregious content.
Facebook said it was looking forward to working with the British government to ensure the new laws are effective, echoing founder Mark Zuckerberg's statement last week that government regulation is needed to ensure a standard approach across platforms.
The new laws, however, raise issues of freedom of the press and freedom of expression.
Any new rule should strike a balance between protecting society and supporting innovation and free speech, said Rebecca Stimson, Facebook's head of UK public policy.
"These are complex issues to get right and we look forward to working with the government and parliament to ensure new regulations are effective," she noted.