UK teen died after 'negative effects of online content': coroner
A 14-year-old British girl died from an act of self-harm while suffering from the "negative effects of online content."
A 14-year-old British girl died from an act of self-harm while suffering from the "negative effects of online content", a coroner said Friday, in a case that has shone a spotlight on social media companies.
Molly Russell was "exposed to material that may have influenced her in a negative way and, in addition, what had started as a depression had become a more serious depressive illness," Andrew Walker ruled at North London Coroner's Court.
The teenager "died from an act of self-harm while suffering depression", he said, but added it would not be "safe" to conclude it was suicide.
Some of the content she viewed was "particularly graphic" and "normalised her condition," said Walker.
Of the 16,300 posts Russell saved, shared or liked on Instagram in the six-month period before her death, 2,100 related to depression, self-harm or suicide, the inquest was told.
Russell, from Harrow in northwest London, died in November 2017, leading her family to set up a campaign highlighting the dangers of social media.
"Molly was a thoughtful, sweet-natured, caring, inquisitive, selfless, beautiful individual -- although a few words cannot possibly encapsulate our wonderful girl," her father Ian said in a statement.
"We have heard a senior Meta (Instagram parent company) executive describe this deadly stream of content the platform's algorithms pushed to Molly as 'safe' and not contravening the platform's policies.
"If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive and instead of being a bereaved family of four, there would be five of us looking forward to a life full of purpose and promise that lay ahead for our adorable Molly.
"It's time the toxic corporate culture at the heart of the world's biggest social media platform changed," he urged.
The week-long hearing became heated when the family's lawyer, Oliver Sanders, took a Meta executive to task.
A visibly angry Sanders asked Elizabeth Lagone, the head of health and wellbeing at Meta, why the platform allowed children to use it when it was "allowing people to put potentially harmful content on it".
"You are not a parent, you are just a business in America. You have no right to do that. The children who are opening these accounts don't have the capacity to consent to this," he said.
Lagone apologised after being shown footage, viewed by Russell, that "violated our policies".
In a statement issued following the ruling, a spokeswoman for Meta said: "Our thoughts are with the Russell family and everyone who has been affected by this tragic death.
"We'll continue our work with the world's leading independent experts to help ensure that the changes we make offer the best possible protection and support for teens," she added.
Children's charity NSPCC said the ruling "must be a turning point", stressing that any delay to a government bill dealing with online safety "would be inconceivable to parents".
"Tech companies must be held accountable when they don't make children's safety a priority," tweeted the charity.
NSPCC chief executive Peter Wanless said the ruling should "send shockwaves through Silicon Valley".
"The magnitude of this moment for children everywhere cannot be understated," he added.
© Copyright AFP 2024. All rights reserved.