Tragic schoolgirl Molly Russell liked suicide videos of ‘the most distressing nature’ before she took her own life, an inquest heard today.
The 14-year-old schoolgirl, from Harrow, northwest London, researched self-harming and suicide online before she died in November 2017.
The inquest will examine algorithms used by social media firms to channel content to users and keep them hooked.
Molly’s family also want the inquest to consider 29 internal Meta documents which allegedly set out research into the impact of self-harm and suicide material online on teenagers.
Before showing the footage, coroner Andrew Walker warned the court that the videos were ‘almost impossible to watch.’
Elizabeth Lagone, Meta’s head of health and well-being, arrives at Barnet Coroner’s Court
He said: ‘The video content could be edited, but Molly had no such choice. My view is that the video footage should be played as it stands alone.
‘Be warned, the footage glamorises suicide. It is of the most distressing nature. It is almost impossible to watch.
‘I say this especially to members of Molly’s family, but in my view the video footage ought to be seen.’
Her family decided to stay in the courtroom as the videos were played.
Social media content, all ‘liked’ by Molly before her suicide, showed people falling off buildings, jumping in front of trains and others hanging from a noose.
Some were shown cutting themselves with blades and even shooting themselves in the head.
The words ‘fat’, ‘worthless’ and ‘suicidal’ flashed across the screen between videos over the backdrop of aggressive music.
It was unclear whether some videos took clips from TV dramas and films, or whether they showed real-life events.
On Friday, the head of health and wellbeing at Instagram’s parent company Meta, Elizabeth Lagone, defended the social media platform’s content policies – saying suicide and self-harm material could have been posted by a user as a ‘cry for help’.
Ms Lagone told the court it was an important consideration of the company, even in its policies at the time of Molly’s death, to ‘consider the broad and unbelievable harm that can be done by silencing (a poster’s) struggles’.
Instagram’s guidelines at the time, which were shown to the court, said users were allowed to post content about suicide and self-harm to ‘facilitate the coming together to support’ other users, but not if it ‘encouraged or promoted’ such acts.
Ms Lagone took to the witness box after the videos had been played and said: ‘My starting point is the internet is a very dangerous place for those who enter into it.
‘Every effort should be made to make that journey as safe as possible.’
Asked by the family’s lawyer Oliver Sanders KC whether it was obvious it was not safe for children to see ‘graphic suicide imagery’, the executive said: ‘I don’t know… these are complicated issues.’
Judson Hoffman, global head of community operations at Pinterest, leaves court yesterday
Mr Sanders drew the witness’s attention to experts who had informed Meta it was not safe for children to view the material, before asking: ‘Had they previously told you something different?’
Ms Lagone responded: ‘We have ongoing discussions with them but there are any number of… issues we talk about with them.’
Addressing Molly’s family’s claims about internal research, Ms Lagone told the court she was not aware of any research done by the tech giant into how content affected users of its platforms.
Questioning the Meta executive about research into the impact of self-harm related content on users, Coroner Andrew Walker asked if any internal research had been conducted.
She said: ‘I’m not aware of specific research on the impact of content. That would be very difficult research to undertake with ethical considerations.’
Ms Lagone later added: ‘We are confident our policies do consider the needs of our youngest users.’
On Thursday, Pinterest’s head of community operations, Judson Hoffman, apologised after admitting the platform was ‘not safe’ when the 14-year-old used it.
Mr Hoffman said he ‘deeply regrets’ posts viewed by Molly on Pinterest before her death, saying it was material he would ‘not show to my children’.
The inquest, due to last up to two weeks, continues.