After a British court ruled on Sept. 30 that social media was in fact responsible for Molly Russell's 2017 suicide, platform companies are now being forced to account for how safe their services really are.

The inquest into 14-year-old Russell's death raised pressing questions about social media and its effects on young people.

Over the course of the two-week hearing, it became clear that stricter regulation of the internet is needed.

Meta's Controversial 'Safe' Assessment

The head of health and wellbeing at Meta, the technology company that owns Facebook and Instagram, testified at the inquest last month.

Elizabeth Lagone was shown a selection of the Instagram posts Russell had viewed in the six months before her death and was asked whether each one adhered to Instagram's guidelines at the time.

According to a report by The Guardian, Lagone used phrases such as "sharing of feelings," stating that content related to suicide and self-harm might be permitted if it represented an attempt to raise awareness of a user's mental state or to convey their emotions.

In the end, Lagone deemed the majority of those posts "safe" for children to view, an assessment that many in the North London coroner's courtroom disputed.

The teenager's father, Ian Russell, expressed his disappointment with Lagone's assessment.

"If this demented trail of life-sucking content was 'safe,' my daughter Molly would probably still be alive, and instead of being a bereaved family of four, there would be five of us looking forward to a life full of purpose and promise that lay ahead for our adorable Molly," said Mr. Russell.

Lagone apologized because some of the material Molly had accessed glorified self-harm and therefore violated Instagram's policies at the time.

The Online Safety Bill

The Online Safety Bill is expected to resume its passage through Parliament soon. The newly appointed culture secretary, Michelle Donelan, has confirmed that the bill's provisions for safeguarding minors will be strengthened.

To shield young people from potentially harmful content and systems, the bill imposes a duty of care on technology platforms.

Donelan's immediate predecessor, Nadine Dorries, issued a written statement in July making clear that the act would cover the kind of content Molly was able to view.

According to that statement, content that encourages self-harm or suicide should never be made available to children, while age-appropriate content discussing recovery from suicidal feelings should be allowed.

Following Lagone's testimony, Meta's standards for determining whether content is appropriate will be evaluated by Ofcom, the communications regulator, rather than left to the company itself.

Written by Trisha Kae Andrada
