Instagram has banned memes, drawings, and other content that may push audiences toward self-harm.

After the 2017 suicide of 14-year-old British teen Molly Russell, who had viewed self-harm images on Instagram, the company has taken a series of steps to stop the spread of graphic, suicide-related images on its platform, banning graphic depictions of self-harm, whether real or fictional.

In a blog post last February, Adam Mosseri, Head of Instagram, announced a series of changes to the platform's content policies following a comprehensive review with global experts, including academics specializing in youth, mental health, and suicide prevention. He wrote that Instagram "...will not allow any graphic images of self-harm, such as cutting on Instagram," noted that the platform has never allowed posts that encourage self-harm or suicide, and promised that the company will continue to remove such content when it is reported.

Just this October, he announced that Instagram would bar even more types of self-harm and suicide-related content. According to him, the platform "...will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery," and will "...also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods."

Technology to the Rescue

In the same statement, Mosseri said the company was "focused on getting more resources to people posting and searching for self-harm related content and directing them to organizations that can help." He added that accounts found to be sharing such content will no longer be recommended. It was also in this post that the company mentioned it was looking into sensitivity screens.

Sensitivity screens are a newer Instagram feature that hides or blurs disturbing images, including those depicting self-harm and suicide, until the user chooses to view the content. Back in 2016, Instagram had already introduced a suicide prevention tool, based on the company's belief that the first signs of depression often show up on social media.

The tool lets you report a post from a friend you feel is going through a difficult time. Instagram then sends that friend a message saying that "someone saw one of your posts and thinks you might be going through a difficult time. If you need support, we'd like to help." The friend then gets the option to talk to a friend, contact a helpline, or receive tips and support. Despite this tool, however, Russell still died by suicide.

In a more recent blog post, Mosseri reported that over the last three months, Instagram removed, reduced the visibility of, or added sensitivity screens to around 834,000 pieces of content, 77 percent of which the company found before they were reported. The company's ultimate goal, however, is the total removal of such images.

Although some young people say that sharing photos of healed scars or talking openly about eating disorders helps them when they are struggling, academics and mental health organizations such as the Samaritans in the U.K. and the National Suicide Prevention Lifeline in the U.S. stressed that this benefit must be weighed against the risk of harm, which is why the ban targets triggering content.
