Facebook has announced that it will improve how the social media platform handles content relating to suicide and self-harm.

After several months of consultations with mental health experts from around the world, the company has decided to tighten policies around posting sensitive images to protect vulnerable users and expand available resources for those who are in need. The changes were announced on World Suicide Prevention Day, Sept. 10.

"As a global online community, keeping people safe on our apps is incredibly important to us," wrote Facebook's Global Head of Safety Antigone Davis.

Safer Facebook

Under Facebook's new policy, users are no longer allowed to post graphic images of cutting, a measure intended to avoid triggering self-harm. Similar photos will also become more difficult to find on Instagram.

Meanwhile, images of healed cuts, while still permitted, will be placed behind a sensitivity screen to prevent the site from unintentionally promoting self-harm.

The social media site is also addressing how eating disorders are presented online. Facebook said that users are prohibited from posting content that promotes eating disorders, including instructions for drastic weight loss and depictions of ribs, collarbones, thigh gaps, concave stomachs, or protruding spines shared alongside related keywords.

The site will, however, continue to send resources to users who post content related to self-harm, eating disorders, and suicide.

From April to June this year, Davis said, Facebook took action on more than 1.5 million pieces of content relating to suicide and self-injury. Around 95 percent of the posts were found before they were reported by other users.

Meanwhile, Instagram either removed or added sensitivity screens to more than 800,000 posts during the same period, with 77 percent of that content identified before the app received a report.

As part of its efforts to make Facebook a safer space for those struggling with mental health issues, the company will hire a health and well-being expert to help improve the platform's safety policies.

Providing Support For Those In Need

Facebook will also add Orygen's #chatsafe guidelines to the site's Safety Center. The #chatsafe guidelines were developed to support people responding to someone who posts suicide- or self-harm-related content, as well as those who want to share their own experiences with suicidal thoughts and behaviors.

The popular social media site has been making improvements to help prevent suicide and self-harm for some time. In 2017, it rolled out artificial intelligence-based suicide and self-harm prevention tools to respond immediately to those in need.
