Facebook has rolled out an artificial intelligence-powered suicide prevention program globally, as part of the social network's ongoing efforts to build a safe community for its users.

The program is a more advanced version of the suicide prevention tools Facebook rolled out last year, using AI to identify users who may be at risk and connect them with help.

Facebook AI-Powered Suicide Prevention Program

In a Facebook Newsroom post, the social network's VP of Product Management, Guy Rosen, discussed Facebook's next steps in its suicide prevention program.

Facebook started testing AI to improve its suicide prevention efforts in March, initially limiting the tests to the United States. Rosen, however, said that Facebook is now expanding the program to the rest of the world, except for the European Union, where data privacy laws prevent the technology from being used.

As mentioned previously, the social network's AI-powered suicide prevention program uses pattern recognition technology to identify Facebook posts and Facebook Live videos in which users express suicidal thoughts. The tool also picks up signals from comments on the post or video, such as friends asking, "Are you OK?"
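
Facebook has not detailed the model itself, but the description above amounts to scoring a post's own text alongside signals from its comments. The Python sketch below is a hypothetical, heavily simplified illustration of that idea; the phrase lists, weights, and function names are invented for the example and do not reflect Facebook's actual system.

```python
# Hypothetical sketch: combine a post's own text with signals from its
# comments into a single risk score. Phrase lists and weights are invented.

POST_PHRASES = {"want to end it": 0.9, "can't go on": 0.7, "goodbye everyone": 0.5}
COMMENT_PHRASES = {"are you ok": 0.4, "please don't": 0.6, "call me": 0.3}


def phrase_score(text: str, phrases: dict) -> float:
    """Return the weight of the strongest matching phrase in the text."""
    lowered = text.lower()
    return max((w for p, w in phrases.items() if p in lowered), default=0.0)


def risk_score(post_text: str, comments: list) -> float:
    """Weight the post itself more heavily than reactions from friends."""
    post = phrase_score(post_text, POST_PHRASES)
    comment = max((phrase_score(c, COMMENT_PHRASES) for c in comments), default=0.0)
    return min(1.0, 0.7 * post + 0.3 * comment)


if __name__ == "__main__":
    score = risk_score(
        "I feel like I can't go on anymore",
        ["Are you OK? Please call me."],
    )
    print(f"risk score: {score:.2f}")  # 0.61 with the weights above
```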

Once the system identifies a Facebook user who may be having suicidal thoughts, the post or video is routed to a dedicated suicide and self-harm review team hired by the social network, which determines whether intervention is necessary. Facebook is also working to improve the accuracy of the tool so that the team is not flooded with false positives.
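
One plausible way to keep reviewers from being flooded is to gate flagged items behind a confidence threshold before they enter the review queue. The sketch below assumes a risk score between 0 and 1 from the previous step; the threshold value and data structures are illustrative assumptions, not Facebook's implementation.

```python
# Hypothetical sketch: only posts whose risk score clears a threshold reach
# the human review queue. Threshold and data shapes are assumptions.

from dataclasses import dataclass


@dataclass
class FlaggedPost:
    post_id: str
    score: float  # 0.0-1.0 output of the risk model


REVIEW_THRESHOLD = 0.6  # raising this cuts false positives at the cost of recall


def route_to_reviewers(candidates):
    """Keep only the candidates confident enough to warrant human review."""
    return [c for c in candidates if c.score >= REVIEW_THRESHOLD]


queue = route_to_reviewers([
    FlaggedPost("post-1", 0.82),
    FlaggedPost("post-2", 0.35),  # dropped as a likely false positive
])
print([p.post_id for p in queue])  # ['post-1']
```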

The review team can then contact first responders to provide help to the identified Facebook user.

Help Facebook Prevent Suicides

Before AI was integrated into the tools, Facebook's suicide prevention program relied on reports from users who sensed that one of their friends needed intervention. With the program now powered by AI, the time lost waiting for other users to report possible suicidal thoughts in posts or videos is eliminated. For people who need help, those seconds might make a huge difference.

In addition, with AI now powering the system, fewer cases are likely to go undetected, as posts and videos can be flagged for suicidal thoughts even if none of the user's friends notice.

However, even with AI on board, users can still report to Facebook if one of their friends appears to be having suicidal thoughts. The review team goes through each report as quickly as it can and prioritizes the most serious cases.
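
Prioritizing the most serious cases is, in effect, a triage queue ordered by severity. A minimal sketch of that idea, with invented severity values and report names, might look like this:

```python
# Hypothetical sketch: triage reports so the most serious cases come first.
# Severity values and report IDs are invented for the example.

import heapq

reports = []
# heapq is a min-heap, so severities are negated to pop the highest first.
heapq.heappush(reports, (-0.9, "report-17"))  # live video, explicit statement
heapq.heappush(reports, (-0.4, "report-18"))  # ambiguous wording
heapq.heappush(reports, (-0.7, "report-19"))  # flagged by several friends

while reports:
    severity, report_id = heapq.heappop(reports)
    print(f"review {report_id} (severity {-severity:.1f})")
# review report-17 (severity 0.9)
# review report-19 (severity 0.7)
# review report-18 (severity 0.4)
```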
