Imgur, the popular image hosting and sharing platform, announced it would remove explicit NSFW (Not Safe for Work) images starting next month.

The new Terms of Service, which will take effect on May 15, 2023, will also target the removal of old, unused, and inactive content that is not associated with a user account.

(Photo: Alan Schaaf of Imgur on the PandaConf Stage during day two of Web Summit 2018 at the Altice Arena in Lisbon, Portugal. Credit: Diarmuid Greene/Web Summit via Getty Images)

Automated Detection Software

Imgur clarified that removing explicit content would not affect its Community Rules, and content in the public gallery will remain unchanged. However, the platform will rely on automated detection software to identify explicit content.

Imgur will continue to use human moderators alongside automated detection to keep the process efficient and to ensure that content submitted to the public gallery remains subject to human review.

The mix of automated detection and human review is expected to minimize any negative impact this change may have on the Imgur community. Users are still encouraged to report and flag any content they believe should be reviewed by the moderation team.

Imgur's previous policies caused confusion among users, as its Community Rules applied to all community interactions on the site, while the Terms of Service applied to all content on the platform.

The policy change is intended to make Imgur's policies clearer and more consistent across the platform. The platform has decided to disallow explicit and illegal content, which it says poses a risk to the Imgur community and its business.

However, Imgur is still calibrating its automated detection, which means some content previously allowed under "artistic exceptions" may no longer be permitted in the future.

How the Changes Will Be Implemented

In order to provide transparency for its users, Imgur has specified how the upcoming changes will be implemented.

If a user who is signed in to their account uploads content that violates the new Terms of Service, they will receive a notification, and the content will be deleted automatically. Once deleted, the content will no longer be viewable through a direct link or in the post where it was originally created. 

If the detected violation is found in only a portion of a post that contains other non-violating content, only the flagged content will be removed while the rest of the post remains visible. 

The platform has emphasized that no account warnings or suspensions will be issued as a result of these automated flags. It hopes that these changes will provide greater clarity and consistency in its policies while also ensuring the safety and prosperity of the community.

"We understand that these changes may be disruptive to Imgurians who have used Imgur to store their images and artwork," Imgur writes in its new Terms of Service.

"These changes are an important step in Imgur's continued efforts to remain a safe and fun space on the internet. As always, we welcome and value your feedback to help us refine our efforts toward this goal."
