In a bid to improve its content moderation, Instagram is rolling out a policy update that focuses on the appeals process and a new warning system.
Account Disable Policy
Instagram announced Thursday that it's making changes to its account disable policy. Until now, the way Instagram disabled accounts has been a point of contention. Under its previous policy, the social media company automatically disabled accounts once a certain percentage of their posts violated its guidelines.
This left some banned users caught off guard, finding their accounts disabled without any warning or explanation.
Now, however, the Facebook-owned company will start warning accounts that violate its rules before disabling them entirely.
In the updated policy, users will now first receive an alert showing their history of stories, photos, or comments and explaining why they may have violated Instagram's guidelines. This will also serve as a warning when their accounts are close to being banned.
The policy update also introduces a new appeals process: Instagram will give users a chance to explain themselves or appeal moderation decisions directly through the alert, so they no longer have to go through Instagram's Help Center on the web.
For now, only certain types of content can be appealed under the new policy: deleted content flagged for hate speech, nudity and pornography, harassment, or bullying. Instagram says it will expand the appeals process to cover more categories in the future.
Other Instagram Updates
Recently, Instagram also started testing other features aimed at addressing bullying on its platform. Reports point to a feature that will use artificial intelligence to flag hateful comments. A new Restrict mode is also in the works. Similar to blocking someone, Restrict mode will let users filter comments from anyone they restrict. It will also hide their account activity from those they restrict, among other things.