Instagram users, heads up: you might need to prove to IG that you're not a robot in order to access your account.

Instagram logo
(Photo : KIRILL KUDRYAVTSEV/AFP via Getty Images)
A picture taken on October 18, 2021 in Moscow shows the US social network Instagram's logo on a tablet screen.

The Verge reports that the image-sharing social media giant has been asking some users to take video selfies to verify their identities. Only then, allegedly, can they use their accounts as normal.

Several users have taken to Twitter to post about it, and they're not at all happy. One of the people affected by the new Instagram requirement is Matt Navarra, a social media consultant:

In his tweet, Navarra calls out Instagram (which is owned by Mark Zuckerberg's Meta) over the requirement, claiming that it goes completely against the company's promise not to collect biometric data from its users.

The original report comes from XDA Developers, which says that Instagram allegedly began testing the feature last year. The rollout stalled at first due to technical issues, but the feature seems to have been implemented now.

Instagram also promises that the video selfies used to verify user accounts will never be visible on the platform and will be deleted after 30 days.

The social media giant also claims that it is neither using its facial recognition tech on the selfies nor collecting biometric data from users. Needless to say, however, some people are livid:


There's no word on whether this is an official policy change or whether IG/Meta is only doing a slow rollout. Meta had also not provided an official statement on the matter at the time of this writing.

Read also: Instagram Rolls Out Collabs, a Feature that Lets Two Users Share a Post

Instagram's MASSIVE Bot Problem

One could say that Instagram implemented video selfie verification to combat the ever-increasing number of bots on its network. And one would be right.

IG has long had a bot problem, and it's not exactly a secret. According to Business2Community, there are an estimated 95 million bot accounts on the image-sharing network, comprising almost 10% of the social media giant's 1 billion users.

Instagram event
(Photo : Justin Sullivan/Getty Images)
MENLO PARK, CA - JUNE 20: An attendee takes a photo of the instagram logo during a press event at Facebook headquarters on June 20, 2013 in Menlo Park, California. Facebook announced that its photo-sharing subsidiary Instagram will now allow users to take and share video.

Instagram's bot problem runs deep enough to be a topic for another day. But one thing you should know is that bots are widely considered a blight on the network, and the company has been trying to fight them off for years.

Meta Going Back On Its Promises?

Instagram's parent company Meta (back when it was still known as Facebook) has already gotten in trouble for allegedly collecting biometric information without users' knowledge. The same goes for its deployment of facial recognition technology.

In January, the company was ordered to pay eligible users $350 each if it could be proven that their biometric data was collected without their permission. This settlement, however, applied only to residents of the state of Illinois.

If Instagram plans to keep this video selfie verification requirement going, it is likely headed for another scrap with its users, and possibly with the authorities as well.

This is a developing story, so check back here at Tech Times for updates.

Related: Instagram Subscription Feature for Creators to Roll Out Soon As Seen on Apple App Store-What to Expect

This article is owned by Tech Times

Written by RJ Pierce

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.