Microsoft will reportedly limit companies' access to its AI-based facial recognition tools that infer a subject's gender, emotion, or age.

According to the company, these technologies have shown inconsistencies and bias. The move is part of the firm's AI ethics overhaul.

Microsoft Limits the Use of Facial-Recognition Tools

Microsoft will now refrain from selling facial-recognition programs that infer human attributes such as age, emotion, and gender.

According to a report by Bloomberg, customers who already use the AI-based tools will retain access to the software for another year before it expires.

In line with Microsoft's ethical guidelines on AI, the company plans to revise features covered by its "Responsible AI Standard." Some programs will only be modified, while others will be pulled entirely.

For example, Microsoft's facial recognition technology, better known as Azure Face, is a popular tool among tech firms. One company that relies on it is the ridesharing firm Uber.

Uber uses the software to verify a person's identity. Since Microsoft is now limiting the use of this tool, companies are required to apply before they can access the facial-recognition technology.

In this way, the Redmond corporation can check whether these firms comply with its AI ethics policies. This will also help Microsoft evaluate how a customer intends to use the technology and its impact on society.

Furthermore, the tech giant will also stop selling facial analysis tools that claim to infer emotions and moods from human attributes.

"We collaborated with internal and external researchers to understand the limitations and potential benefits of this technology and navigate the tradeoffs," Microsoft's product manager Sarah Bird said.

Bird added that the company is aware of the technology's impact on end users. She noted that there are still issues to address, including the technology's inaccuracy in reading facial expressions and concerns about people's privacy.


Microsoft Restricts Neural Voice Tool

Apart from limiting the use of AI-based facial detection software, the company is also restricting its neural voice technology, a tool that can be used to create deepfake voices.

According to Microsoft's chief responsible AI officer, Natasha Crampton, the software can "improperly" mimic speakers and deceive listeners, per The Guardian. This poses another risk to users, since the tool could be used to carry out malicious activities.

In other Microsoft-related news, the company is currently offering a special reward for people who want to use Microsoft Edge. As per Tech Times' story, you can receive 330 Minecoins, Minecraft's in-game currency, just by enabling the Bing search engine on the browser for less than a week.


This article is owned by Tech Times

Written by Joseph Henry 
