As part of a settlement with the United States, Facebook has agreed to modify its algorithms to address discrimination in housing advertisements. The agreement also stipulates that Meta Platforms Inc., Facebook's parent company, will submit to judicial oversight, according to the U.S. Justice Department.

U.S. government officials announced in a press release that they had reached a settlement with Meta Platforms Inc. to resolve a related lawsuit that had been filed simultaneously in Manhattan federal court.

Facebook Required To Revise Algorithms After Fair Housing Act Violations

The press release added that the lawsuit is the Justice Department's first case challenging algorithmic discrimination under the Fair Housing Act. Before making its upcoming changes to the current ad targeting and delivery system, Facebook will be required to seek the approval of the U.S. Justice Department.

Facebook welcomed the terms of the agreement, adding that it is already "building a novel machine learning method" to change how its algorithms serve housing ads. According to Ashley Settle, a Facebook spokesperson, the development will "change the way housing ads are delivered to people residing in the U.S. across different demographic groups."

Settle also said the company would apply the same algorithmic framework it had implemented for its new credit- and employment-related ads. U.S. government officials celebrated the agreement, with U.S. Attorney Damian Williams and Assistant Attorney General Kristen Clarke calling the lawsuit a "historic" and "groundbreaking" case.

Is This Facebook's First Violation of the Fair Housing Act?

This is not the first time, however, that Facebook has violated provisions of the Fair Housing Act. According to Williams, Facebook's technology engaged in the same kind of discriminatory advertising that companies have long practiced through more traditional advertising methods.

The Justice Department said that under the terms of the settlement, Facebook must stop using an advertising tool for housing ads that relies on a discriminatory algorithm to find a "lookalike audience" of users based on their traits, interests, behavior, and Facebook activity.

Facebook will need to phase out its "Lookalike Audience" tool by December 31 to comply with the terms of the agreement. According to the U.S. government, the tool was the mechanism by which people were discriminated against on the basis of race, gender, and other factors.

The announcement follows a March 2019 legal settlement with a group that included the American Civil Liberties Union and the National Fair Housing Alliance, in which Facebook agreed to overhaul its ad-targeting systems to prevent discrimination in housing, credit, and employment ads.

The 2019 settlement, according to the Justice Department, reduced the possibility of discriminatory targeting by advertisers. Still, it did not address other issues, such as Facebook's use of machine-learning algorithms to deliver discriminatory housing ads.

Facebook noted that it plans to create a new system over the following six months, focused on resolving ad-based racial discrimination brought about by the personalization algorithms in its current housing ad delivery system.

According to the Justice Department, the settlement agreement may be terminated if the new system proves insufficient. Meta must also pay a fine of over $115,000 under the settlement.
