By Larry Neumeister,
The Associated Press

Facebook will change its algorithms to prevent discriminatory housing advertising and its parent company will subject itself to court oversight to settle a lawsuit brought by the U.S. Department of Justice on June 21.

In a release, U.S. government officials said they had reached an agreement with Meta Platforms Inc., formerly known as Facebook Inc., to settle the lawsuit filed simultaneously in Manhattan federal court.

According to the release, it was the Justice Department's first case challenging algorithmic discrimination under the Fair Housing Act. Facebook will now be subject to Justice Department approval and court oversight for its ad targeting and delivery system.

U.S. Attorney Damian Williams called the lawsuit "groundbreaking." Assistant Attorney General Kristen Clarke called it "historic."

Ashley Settle, a Facebook spokesperson, said in an email that the company was "building a novel machine learning method within our ads system that will change the way housing ads are delivered to people residing in the U.S. across different demographic groups."

She said the company would extend its new method for ads related to employment and credit in the U.S.

"We are excited to pioneer this effort," Settle added in an email.

Williams said Facebook's technology has in the past violated the Fair Housing Act online "just as when companies engage in discriminatory advertising using more traditional advertising methods."

Clarke said "companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner."

According to terms of the settlement, Facebook will stop using an advertising tool for housing ads that the Justice Department said employed a discriminatory algorithm to locate users who "look like" other users based on characteristics protected by the Fair Housing Act. By Dec. 31, Facebook must stop using the tool once called "Lookalike Audience," which relies on an algorithm that the U.S. said discriminates on the basis of race, sex and other characteristics.

Facebook also will develop a new system over the next half-year to address racial and other disparities caused by its use of personalization algorithms in its delivery system for housing ads, it said.

If the new system is inadequate, the settlement agreement can be terminated, the Justice Department said. Per the settlement, Meta also must pay a penalty of just over $115,000.

The announcement comes after Facebook already agreed in March 2019 to overhaul its ad-targeting systems to prevent discrimination in housing, credit and employment ads as part of a legal settlement with a group including the American Civil Liberties Union, the National Fair Housing Alliance and others.

The changes announced then were designed so that advertisers who wanted to run housing, employment or credit ads would no longer be allowed to target people by age, gender or zip code.

The Justice Department said Tuesday that the 2019 settlement reduced the potentially discriminatory targeting options available to advertisers but failed to resolve other problems, including Facebookโ€™s discriminatory delivery of housing ads through machine-learning algorithms.