
D.O.J. Settlement: Facebook Agrees To Eliminate Discriminatory Housing Ads

Jun 23, 2022
Staff Writer

Meta’s housing advertising system discriminates against Facebook users based on their race, color, religion, sex, disability, familial status, and national origin.

KEY TAKEAWAYS
  • The complaint alleges that Meta uses algorithms in determining which Facebook users receive housing ads, and that those algorithms rely, in part, on characteristics protected under the Fair Housing Act (FHA).
  • This is the DOJ’s first case challenging algorithmic bias under the FHA.
  • Meta Platforms has agreed to eliminate features in its advertising business that allow landlords, employers, and credit agencies to discriminate against groups of people protected by federal civil rights laws.
  • Facebook was excluding groups such as African Americans, mothers of high school kids, people interested in wheelchair ramps and Muslims from seeing advertisements.

The Department of Justice secured a groundbreaking settlement agreement with Meta Platforms, formerly known as Facebook, to resolve allegations of discriminatory advertising. 

The proposed agreement, announced June 21, resolves a lawsuit filed in the U.S. District Court for the Southern District of New York alleging that Meta’s housing advertising system discriminates against Facebook users based on their race, color, religion, sex, disability, familial status, and national origin. The settlement will not go into effect until approved by the court.

The complaint alleges that Meta uses algorithms in determining which Facebook users receive housing ads, and that those algorithms rely, in part, on characteristics protected under the Fair Housing Act (FHA). This is the DOJ’s first case challenging algorithmic bias under the FHA.

This settlement also marks the first time that Meta will be subject to court oversight for its ad targeting and delivery system.

Meta Platforms has agreed to eliminate features in its advertising business that allow landlords, employers, and credit agencies to discriminate against groups of people protected by federal civil rights laws. 

ProPublica investigated the matter for nearly six years, and was the first to reveal that Facebook let housing marketers exclude African Americans and others from seeing some of their advertisements. Meanwhile, federal law prohibits housing, employment, and credit discrimination based on race, religion, gender, family status, and disability. 

The settlement resolves a lawsuit that grew out of a discrimination charge first brought against the company during the Trump administration, alleging that Meta’s ad-targeting system violated the Fair Housing Act. 

Under the settlement, Meta has agreed to stop using an advertising tool for housing ads, known as the “Special Ad Audience” tool, that relies on a discriminatory algorithm. Meta also will develop a new system to address racial and other disparities caused by its personalized, algorithmic targeting tools, and will modify its delivery system for housing ads in response to the lawsuit. 

“As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division. “This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit. The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities.” 

ProPublica reported the potential for advertising discrimination in 2016, but a year later Facebook was still excluding groups such as African Americans, mothers of high school kids, people interested in wheelchair ramps, and Muslims from seeing advertisements. It was also possible to target ads to anti-Semites with options such as “How To Burn Jews” and “Hitler Did Nothing Wrong.” Later, companies were also found posting employment ads that women and older workers could not see. 

“It is not just housing providers who have a duty to abide by fair housing laws,” said Demetria McCain, the principal deputy assistant secretary for Fair Housing and Equal Opportunity at the Department of Housing and Urban Development (HUD). “Parties who discriminate in the housing market, including those engaging in algorithmic bias, must be held accountable. This type of behavior hurts us all. HUD appreciates its continued partnership with the Department of Justice as they seek to uphold our country’s civil rights laws.”

Specifically, the department alleged that:

  • Meta enabled and encouraged advertisers to target their housing ads by relying on race, color, religion, sex, disability, familial status, and national origin to decide which Facebook users will be eligible and ineligible to receive housing ads.
  • Meta created an ad-targeting tool known as “Lookalike Audience” or “Special Ad Audience.” The tool uses a machine-learning algorithm to find Facebook users who share similarities with groups of individuals selected by an advertiser using several options provided by Facebook. Facebook has allowed its algorithm to consider FHA-protected characteristics — including race, religion, and sex — in finding Facebook users who “look like” the advertiser’s source audience and thus are eligible to receive housing ads.
  • Meta’s ad-delivery system uses machine-learning algorithms that rely in part on FHA-protected characteristics — such as race, national origin, and sex — to help determine which subset of an advertiser’s targeted audience will actually receive a housing ad.

Meta has until December 2022 to develop a new system for housing ads that addresses disparities for race, ethnicity, and sex between advertisers’ targeted audiences and the users to whom the ads are actually delivered. The parties will select an independent third-party reviewer to investigate and verify on an ongoing basis whether the new system is meeting the compliance standards agreed to by the parties. 

Under the settlement, Meta must also pay a civil penalty of $115,054, the maximum penalty available under the Fair Housing Act.

“Because of this ground-breaking lawsuit, Meta will — for the first time — change its ad delivery system to address algorithmic discrimination. But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation,” said U.S. Attorney Damian Williams for the Southern District of New York.

“As part of this settlement, we will be building a novel machine learning method within our ads system that will change the way housing ads are delivered to people residing in the US across different demographic groups. While HUD raised concerns about personalized housing ads specifically, we also plan to use this method for ads related to employment and credit in the US,” said a spokesperson for Meta. “This type of work is unprecedented in the advertising industry and represents a significant technological advancement for how machine learning is used to deliver personalized ads. We are excited to pioneer this effort.”

About the author
Katie Jensen is a staff writer at NMP.