The U.S. Department of Justice announced that it reached a settlement with Meta Platforms (formerly known as Facebook) resolving allegations that the company engaged in discriminatory advertising in violation of the Fair Housing Act.  

In its first case challenging algorithmic bias under the FHA, the DOJ alleged that three aspects of Facebook's ad targeting and delivery system were discriminatory.  Specifically, the complaint alleged that:

  • Facebook enabled and encouraged advertisers to target their housing ads by relying on race, color, religion, sex, disability, familial status, and national origin to decide which Facebook users are eligible, and which are ineligible, to receive housing ads; 
  • Facebook created an ad targeting tool known as "Lookalike Audience" or "Special Ad Audience," which uses a machine-learning algorithm to find Facebook users who share similarities with groups of individuals selected by an advertiser, and that this algorithm considers FHA-protected characteristics, such as race, religion, and sex; and
  • Facebook's ad delivery system uses machine-learning algorithms that rely in part on protected characteristics, such as race, national origin, and sex, to help determine who will receive housing ads. 
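The lookalike mechanism the complaint describes can be illustrated with a toy sketch. This is not Meta's actual system: the feature names, the sample data, and the simple match-counting similarity below are all hypothetical, chosen only to show how including FHA-protected characteristics among the matching features can change who is selected to receive a housing ad.

```python
def similarity(user, seed, features):
    """Fraction of the chosen features on which the user matches the seed profile."""
    return sum(user[f] == seed[f] for f in features) / len(features)

def lookalike_audience(users, seed, features, k):
    """Return the k users most similar to the seed profile (ties broken by id)."""
    ranked = sorted(users, key=lambda u: (-similarity(u, seed, features), u["id"]))
    return ranked[:k]

# Hypothetical users and advertiser "seed" profile.
users = [
    {"id": 1, "interest": "home_decor", "sex": "F", "familial_status": "no_children"},
    {"id": 2, "interest": "sports",     "sex": "F", "familial_status": "no_children"},
    {"id": 3, "interest": "home_decor", "sex": "M", "familial_status": "children"},
]
seed = {"interest": "home_decor", "sex": "F", "familial_status": "no_children"}

# With protected characteristics among the features, user 2 (who shares only
# the seed's sex and familial status) outranks user 3 (who shares the seed's
# actual housing-related interest).
with_protected = lookalike_audience(
    users, seed, ["interest", "sex", "familial_status"], k=2)

# Dropping the protected characteristics ranks on interest alone.
without_protected = lookalike_audience(users, seed, ["interest"], k=2)
```

In this toy setup, the audience built with protected characteristics is users 1 and 2, while the audience built without them is users 1 and 3 — the same seed yields a different, demographically skewed selection solely because of which features the algorithm was allowed to consider.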

In a statement, Assistant Attorney General Kristen Clarke said, “As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner.  This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit.  The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities.”  

As part of the settlement, which must still be approved by the court, Facebook agreed to stop using its "Special Ad Audience" tool by the end of the year and to develop a new system for housing ads that addresses disparities by race, ethnicity, and sex.  Significantly, the new system is subject to government approval: if the DOJ concludes that it does not adequately address those disparities, the settlement agreement will terminate and the government will be free to proceed with its lawsuit against the company.  Facebook also agreed that it won't offer housing advertisers any targeting options that describe or relate to FHA-protected characteristics.  The settlement also includes ongoing monitoring and notification provisions, as well as a civil penalty of $115,054, the maximum available under the FHA.