D.O.J. Settlement: Facebook Agrees To Eliminate Discriminatory Housing Ads
By Katie Jensen | Jun 23, 2022

The Justice Department alleges that Meta's housing advertising system discriminates against Facebook users based on their race, color, religion, sex, disability, familial status, and national origin.

KEY TAKEAWAYS

- The complaint alleges that Meta uses algorithms to determine which Facebook users receive housing ads, and that those algorithms rely, in part, on characteristics protected under the Fair Housing Act.
- Under the settlement, Meta has agreed to stop using an advertising tool for housing ads that relies on a discriminatory algorithm.

"It is not just housing providers who have a duty to abide by fair housing laws," said Demetria McCain, the principal deputy assistant secretary for Fair Housing and Equal Opportunity at the Department of Housing and Urban Development.

Specifically, the department alleged that:

- Meta enabled and encouraged advertisers to target their housing ads by relying on race, color, religion, sex, disability, familial status, and national origin to decide which Facebook users would be eligible or ineligible to receive housing ads.
- Meta's ad-delivery system uses machine-learning algorithms that rely in part on FHA-protected characteristics - such as race, national origin, and sex - to help determine which subset of an advertiser's targeted audience will actually receive a housing ad.

Meta has until December 2022 to develop a new system for housing ads that addresses disparities for race, ethnicity, and sex between advertisers' targeted audiences and the users to whom the ads are actually delivered.