
After Years of Perpetuating Housing Discrimination, Facebook Gets $115,055 Fine

Facebook is no longer allowed to use biased algorithms to target discriminatory housing ads under a settlement with the DOJ.
Image: Getty Images

Facebook agreed to stop discriminatory targeted advertising practices on Tuesday in a settlement with the Department of Justice (DOJ). The DOJ alleged that the company’s housing ad algorithms violated the Fair Housing Act (FHA), which prohibits housing discrimination on the basis of race, sex, disability, familial status, and national origin. 

Under the settlement, which still needs to be approved by a judge in the Southern District of New York, where a lawsuit against the company was originally filed, Meta is required to develop a new system for housing ads by the end of 2022 and pay a fine of $115,055, the maximum penalty available under the Fair Housing Act. 


The fine itself is less than a slap on the wrist for one of the world's largest companies, and is less than Facebook pays an entry-level employee annually. But the settlement is notable because it is the first time a company has been sanctioned for algorithmic bias under the Fair Housing Act. 

The settlement also means that Facebook can no longer use the specific ad-targeting algorithm at issue, which included (or specifically excluded) people based in part on protected characteristics. 

According to the DOJ’s press release, “This settlement marks the first time that Meta will be subject to court oversight for its ad targeting and delivery system.” 

“The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities,” Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division said in the same statement.

This settlement is the latest in a series of actions against Facebook over its discriminatory ad systems. In 2018, the U.S. Department of Housing and Urban Development (HUD) filed a complaint against Facebook, alleging that it violated the Fair Housing Act by allowing advertisers to exclude users by gender, disability, religion, and ZIP code. 


A 2016 ProPublica investigation found that Facebook gave advertisers “the ability to exclude specific groups it calls ‘Ethnic Affinities.’” In an example ad, ProPublica was able to exclude anyone with “an ‘affinity’ for African-American, Asian-American or Hispanic people.” This was and is a highly controversial practice, and it contributes to "digital redlining," which replicates the redlining practices historically used to segregate Black communities.

Though then-COO Sheryl Sandberg said in a 2017 statement that Facebook would disable the “Ethnic Affinities” feature, the company's advertising system continued to exhibit discriminatory behavior. In 2019, HUD followed up on its 2018 complaint by suing Facebook, alleging that it “unlawfully discriminates based on race, color, national origin, religion, familial status, sex, and disability by restricting who can view housing-related ads on Facebook's platforms and across the internet.” 


In the settlement released on Tuesday, the DOJ detailed three aspects of Facebook’s ad delivery system that it says violate the Fair Housing Act: “Trait-Based Targeting,” which encourages advertisers to include or exclude Facebook users based on FHA-protected characteristics; “‘Lookalike’ Targeting,” an algorithm designed to find users who “look like” an advertiser’s “source audience”; and “Delivery Determinations,” an algorithm that determines which subset of the targeted audience will actually be shown the ad.
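To see why a lookalike system can run afoul of the FHA even when advertisers never select a protected trait, consider the minimal sketch below. The data, feature names, and group labels are all invented for illustration: a source audience that skews toward one demographic group pulls the "similar" audience toward the same group through correlated but facially neutral signals, the proxy effect at the heart of digital-redlining concerns.

```python
# Hypothetical illustration (not Facebook's code): a "lookalike" selector
# inherits the demographic skew of its source audience even though the
# protected attribute is never used as a targeting input.
import numpy as np

rng = np.random.default_rng(0)

# Simulated users: feature vectors built only from "neutral" signals
# (think ZIP-code one-hots, page interests). Group membership is a hidden
# attribute that happens to correlate with those signals.
n_users = 10_000
group = rng.integers(0, 2, n_users)                      # hidden protected attribute
features = rng.normal(loc=group[:, None] * 0.8, scale=1.0, size=(n_users, 16))

# Advertiser-supplied source audience: heavily skewed toward group 0.
source_idx = rng.choice(np.where(group == 0)[0], size=200, replace=False)
source_centroid = features[source_idx].mean(axis=0)

def cosine(a, b):
    """Cosine similarity between each row of a and the vector b."""
    return a @ b / (np.linalg.norm(a, axis=-1) * np.linalg.norm(b) + 1e-9)

# "Lookalike" step: keep the 1,000 users most similar to the source centroid.
scores = cosine(features, source_centroid)
lookalike_idx = np.argsort(-scores)[:1000]

print("share of group 1 overall:      ", group.mean())
print("share of group 1 in lookalikes:", group[lookalike_idx].mean())
# The lookalike audience mirrors the source audience's skew toward group 0,
# even though group membership was never an explicit input.
```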

Following the announcement of the settlement, Meta published a blog post detailing the changes it plans to make to its ad system. Meta said it has been working with HUD to develop a new machine learning algorithm that will “ensure the age, gender and estimated race or ethnicity of a housing ad’s overall audience matches the age, gender, and estimated race or ethnicity mix of the population eligible to see that ad.” 
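As a rough illustration of the parity check Meta describes, and not its actual system, the hypothetical snippet below compares the demographic mix of users an ad was delivered to against the mix of everyone eligible to see it, and flags groups whose share drifts beyond a tolerance. The function names, toy data, and tolerance value are all assumptions made for the example.

```python
# Hypothetical sketch of the audience-parity measurement described above.
# This is not Meta's variance reduction system, only an illustration of the
# comparison it is described as performing.
from collections import Counter

def mix(users):
    """Return each group's share of an audience, e.g. {'a': 0.6, 'b': 0.4}."""
    counts = Counter(users)
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

def parity_gaps(eligible, delivered, tolerance=0.05):
    """Groups whose delivered share deviates from their eligible share by
    more than `tolerance` (absolute difference in share)."""
    eligible_mix, delivered_mix = mix(eligible), mix(delivered)
    return {
        g: delivered_mix.get(g, 0.0) - share
        for g, share in eligible_mix.items()
        if abs(delivered_mix.get(g, 0.0) - share) > tolerance
    }

# Toy example: group "b" is 40% of the eligible audience but only 20% of the
# delivered audience, so both groups' shares are flagged as out of balance.
eligible = ["a"] * 60 + ["b"] * 40
delivered = ["a"] * 80 + ["b"] * 20
print(parity_gaps(eligible, delivered))   # roughly {'a': 0.2, 'b': -0.2}
```

In Meta's framing, a measurement along these lines would presumably feed back into ad delivery so that under-served groups are shown the ad more often until the delivered mix aligns with the eligible one.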

A Facebook spokesperson told Motherboard that “this type of work is unprecedented in the advertising industry and represents a significant technological advancement for how machine learning is used to deliver personalized ads.”

The algorithm under development will also be applied to ads relating to employment and credit, as civil rights groups and researchers push for a reckoning with the biases built into major internet platforms such as Facebook. A 2020 Carnegie Mellon study supported HUD’s findings, concluding that Facebook’s biased ad algorithms exacerbate existing socioeconomic inequalities. 

Meta has agreed to a third-party review of its new system to determine whether it meets the standards agreed to in the settlement. If the DOJ concludes that Meta’s changes do not adequately resolve the system’s discriminatory practices, the settlement will be terminated and the DOJ will litigate against Meta in federal court.