Researchers Find Facebook’s Ad Targeting Algorithm Is Inherently Biased

Ads for cashier positions in supermarkets reached an 85% female audience, and ads for positions in taxi companies reached a 75% Black audience.
Facebook ads. Image via Shutterstock

Facebook is in trouble with the US Department of Housing and Urban Development (HUD) for what the department says are discriminatory ad targeting practices. For years, Facebook allowed advertisers to target (or avoid targeting) protected groups, such as racial minorities and specific gender identities. But in a new paper, a team of researchers says that Facebook’s ad delivery algorithm is inherently biased even when advertisers try to reach a large, inclusive audience.

The paper, published by researchers at Northeastern University, the University of Southern California, and Upturn, a nonprofit organization focused on digital rights, claims that Facebook’s ad delivery algorithm is “skewed” to send housing and employment ads to specific demographic groups. Facebook’s algorithm decides, for example, that certain jobs are “relevant” largely to white men, while delivering ads for other jobs mostly to Black women, even when the advertiser is trying to target large, inclusive groups.

“We demonstrate that skewed [ad] delivery occurs on Facebook, due to market and financial optimization effects as well as the platform’s own predictions about the ‘relevance’ of ads to different groups of users,” the paper, posted to the arXiv preprint server, found. “We find that both the advertiser’s budget and the content of the ad each significantly contribute to the skew of Facebook’s ad delivery. Critically, we observe significant skew in delivery along gender and racial lines for ‘real’ ads for employment and housing opportunities despite neutral targeting parameters.”

“Our results demonstrate previously unknown mechanisms that can lead to potentially discriminatory ad delivery, even when advertisers set their targeting parameters to be highly inclusive,” the paper continued.

Last week, HUD charged Facebook with violating the Fair Housing Act for enabling discrimination in how it served housing ads on the platform. HUD found that Facebook was targeting users based on “race, color, religion, sex, familial status, national origin and disability,” a practice which, under the Fair Housing Act (FHA), is illegal.

Facebook disputes HUD’s claims, saying that “HUD had no evidence, and no finding, that our AI systems discriminate against people.”

In their study, which has not yet been peer reviewed, the researchers aimed to understand whether ad delivery could end up demographically skewed because of how an ad was packaged and paid for. If an advertiser didn’t explicitly choose to target a specific group, would Facebook still make targeting choices for them?

To test this, they ran dozens of ad campaigns—including hundreds of ads that reached millions of people—and spent over $8,500 in the course of the study.

“When it comes to discriminatory online advertising, the targeting options available to advertisers are only the tip of the iceberg,” a spokesperson for Upturn told Motherboard. “Our research demonstrates that ad platforms like Facebook can deliver ads to audiences skewed by race and gender even when advertisers target large, inclusive audiences.”

The researchers believe that factors like ad spend, the images used in an ad, and the headline associated with it influence who ultimately sees it.

By experimenting with “base” ads that had blank imagery or headlines, then adding elements and links back in, they could determine which parts of an ad affected who saw it. For example, when they ran ads that only linked to sites like bodybuilding.com (traditionally male-targeted) or elle.com (traditionally female-targeted), delivery split roughly evenly by gender. But when they added images (a weightlifting man and a set of makeup brushes, respectively), delivery skewed to over 75% male for bodybuilding and over 90% female for cosmetics.
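To make “skew” concrete, here is a minimal sketch in Python of how one might quantify the gender skew of two ad variants from per-demographic impression counts. All numbers and names below are hypothetical illustrations, not the study’s actual data or methodology:

```python
import math

def gender_skew(male_impressions: int, female_impressions: int) -> float:
    """Fraction of an ad's impressions delivered to male users."""
    return male_impressions / (male_impressions + female_impressions)

def two_proportion_z(m1: int, f1: int, m2: int, f2: int) -> float:
    """Two-proportion z-statistic: does variant 1's male share of
    delivery differ significantly from variant 2's?"""
    n1, n2 = m1 + f1, m2 + f2
    p1, p2 = m1 / n1, m2 / n2
    pooled = (m1 + m2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical impression counts: a text-only "base" ad vs. the
# same ad with a weightlifting image added.
base_ad = (5_100, 4_900)    # (male, female) impressions
image_ad = (7_800, 2_200)

print(f"base ad:  {gender_skew(*base_ad):.0%} male")
print(f"image ad: {gender_skew(*image_ad):.0%} male")
print(f"z-statistic: {two_proportion_z(*base_ad, *image_ad):.1f}")
```

With made-up counts like these, the image variant delivers to 78% men versus 51% for the base ad, and the large z-statistic indicates the gap is far too big to be sampling noise, which is the kind of comparison the researchers ran at scale.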

Gender profiling is concerning on its own, but some of the experiments pointed to additional real-world consequences. When the researchers created five ads for jobs in the lumber industry and tried to send them to a broad audience, the ads were delivered “to over 90 percent men and to over 70 percent white users in aggregate,” while five ads for janitors were delivered “to over 65 percent women and over 75 percent black users in aggregate.” According to the study, Facebook also delivered their broadly targeted ads for houses for sale to audiences of 75% white users, while rental ads went to a more demographically balanced group.

“Our ads for cashier positions in supermarkets reach an 85% female audience, and our ads for positions in taxi companies reach a 75% Black audience,” the researchers wrote.

These findings help explain some of the allegations made in HUD’s charge against Facebook. HUD’s court filing stated that Facebook discriminated not only by allowing advertisers to target ads according to race, gender, and other protected characteristics, but also because its ad delivery algorithm itself is biased:

Even if an advertiser tries to target an audience that broadly spans protected class groups, Respondent’s ad delivery system will not show the ad to a diverse audience if the system considers users with particular characteristics most likely to engage with the ad. If the advertiser tries to avoid this problem by specifically targeting an unrepresented group, the ad delivery system will still not deliver the ad to those users, and it may not deliver the ad at all. This is so because Respondent structured its ad delivery system such that it generally will not deliver an ad to users whom the system determines are unlikely to engage with the ad, even if the advertiser explicitly wants to reach those users regardless.

Joe Osborne, a Facebook spokesperson, issued a statement to Motherboard following the publication of the paper:

“We stand against discrimination in any form. We’ve announced important changes to our ad targeting tools and know that this is only a first step. We’ve been looking at our ad delivery system and have engaged industry leaders, academics, and civil rights experts on this very topic – and we're exploring more changes.”

The study raises the issue, yet again, of inherent bias programmed into opaque algorithms that few people understand and that few (if any) impartial researchers have any real way of vetting.

“Our findings underscore the need for policymakers and platforms to carefully consider the role of the optimizations run by the platforms themselves—and not just the targeting choices of advertisers—in seeking to prevent discrimination in digital advertising,” the researchers wrote.

Updated 9 p.m. EST with a statement from Facebook.