How Big Data Could Discriminate

The Federal Trade Commission discussed how data might be used to target minority groups for high-risk loans and credit cards.

Where you live and shop, what you buy, how much you spend, and other personal data is a boon to the financial industry. Big Data allows that industry to better target its advertisements to the people most likely to follow up on them, but when targeting takes into account things like race and income—as it so often does—the Federal Trade Commission believes that data could also be a tool for discrimination.

During a conference held yesterday in Washington, DC, called "Big Data: A Tool for Inclusion or Exclusion?" FTC Commissioner Julie Brill declared that regulatory agencies should shift their critical lens to what she described as the "unregulated world of data brokers." According to Brill, there is a "clear potential" for consumer profiles built from personal data "to harm low-income and other vulnerable consumers."

At issue during the conference was the question of specific communities being targeted with advertising for loans and credit cards, and whether the data-driven consumer profiles used in this targeting were proxies for race and income status. Preliminary research carried out by Latanya Sweeney, Chief Technologist for the FTC and the founder and director of Harvard's Data Privacy Lab, suggests that this is exactly the case.

In a presentation, Sweeney outlined the results of a summer study conducted by the FTC. According to her research, which involved tracking credit card advertisements and the demographics of the sites they appeared on, ads for better cards were concentrated on sites with higher-income visitors, like that of Harvard Magazine.

"Domains with exclusive audiences do exist, and ads are not exempt from being delivered to those sites," she said. "So, the lack of ads or too much of another ad leading to a disparate impact and demographics could therefore infer what kind of advertising experience you might have."

An interesting case Sweeney discovered was the website of Omega Psi Phi, a largely black fraternity that counts civil rights heroes like Roy Wilkins and Benjamin Hooks among its past members. According to Sweeney, the fraternity's site was littered with ads for credit cards and services to search for arrest records.

"There are ads for a criminal lawyer and there were ads for credit cards," Sweeney explained. "Now, it turns out that the financial industry is the number one marketer online. They're the number one industry advertising online."

The targeting of low-income and non-white communities for loans that are disadvantageous to borrowers, if not to their creditors, is a dark bit of history that has followed us from the first half of the 20th century to the present. Racial profiling is not a new issue by any stretch of the imagination, especially when it comes to targeted advertising, but it has gained a new, more robust groove in the age of online data.


Data, however big, does not exist in a vacuum, especially when it is embedded in an industry as entrenched as finance. And, historically speaking, finance doesn't have a great track record when it comes to racial equality.

George Lipsitz, in his book analyzing the historical buttressing of white supremacy through American policy, The Possessive Investment in Whiteness, described how Federal Housing Administration loan policies explicitly favored whites until 1948 (and the Supreme Court allowed such practices to continue even after that), effectively locking black families out of lucrative housing markets for generations. The subprime mortgage crisis of 2008 has also been linked to racial profiling, and a 2010 Princeton study confirmed that race was a factor when unscrupulous financial institutions issued high-risk, high-reward loans to people with low incomes.

As many of the panelists during yesterday's conference noted, targeted advertising is not a crime, though using preselection criteria like race and gender during the decision-making process itself is.

"In decision making for credit, [there are] long-standing prohibitions going back to 1974 on using, you know, marital status or race in the decision making," C. Lee Peeler, vice president of the Council of Better Business Bureaus, said. "In advertising, the traditions are the opposite. In advertising, it's necessarily about targeting your products to markets."

When it comes to advertising, however, trouble starts—legally speaking, under the Equal Credit Opportunity Act—once a "disparate impact" has been identified. That is, trouble starts if a specific group is harmed by not having a product advertised to it, or, conversely, by being singled out, as may be the case when non-white and low-income communities are targeted with high-risk loans and credit cards.
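To make the "disparate impact" idea concrete, here is a minimal sketch of how such an imbalance in ad delivery might be quantified. The sites, delivery rates, and the 80 percent threshold (borrowed from the "four-fifths rule" used in US employment-discrimination analysis, not from the ECOA itself) are illustrative assumptions, not part of the FTC's analysis described in this article.

```python
# Hypothetical sketch of a disparate-impact check on ad delivery.
# All numbers and site labels below are invented for illustration.

def impact_ratio(rate_a: float, rate_b: float) -> float:
    """Ratio of the lower ad-delivery rate to the higher one (0 to 1)."""
    low, high = sorted([rate_a, rate_b])
    return low / high

# Share of visitors on each hypothetical site shown ads for
# prime, low-interest credit cards:
rate_high_income_site = 0.30  # e.g. an alumni-magazine audience
rate_minority_site = 0.09     # e.g. a fraternity-site audience

ratio = impact_ratio(rate_high_income_site, rate_minority_site)
print(f"impact ratio: {ratio:.2f}")

# The four-fifths rule (an assumption here, drawn from employment
# law) flags ratios below 0.8 as candidates for closer scrutiny.
print("flag for review" if ratio < 0.8 else "within threshold")
```

In this invented scenario the ratio is 0.30, well under the 0.8 heuristic, which is the kind of imbalance that would invite the regulatory scrutiny the panelists describe below.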

"There may be data inside big data sets that say with some level of confidence what are the demographic, you know, characteristics," Peter Swire, a professor of law and ethics at the Georgia Institute of Technology, said. "If you have that and you have a disparate impact in the data in your database, the history under fair lending has been that you might come under scrutiny for the regulated industries."

The tepid nature of Swire's comments largely reflected the tone of many of the legal experts during the conference. As Leonard Chanin, former head of the Consumer Financial Protection Bureau's rule-making team, noted, American legislation has a long history when it comes to credit, and Big Data is a very new industry. Many questions remain as to which laws apply, to whom, and under what circumstances.

"Along with figuring out what we think we ought to do, there's a legal research task about what the law has done," Swire said. "Talking among others to see what is really done there is something that I think would inform our debate about what the legal rules are."

Whatever the legal specifics of Big Data as it pertains to credit cards and loans, it's clear that targeted advertising online has the very real potential to act, in effect, as discrimination. Although data carries the suspicion-waiving mantle of "objectivity," it also carries the all-too-easily effaced weight of history.

Teasing out what this all means in legal terms is a huge challenge that regulatory bodies like the FTC are only beginning to address, and it promises to present myriad ethical, moral, and legal complexities.