Debt Collectors Want To Use AI Chatbots To Hustle People For Money

The collections industry is pushing GPT-4 as a dystopian new way to make borrowers pay up, replicating the debt system’s long history of racial bias.
Getty Images

An industry historically known for harassing and intimidating people down on their luck is replacing human debt collectors with AI-powered ones, in a move that one company says will "change debt collections forever."

AI tools will take the industry into “a new era of debt collections,” according to New York and Bangalore-based Skit. A digital voice agent—the new incarnation of a robocall, using AI chatbots and text-to-speech capabilities for dynamic, responsive conversations—could make millions of outbound calls in just a few days, the company claimed, contacting and requesting payment from a collection agency’s entire portfolio of debtors at a far lower cost than human staff.


Human agents could use AI at every stage of the collection process, the company’s blog claims, delivering “instant scalability” through “end-to-end automation,” which of course would boost productivity and lower costs. For someone on the receiving end of the call, though, the chance to speak with an actual human during the process becomes ever more distant. Skit did not respond to Motherboard’s questions about using AI for debt collection.

Increasingly, software services marketed to debt collectors are starting to incorporate machine learning and even generative AI, with the promise of optimizing the recovery of funds from debtors. And at a time when AI hype is booming and debts are at an all-time high, these uses are only likely to grow.

The prospect of automated AI systems making phone calls to distressed people adds another dystopian element to an industry that has long targeted poor and marginalized people. Debt collection and enforcement are far more likely to occur in Black communities than white ones, and research has shown that predatory debt and interest rates exacerbate poverty by keeping people trapped in a never-ending cycle.

In recent years, borrowers in the US have been piling on debt. In the fourth quarter of 2022, household debt rose to a record $16.9 trillion according to the New York Federal Reserve, accompanied by an increase in delinquency rates on larger debt obligations like mortgages and auto loans. Outstanding credit card balances are at record levels, too. The pandemic generated a huge boom in online spending, and besides traditional credit cards, younger spenders were also hooked by fintech startups pushing new finance products, like the extremely popular “buy now, pay later” model of Klarna, Sezzle, Quadpay and the like.


So debt is mounting, and with interest rates up, more and more people are missing payments. That means more outstanding debts being passed on to collection, giving the industry a chance to sprinkle some AI onto the age-old process of prodding, coaxing, and pressuring people to pay up.

For an insight into how this works, we need look no further than the sales copy of companies that make debt collection software. Here, products are described in a mix of generic corp-speak and dystopian portent: SmartAction, another conversational AI product like Skit, has a debt collection offering that claims to help with “alleviating the negative feelings customers might experience with a human during an uncomfortable process”—because they’ll surely be more comfortable trying to negotiate payments with a robot instead. 

Meanwhile, Latitude “resolves gaps in functionality while reducing the pressure on your agents and increasing recovery rates”; Katabat provides “full omni-channel orchestration, true machine learning” and a “powerful collection strategy engine”; and TrueAccord runs an “industry-leading recovery and collections platform powered by machine learning and a consumer-friendly digital experience.” TrueAccord also boasts of offering more empathetic debt collections experiences, which naturally are achieved through the hallmarks of compassion: “experimentation in A/B testing, consumer research, and machine learning.”


It’s the same core promise as so many AI-powered products: do more, faster, with fewer humans in the loop; feed the data you collect back into the system; tweak, refine, and repeat for as long as you need.

In an email to Motherboard, Timnit Gebru, founder of the Distributed AI Research Institute (DAIR), described the use of AI in debt collection as "punishing those who are already struggling."

"In a time when income inequality is off the charts, when we should be reducing things like student debt, are we really trying to build tools to put even more pressures on those who are struggling? This would be true even if the software was working as intended," Gebru said. 

"In addition to this, we know that there are so many biases that these LLM based systems have, encoding hegemonic and stereotypical views,” Gebru added, referring to the findings of the paper on large AI models that she co-authored with several other researchers.  “The fact that we don't even know what they're doing and they're not required to tell us is also incredibly concerning."

Some of the companies that stand to benefit most from AI integration are those that purely exist to collect debt. These companies, known as debt buyers, purchase “distressed” debt from other creditors at steep discounts—usually pennies on the dollar—then try as hard as they can to get debtors to repay in full. They don’t issue loans, or provide any kind of service that clients might owe them for; it’s a business model built on profiting from people who fell behind on payments to someone else. They also rely heavily on the civil court system, which some experts believe will soon be flooded by AI-generated debt lawsuits.


Many vendors of debt collection software advertise their products to these third-party buyers, but one company is among the few that explicitly advertises its use of large language models. As of now, the digital collections platform claims to integrate GPT-3, but clearly has its sights set on the newer update: a recent blog post gushes over the exciting world of GPT-4 debt collection, which will apparently be more personalized, efficient, and emotionally intelligent when asking people to cough up.

“Striking the right balance between assertiveness and empathy is a significant challenge in debt collection,” the company writes in the blog post, which claims GPT-4 has the ability to be “firm and compassionate” with customers.

A graph measuring "Volume of Collections," showing a lower bar labeled "Traditional Methods" and a higher bar labeled with a company's logo.

Image via

When algorithmic, dynamically optimized systems are applied to sensitive areas like credit and finance, there’s a real possibility that bias is being unknowingly introduced. A McKinsey report into digital collections strategies plainly suggests that AI can be used to identify and segment customers by risk profile—i.e. credit score plus whatever other data points the lender can factor in—and fine-tune contact techniques accordingly. 
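To make the McKinsey-style approach concrete, here is a minimal, purely illustrative sketch of what risk-based segmentation looks like in practice: score each account on a few data points, assign it a bucket, and map each bucket to a contact strategy. Every name, threshold, and strategy below is hypothetical, not drawn from any vendor’s actual system.

```python
# Illustrative sketch of risk-based debtor segmentation. All field
# names, thresholds, and strategies are invented for this example.

from dataclasses import dataclass

@dataclass
class Account:
    credit_score: int   # traditional bureau score
    days_past_due: int  # how late the payment is
    balance: float      # outstanding amount owed

def risk_segment(acct: Account) -> str:
    """Assign a coarse risk bucket from a handful of data points."""
    if acct.credit_score >= 700 and acct.days_past_due < 30:
        return "low"
    if acct.credit_score >= 600 and acct.days_past_due < 90:
        return "medium"
    return "high"

# A hypothetical mapping from segment to outreach technique -- the
# "fine-tuning" of contact methods the report describes.
CONTACT_STRATEGY = {
    "low": "gentle email reminder",
    "medium": "SMS plus scheduled call",
    "high": "frequent calls, settlement offer",
}

acct = Account(credit_score=640, days_past_due=45, balance=1200.0)
segment = risk_segment(acct)
print(segment, "->", CONTACT_STRATEGY[segment])  # medium -> SMS plus scheduled call
```

The concern is less the arithmetic itself than what feeds it: whichever “other data points the lender can factor in” end up deciding who gets a polite email and who gets relentless calls.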

Odette Williamson, a senior attorney at the National Consumer Law Center, says AI models can pick up systemic bias from training data that reflects the long history of lending discrimination against low-income groups and communities of color.


“Is the [training] data inaccurate and misleading, or is it complete?” Williamson said. “Given our racist history in the US…will this data reflect discriminatory trends, and if so how does that impact decisions that a system would make down the line about who to target and how aggressive to be in debt collection?”
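A toy example makes Williamson’s point tangible: a model never needs to see race directly if it trains on data containing a correlated proxy, such as a ZIP code where past enforcement was concentrated. The data and ZIP codes below are invented; the “model” simply replays historical enforcement rates, which is exactly how the bias survives.

```python
# Toy demonstration of proxy bias: historical enforcement patterns
# flow through a correlated feature (ZIP code) into new decisions.
# All records here are fabricated for illustration.

from collections import defaultdict

# Hypothetical historical records: (zip_code, was_pursued_in_court).
# Suppose past enforcement concentrated heavily in ZIP 10001.
history = [
    ("10001", True), ("10001", True), ("10001", True), ("10001", False),
    ("20002", False), ("20002", False), ("20002", True), ("20002", False),
]

# "Training": tally the enforcement rate per ZIP from the biased record.
counts = defaultdict(lambda: [0, 0])  # zip -> [pursued, total]
for zip_code, pursued in history:
    counts[zip_code][0] += int(pursued)
    counts[zip_code][1] += 1

def aggressiveness(zip_code: str) -> float:
    """Replay the historical enforcement rate as a 'prediction'."""
    pursued, total = counts[zip_code]
    return pursued / total

print(aggressiveness("10001"))  # 0.75 -- targeted more heavily
print(aggressiveness("20002"))  # 0.25 -- targeted less
```

Real systems are far more complex than this, but the mechanism is the same: if the training data reflects discriminatory enforcement, the model’s recommendations about whom to target, and how hard, will too.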

A real-world example of the harms of automated debt collection played out in Australia starting in 2016, when the government used an automated system—later known as “robodebt”—to claw back supposed welfare overpayments, with disastrous results: many of the debts were wrongly calculated, and the scheme was eventually found to be unlawful.

An awareness of biased outcomes—which have been documented in areas from prison sentencing to predicting school dropout rates to monitoring welfare fraud—necessitates careful auditing, Williamson said.

“At the end of the day we’ve got to make sure that these models are statistically sound, that they’re tested at every stage of development and also when they’re deployed… and if models are displaying discriminatory or biased outcomes and you can’t fix it, then they just shouldn’t be used,” she said.

Regulatory agencies are aware, in a general sense, that AI poses many possible risks to consumer finance. Just a few weeks ago, in late April, the Consumer Financial Protection Bureau (CFPB) issued a joint statement with the Department of Justice, Federal Trade Commission, and Equal Employment Opportunity Commission stating an intent to target discriminatory practices that emerge through the use of automated systems. 

Reached by email, a CFPB spokesperson did not confirm whether AI-powered debt collection systems were an area of specific concern, but told Motherboard: “Regardless of the type of tools used, the CFPB will expect debt collectors to comply with all Fair Debt Collection Practices Act requirements and the Consumer Financial Protection Act’s prohibitions against unfair, deceptive, and abusive practices.”