This article appears in VICE Magazine's Algorithms issue, which investigates the rules that govern our society, and what happens when they're broken.

In a typical fictional dystopia, one might find a few common features: a bureaucratic government, a malevolent computer program, and an isolated and fearful public.

A version of this scenario took place in Australia between 2016, when the conservative-leaning government automated its system for raising debts against people who had received government assistance and allegedly been overpaid, and this year, when the government pledged to pay back $721 million that it stole from nearly 400,000 of the country's most vulnerable people. In total, more than 700,000 Australians received letters notifying them that the system had identified a debt that they owed, unless they could prove otherwise.
The scope of the crisis, often referred to as "robodebt," is immense. Thousands were hounded by the government and debt collectors for alleged overpayments, often from years prior and amounting to thousands of dollars, that simply did not exist. Many victims paid up, some appealed their debts, and trauma was visited upon a population. Tragically, some families attributed their loved ones' deaths by suicide to receiving robodebts, something the head of Australia's Department of Social Services denies happened to this day.

The program continued for years, despite scathing government hearings and reports. During this time, the volunteer-led campaign #NotMyDebt elevated victims' stories, activists organized a sit-in at a lawmaker's office, and a prominent legal scholar who was pushed out of their longstanding post in Australia's appeals tribunal for welfare payments lambasted the program. Legal aid groups launched successful challenges, and an ongoing class action lawsuit seeking damages was filed in 2019.

This year, the government admitted that nearly 400,000 debts were unlawful and began paying back the jaw-dropping sum that was bilked from citizens, something that promises to be a complex process. Now, Australia has to reckon with the aftermath of a disastrous implementation of automation at the intersection of austerity and unaccountable government. Here, in the words of those who were there—whether as activists, insiders, or having received a debt—is how this all happened, and what the world can learn about preventing it from happening again.
'THEY COULDN'T PAY FOR CHRISTMAS PRESENTS'
JACKSON: I just expected we would get 20 stories in that first month. We got 300 and about $3.5 million worth of debt, just a huge volume. The stories were awful. And I guess the only thing that we could do was amplify their voices and let people feel like they were being heard.

TERRY CARNEY, professor emeritus at the University of Sydney, and former member of Australia's administrative appeals tribunal for welfare payments: I got my first case to deal with on the tribunal in, I think it was about, early February. It didn't cut the mustard, or even approach having mustard on the table. I mean, they were totally unable to provide any shred of a legal foundation. And so in late March, my decision was made finding the debt not to be lawful, and therefore I set it aside. My estimate is that at least 200 or so other decisions, in addition to my five, were made by my colleagues on a similar basis.

JACKSON: [Within] a year, a few thousand of these letters would have been sent out.

By all accounts, the software underpinning the OCI scheme was extremely crude. It worked by matching data between Centrelink, the agency that processes government assistance, and the tax office, to flag discrepancies. This data matching was previously done with human oversight, but robodebt automated it to generate debt notices. It also averaged people's reported fortnightly incomes over an entire year to calculate their benefits, resulting in false debts being raised against citizens with irregular incomes. The system was iterated upon over the years; for example, in 2018, "predictive analytics" were added to resolve discrepancies likely to result in zero or low debt.

CARNEY: Debts have always been a significant part of the caseload for the tribunal. Up until the change that occurred, there was data matching. But the data matching was used in order to provide a basis for making inquiries, rather than as the be-all and end-all in seeking to establish the debt.
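The income-averaging flaw described above can be sketched in a few lines. This is a toy illustration only: the dollar figures and the means-test formula below are hypothetical, not Centrelink's actual rates, but the structure of the error is the one the article describes.

```python
# Toy sketch of robodebt-style income averaging. All figures and the
# means test are hypothetical; the shape of the error is the point.

FORTNIGHTS = 26  # fortnights in a year

def fortnightly_benefit(income):
    """Hypothetical means test: $550 full rate, reduced by 50 cents
    per dollar of fortnightly income above a $150 free area."""
    return max(0.0, 550.0 - 0.5 * max(0.0, income - 150.0))

# A person claims benefits for half the year while unemployed,
# correctly reporting $0 income, then works the other half.
benefit_fortnights = 13
annual_income = 13 * 1200.0  # earned entirely after leaving benefits

# What was correctly paid: the full rate while earning nothing.
paid = benefit_fortnights * fortnightly_benefit(0.0)

# Robodebt-style reassessment: the tax office's annual figure is
# smeared evenly over every fortnight, as if the person earned $600
# in each one, including the fortnights they were on benefits.
averaged = annual_income / FORTNIGHTS
reassessed = benefit_fortnights * fortnightly_benefit(averaged)

phantom_debt = paid - reassessed  # a "debt" that never existed
```

Under the averaged figure, the system concludes this person was overpaid $2,925, even though every fortnight was reported accurately; anyone with lumpy or seasonal income triggers the same arithmetic.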
'CLUNKY, CRAP COMPUTER SYSTEM'
WOLF: The calculator that they used, the algorithm that they used, for figuring this out is COBOL [a programming language from 1959]. It's COBOL and Excel mashed together.

JACKSON: The algorithm was crude data matching. They talk about it like it was a sophisticated thing. It was a really clunky, crap computer system that's actually multi-layered, matching with [tax office] data, but not using unique identifiers, because they didn't have them; they couldn't match that between the systems. So it was matching things like people's names and businesses and just causing all of these errors because it was done so crudely. That was part of the human oversight process that [used to] happen, because those people knew what to look for.

CARNEY: Prior to robodebt, data matching led to debts being raised in 7 percent of all of the situations where it appeared that there might be some discrepancy… The reason that the other 93 percent were not pursued was stated on the public record: because it wasn't cost effective to do so… What changed with robodebt is that the data matching was essentially automated to produce the letter, which didn't say technically that it was a debt, but it said this would be a debt, unless you can prove that this calculation is not correct.

DEAN FLETCHER, Wollongong/Illawarra branch coordinator for the Australian Unemployed Workers' Union: It became an issue for me in 2017, when I got a debt notice and my partner [did] as well. When I got the letter from the debt collectors, [I felt] overwhelming anxiety, because it didn't feel like I'd been given an opportunity not only to look into it myself, but to challenge it before they sent people after me.
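The matching problem Jackson describes can also be sketched. The records and names below are invented, and this is not the government's actual code; it simply shows why joining two datasets on non-unique keys like names and employers, rather than a unique identifier, produces false hits.

```python
# Hypothetical sketch of data matching keyed on (name, employer)
# instead of a unique identifier. All records here are invented.

welfare_records = [
    {"name": "J SMITH", "employer": "ACME PTY LTD", "declared_income": 0},
]
tax_records = [
    # A different J. Smith who works at the same large employer.
    {"name": "J SMITH", "employer": "ACME PTY LTD", "annual_income": 52000},
]

def flag_discrepancies(welfare, tax):
    """Join the two datasets on (name, employer) and flag anyone whose
    tax-office income exceeds what they declared to the welfare agency."""
    flags = []
    for w in welfare:
        for t in tax:
            same_key = (w["name"], w["employer"]) == (t["name"], t["employer"])
            if same_key and t["annual_income"] > w["declared_income"]:
                flags.append((w["name"], t["annual_income"]))
    return flags

# One "discrepancy" is flagged, but the two records may describe two
# different people; with a unique identifier the join would be empty.
flags = flag_discrepancies(welfare_records, tax_records)
```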
'A TRAVESTY'
FLETCHER: It was very confusing and, honestly, very isolating… If it wasn't for people like Asher and the #NotMyDebt campaign, I would have had no idea that other people were experiencing this as well.

CARNEY: It was the sheer scale of the damage and harm that was being done to such a large number of people which dwarfed anything that had happened in social security in the history of the last century.

WOLF: How do you end up with billions of dollars being stolen from people, so that people commit suicide? So that people beat their children, or beat their wives, or go hungry, or become malnourished, or lose their house, or sell the family car? Terrible things happened.

JACKSON: Thousands of people have been traumatized. They haven't passed away, but they have been traumatized because they have been stressed, and they have lost homes, and they have not been able to put food on their table.

CARNEY: We pay very low rates of Social Security in Australia at best, and we have high rates within that population of people with mental illness and other vulnerabilities. $721 million, three-quarters of a billion dollars of unlawful debts, or taxes, if you like, were being imposed on these most vulnerable people. Three-quarters of a billion [dollars], nearly 400,000 people. In some cases, a person had more than one such debt. That's a travesty. It's among the most egregious actions that our government can be involved in.
'BURN SHIT DOWN'
WOLF: [Debts are] still being raised. Not using the same data matching form, but still being raised. And then it's still a huge number of people who are not in the class action… They haven't applied for any of this.

FLETCHER: I think it's really [emblematic] of much larger issues, number one: how Australian society treats welfare recipients.

JACKSON: [Some] think that this same [automation] technology is like a panacea, and something that will bring fast [solutions] to complex [problems]. It's really quite dangerous.

WOLF: We dehumanize people through algorithms, and that will always lead to disastrous outcomes as long as we don't have accountability measures that are both specific and focused on a human element of accountability. There was no accountability within the system. Everybody was atomized. Nobody knew who to contact. Nobody knew that it was happening to other people. And the only way to fight these systems is to band together. So, you have to find other people that you can talk to about it. If algorithms are hurting your people, killing your people, burn shit down.

Follow Jordan Pearson on Twitter.