How Child Protection Agencies Are Trying to Predict Which Parents Will Abuse Kids

America is about to find out what happens when you mix big data with so-called risk factors like race.

For some parents, the words "Child Protective Services" send chills down the spine. In an effort to stamp out fraud and abuse, welfare outfits across the country have homed in on poor communities of color for decades now, critics claim, with almost any government interaction potentially leading to kids being removed from their homes.

Now big data is set to make the dynamic even more intense—and racially charged.

In 1999, the foster care population in the United States peaked at 567,000. But due to both diminishing budgets and programs aimed at keeping children in their homes, that number dropped to 415,000 by 2014. The representation of black children in the foster system remains disproportionately high, however: Black children account for 24 percent of kids in foster care, while comprising just 14 percent of the general population of children in the US. And a burgeoning method for determining exactly which families get visits from child welfare caseworkers has advocates for low-income families worried the disparity will only get worse.

The new approach is called "predictive analytics," and it's taking the child welfare system by storm. Across the country, from suburban counties in Florida to major cities like Los Angeles, child welfare agencies are launching initiatives that take data points like race, parental welfare status, criminal history, and a variety of other publicly available characteristics, and feed them into an algorithm that assigns each child a "risk" score. That score is then considered when determining whether a caseworker should visit a family.
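To make the mechanics concrete, here is a minimal sketch of how such a screening score could be computed: a weighted tally of case features passed through a logistic function, with a cutoff deciding whether a caseworker goes out. The feature names, weights, and threshold below are invented for illustration; they are not taken from any agency's actual model.

```python
# Illustrative sketch only: a toy "risk" scoring routine of the general kind
# described above. Feature names, weights, and the screening threshold are
# hypothetical and not drawn from any real child welfare agency's system.
import math
from dataclasses import dataclass

@dataclass
class CaseFeatures:
    prior_referrals: int          # earlier reports of abuse or neglect
    parent_on_welfare: bool       # receipt of public benefits
    parent_criminal_record: bool  # any criminal history on file
    housing_instability: bool     # e.g., recent evictions or shelter stays

# Hypothetical weights; a real system would estimate these from historical data.
WEIGHTS = {
    "prior_referrals": 0.9,
    "parent_on_welfare": 0.4,
    "parent_criminal_record": 0.7,
    "housing_instability": 0.5,
}
BIAS = -2.0  # baseline log-odds before any features are counted

def risk_score(case: CaseFeatures) -> float:
    """Map a weighted sum of case features to a score between 0 and 1."""
    z = BIAS
    z += WEIGHTS["prior_referrals"] * case.prior_referrals
    z += WEIGHTS["parent_on_welfare"] * case.parent_on_welfare
    z += WEIGHTS["parent_criminal_record"] * case.parent_criminal_record
    z += WEIGHTS["housing_instability"] * case.housing_instability
    return 1.0 / (1.0 + math.exp(-z))  # logistic squash

def screen_in(case: CaseFeatures, threshold: float = 0.5) -> bool:
    """Send a caseworker only if the score clears the screening threshold."""
    return risk_score(case) >= threshold

if __name__ == "__main__":
    report = CaseFeatures(prior_referrals=2, parent_on_welfare=True,
                          parent_criminal_record=False, housing_instability=True)
    print(f"risk score: {risk_score(report):.2f}; screen in: {screen_in(report)}")
```

Race is deliberately left out of this sketch, even though the article reports that some agencies include it as an input; which features are allowed into the tally, and who audits the weights, is exactly what the dispute below is about.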

The government officials spearheading the predictive analytics movement argue that it will help bring necessary services to vulnerable families and save lives. Civil rights advocates counter that it will just reinforce preexisting biases against low-income families who depend on welfare programs to pay for housing and food, and are already subjected to a bevy of bureaucratic conditions to qualify for said benefits.

Marc Cherna, for one, believes that, used correctly, predictive analytics can change the game when it comes to protecting young kids in America. He's the director of the Department of Human Services in Pennsylvania's Allegheny County, and under his watch, the county has reduced the number of children in foster care from 3,000 in 1996 to just over 1,000 this year.

Cherna's agency was one of the first in the country to take advantage of big data to help improve outcomes for families in the county, focusing on indicators that might help social workers keep families together by intervening early. County officials opened a "data warehouse" in 1999, which helped collect benefit and criminal records, as well as housing history. This summer, it will be one of the first counties in the United States to use predictive analytics with the launch of a program that helps screen which reports of abuse should be acted on. Every time a report of abuse comes into the county, that case will be given a risk score; the higher the risk score, the more likely a caseworker will be sent out.

"Our practice has been that half of the calls we get, end up getting screened out," Cherna explained in an interview, adding that "screened-out" calls result in no immediate action being taken by the county. "When we looked at cases that led to fatalities or near fatalities and ran it through the [predictive analytics] system, almost all of them were very high risk. But some of those cases were screened out, four out of five never came to our attention, which shows us that there are cases that we're missing."

He noted that running previous calls through the system showed that the county has been investigating parents who have a very low "risk" of abusing their kids, according to the algorithm. Basically, the data suggest officials could be doing things a lot more efficiently.

"There's nothing in the predictive analytics model that our workforce doesn't already have access to in the descriptive way," said Erin Dalton, a human services official who's spearheading the predictive analytics work for Allegheny County. "What this does is help to reduce variation in decision-making and make it so that similarly risked kids are treated similarly."

Allegheny County is taking several steps to monitor the effectiveness of its program, including building out an independent evaluation tool to make sure the predictive analytics project is actually working. And officials have promised safeguards and training to protect against prejudice. Dalton believes this is a much better system than those adopted by other counties, where private companies are poised to run predictive analytics without much in the way of independent evaluation. In Los Angeles, the Department of Children and Family Services—the largest such agency in America—is using software designed by SAS, the world's leading analytics software developer. But evaluations of that program's effectiveness have, so far at least, been done by the same company.

But even with the precautions Allegheny County is taking, critics argue predictive analytics will eventually lead to Minority Report–style monitoring, where families are investigated before any abuse occurs. "What you've got is computerized racial profiling," said Richard Wexler, executive director of the National Coalition for Child Protection Reform. "While that's not the intent of a lot of the people pushing this, that is how it winds up being used."

Wexler pointed to a recent federal report from the Commission to Eliminate Child Abuse and Neglect Fatalities, which trumpeted predictive analytics as a way to eliminate child fatalities by aggressively pursuing action involving "high risk" children. The commission recommended the national use of predictive analytics. And private companies are surely looking to get in on the action: One recent study projected the global predictive analytics market (of which US child services is obviously just a tiny fraction) will grow from $2.74 billion in 2015 to $9.2 billion by 2020.

When it comes to protecting kids, the technology is still in its infancy, but there are some hints as to where it might go in the near future. In a PowerPoint presentation given to community members last month, Allegheny County officials demonstrated some possible uses of predictive analytics. One of them was a tool where every child born in the county was given a "needs" score at birth, based on their family history, race, and several other factors. These scores would then put the children on the radar of the Human Services department, which might offer voluntary services at birth or proactively reach out to families in the months following. To communities that have long endured extensive oversight by child welfare bureaucracies, the very idea of putting a "needs" score on a newborn is cause for alarm—especially if one of the factors determining that score is race.

"At so many of the decision points, the race of the child ends up mattering in terms of whether the family is going to be brought to court or a child is placed in foster care," said Kathleen Creamer, an attorney who works with families navigating with the child welfare system in Philadelphia. "If we're saying African American children are way more likely to end up in foster care, that means when caseworkers do a safety assessment at a home, they're just going to presume they're at a higher risk—mostly because the decision-making at the front end of these cases reflects such an implicit bias."

Proponents in Allegheny County insist they know the dangers here, and that the new technology should be given a chance to work. "I understand that people are afraid that if someone has a high score, we're going out to remove these kids. That is not the intent," said Cherna, the county official. "We are training that that doesn't happen."

But longtime advocates for marginalized kids and their families say the writing is on the wall.

"No caseworker wants to be on the front page of the newspaper as the [one] who overruled the computer if something goes wrong with that family," Wexler said.

Follow Max Rivlin-Nadler on Twitter.