Child welfare decisions should not be made by computer algorithms

Illinois wisely stopped using algorithms in child welfare cases, but at least 26 states and Washington, D.C., have considered using them, and at least 11 have deployed them. A recent investigation found they are often unreliable and perpetuate racial disparities.

Workers field calls at an intake call screening center for the Allegheny County Children and Youth Services office in Penn Hills, Pa., on Thursday, Feb. 17, 2022.

Keith Srakocic/AP Photos

The power of computers has become essential in all our lives. Computers, and specifically computer algorithms, make nearly everything we do easier.

Simply put, algorithms are nothing more than sets of rules or instructions used by computer programs to streamline processes, from running internet search engines to programming traffic signals and scheduling bus routes. Algorithms influence and help us all in ways we don’t often realize.

However, we must remember that algorithms, like any computer program, are designed by humans and can carry the same biases as the people who designed them. That may be harmless when you are searching Google for the best pizza place in Chicago, but it can be dangerous when algorithms are relied on for serious matters.

Yet several states now rely on algorithms to screen for child neglect, under the guise of “assisting” child welfare agencies that are often overburdened with cases. For the companies selling these tools, it is a market once estimated to be worth $270 million.


Who among us would allow a computer to decide the fate of our children? 

A recent report from the Associated Press and the Pulitzer Center for Crisis Reporting has pointed out several concerns regarding these systems, including that they are not reliable — sometimes missing serious abuse cases — and perpetuate racial disparities in the child welfare system. Both outcomes are exactly what the creators of these systems often profess to combat.

The children and families impacted most by child welfare agencies are largely poor, and largely members of minority groups. Translation: They are the most powerless people in America, which is all the more reason for more privileged citizens to speak up and speak out against using algorithms to make critical decisions in child welfare cases.

In Illinois, the state’s Department of Children and Family Services used a predictive analytics tool from 2015 to 2017 to identify children reported for maltreatment who were most at risk of serious harm or even death. But DCFS ended the program after the agency’s then-director said it was unreliable.

While Illinois wisely stopped using algorithms, at least 26 states and Washington, D.C., have considered using them, and at least 11 have deployed them, according to a 2021 ACLU white paper cited by AP.

The stakes in determining which children are at risk of injury or death could not be higher, and it is vitally important to get this right. It is also important to realize that the same system that determines whether a child is at risk of injury or death often separates families.

It is easy for outsiders to say things like “better safe than sorry.” But it is no small point that once a child or family comes into contact with an investigator, the chance of that child being removed and the family separated increases. Simply put, the road to separation should not be initiated by computers that have proven to be fallible.

The AP report also found that algorithm-based systems flagged a disproportionate number of Black children for mandatory neglect investigations and produced risk scores that social workers disagreed with about one-third of the time.

California pursued predictive risk modeling for two years and spent nearly $200,000 to develop a system, but the state ultimately scrapped it because of questions about racial equity. Even so, three counties in the state are currently using it.

Sadly, the demand for algorithmic tools has only increased since the pandemic. I fear that more and more municipalities will turn to them for child welfare decisions without vetting them for problems, and without investigating potential conflicts of interest involving politicians.

This technology, while no doubt helpful in many aspects of our lives, is still subject to human biases and simply not mature enough to be used for life-altering decisions. Government agencies that oversee child welfare should be prohibited from using algorithms.

Jeffery M. Leving is founder and president of the Law Offices of Jeffery M. Leving Ltd., and is an advocate for the rights of fathers.

Send letters to letters@suntimes.com
