Ann Wrixon

Can Big Data Make Accurate Predictions About Child Abuse?




“Predictive analytics refers to the practice of extracting information from existing data sets and identifying patterns that may help to predict future outcomes.” (p. 4 of the Chapin Hall/Chadwick Center report) Increasingly, child welfare agencies are using these tools to try to determine which children may be in imminent danger of abuse and/or neglect. In September 2018, Chapin Hall at the University of Chicago and the Chadwick Center released a report titled “Making the Most of Predictive Analytics: Responsible and Innovative Uses in Child Welfare Policy and Practice,” which examines this practice, detailing the advantages and problems of using predictive analytics in child welfare and making recommendations for best practices. Many of these recommendations echo those made by the National Council on Crime and Delinquency in its December 2017 report, “Principles for Predictive Analytics in Child Welfare,” and by the Kirwan Institute at Ohio State University in its February 2017 report, “Foretelling the Future: A Critical Perspective on the Use of Predictive Analytics in Child Welfare.”
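
To make that definition concrete, here is a minimal sketch in Python, using entirely invented data and column names, of what such a tool does at its core: it fits a statistical model to historical case records and then produces a risk score for a new referral.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical historical data: each row is a past referral, and
# "substantiated" records whether the case was later confirmed.
history = pd.DataFrame({
    "prior_referrals": [0, 2, 1, 4, 0, 3],
    "child_age":       [1, 3, 7, 2, 10, 5],
    "substantiated":   [0, 1, 0, 1, 0, 1],
})

# "Extract patterns from existing data": fit a simple statistical model.
model = LogisticRegression()
model.fit(history[["prior_referrals", "child_age"]], history["substantiated"])

# "Predict future outcomes": score a new referral. The output is a risk
# estimate derived from past patterns, not a finding of fact.
new_referral = pd.DataFrame({"prior_referrals": [1], "child_age": [2]})
print(model.predict_proba(new_referral)[:, 1])
```

The score is only a probability estimated from past patterns, which is exactly why the quality and fairness of that past data matter so much.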

The reports state that predictive analytics can have a positive impact on child welfare practice, but that its use carries significant dangers, including inaccurate predictions that either fail to protect children or penalize families that are not abusive or neglectful. Most importantly, they all warn that the child welfare system has a well-documented racial bias against families of color, and that building models on past data will encode this bias into the predictive analytics, with devastating effects on families of color. As these reports point out, the formulas do not use race as an explicit factor, but other information, such as zip code and socioeconomic status, can stand in for race and quietly encode the same racial bias into these predictive systems.
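
A small sketch, again in Python with made-up numbers, shows how this proxy effect can work: race never appears as a column, yet a model trained on historically skewed report data and keyed on zip code reproduces the same disparity.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Invented historical data: past reports were concentrated in one zip code,
# reflecting who was reported, not who was actually unsafe.
history = pd.DataFrame({
    "zip_code": ["10001", "10001", "10001", "20002", "20002", "20002"],
    "reported": [1, 1, 1, 0, 0, 1],
})

X = pd.get_dummies(history["zip_code"])      # "race" is never a column...
model = LogisticRegression().fit(X, history["reported"])

# ...but if zip code tracks race, families from the over-reported
# neighborhood inherit systematically higher risk scores.
print(dict(zip(history["zip_code"], model.predict_proba(X)[:, 1])))
```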

Another serious, but often overlooked, problem is the failure to understand that correlation does not mean causation. These systems can surface strong correlations, as one did in New Zealand, which found that children whose families apply for public benefits before the child turns two are more likely to have child welfare involvement later on. That is a correlation: receiving public benefits does not cause child abuse. Misreading it as causation can lead to serious errors at both the individual and the public policy level.
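
A toy simulation in Python, with invented numbers, illustrates the point: when a hidden factor such as economic hardship drives both benefit use and contact with the child welfare system, the two end up correlated even though neither causes the other.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hidden common cause: economic hardship (hypothetical, standardized scale).
hardship = rng.normal(size=n)

# Both outcomes are driven by hardship plus independent noise;
# neither outcome influences the other in this simulation.
benefit_use = hardship + rng.normal(size=n)
cws_contact = hardship + rng.normal(size=n)

# The correlation comes out around 0.5 despite zero causal link between them.
print(np.corrcoef(benefit_use, cws_contact)[0, 1])
```

In this toy setup, reducing benefit use would do nothing to reduce child welfare involvement; only the underlying hardship matters, which is why mistaking the correlation for causation leads to bad decisions.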

Additionally, the reports note that many of the tools currently on the market protect their “intellectual property” by refusing to disclose how their predictions are made. The reports all agree that child welfare agencies should not use these “black-box” systems, because transparency is essential to ensure that the predictions are accurate and fair.
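
To see what transparency buys, here is one more sketch reusing the kind of toy model above (Python, with invented feature names): when the model is open, every weight that drives a score can be read, questioned, and audited.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Invented historical records; feature names are illustrative only.
history = pd.DataFrame({
    "prior_referrals": [0, 2, 1, 4, 0, 3],
    "benefit_receipt": [0, 1, 1, 1, 0, 0],
    "substantiated":   [0, 1, 0, 1, 0, 1],
})
features = ["prior_referrals", "benefit_receipt"]
model = LogisticRegression().fit(history[features], history["substantiated"])

# With an open model, each weight can be inspected and audited for bias,
# the kind of scrutiny a "black-box" product forecloses.
print(dict(zip(features, model.coef_[0])))
```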
