
Fri Apr 29, 2022, 10:37 AM

An algorithm that screens for child neglect raises concerns

Inside a cavernous stone fortress in downtown Pittsburgh, attorney Robin Frank defends parents at one of their lowest points – when they are at risk of losing their children.

The job is never easy, but in the past she knew what she was up against when squaring off against child protective services in family court. Now, she worries she’s fighting something she can’t see: an opaque algorithm whose statistical calculations help social workers decide which families will have to endure the rigors of the child welfare system, and which will not.

“A lot of people don’t know that it’s even being used,” Frank said. “Families should have the right to have all of the information in their file.”

From Los Angeles to Colorado and throughout Oregon, as child welfare agencies use or consider tools similar to the one in Allegheny County, Pennsylvania, an Associated Press review has identified a number of concerns about the technology, including questions about its reliability and its potential to harden racial disparities in the child welfare system. Related issues have already torpedoed some jurisdictions’ plans to use predictive models, such as the tool notably dropped by the state of Illinois.

According to new research from a Carnegie Mellon University team obtained exclusively by AP, Allegheny’s algorithm in its first years of operation showed a pattern of flagging a disproportionate number of Black children for a “mandatory” neglect investigation, when compared with white children. The independent researchers, who received data from the county, also found that social workers disagreed with the risk scores the algorithm produced about one-third of the time.

https://apnews.com/article/child-welfare-algorithm-investigation-9497ee937e0053ad4144a86c68241ef1
____________________________________________________________________________________
PLEASE read the linked article. The whole thing. To paraphrase "Shane", an algorithm is a tool; it's as good or as bad as the person using it. It also should not be kept secret, nor should the "results".

2 replies, 759 views


Replies to this discussion thread:
An algorithm that screens for child neglect raises concerns (Original post) — Jilly_in_VA, Apr 2022
#1 — no_hypocrisy, Apr 2022
#2 — Phoenix61, Apr 2022

Response to Jilly_in_VA (Original post)

Fri Apr 29, 2022, 10:43 AM

1. I do child protection defense work.

I had a client who almost lost her children solely for being poor. The children were dazzled by the affluence of their foster parents and wanted their mother to let them be adopted. It was a CF but we got the kids back.



Response to Jilly_in_VA (Original post)

Fri Apr 29, 2022, 11:44 AM

2. A wonderful way to make underfunding CPS seem logical.

I’m reminded of the saying, “If you can’t dazzle them with brilliance, baffle them with bullshit.” I’ve worked as a CPS investigator, and there’s a huge difference between a family that is struggling to feed their children and one that chooses not to. There’s no way to “algorithm” your way to knowing who is who. They say money can’t solve everything, but it sure can solve some things.
