General Discussion
A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal.
Link to tweet
Timothy Noah
@TimothyNoah1
Corporations and governments are making erroneous decisions by algorithm with no, or minimal, feedback loop. This extends well beyond child porn allegations. We've the same dynamic with people accused erroneously of welfare fraud or mortgage nonpayment.
[Photo caption: Mark with his son this month.]
Google has an automated tool to detect abusive images of children. But the system can get it wrong, and the consequences are serious.
2:16 PM · Aug 21, 2022
https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html
No paywall
https://archive.ph/0Rj7l
Mark noticed something amiss with his toddler. His son's penis looked swollen and was hurting him. Mark, a stay-at-home dad in San Francisco, grabbed his Android smartphone and took photos to document the problem so he could track its progression.
It was a Friday night in February 2021. His wife called an advice nurse at their health care provider to schedule an emergency consultation for the next morning, by video because it was a Saturday and there was a pandemic going on. The nurse said to send photos so the doctor could review them in advance.
Mark's wife grabbed her husband's phone and texted a few high-quality close-ups of their son's groin area to her iPhone so she could upload them to the health care provider's messaging system. In one, Mark's hand was visible, helping to better display the swelling. Mark and his wife gave no thought to the tech giants that made this quick capture and exchange of digital data possible, or what those giants might think of the images.
With help from the photos, the doctor diagnosed the issue and prescribed antibiotics, which quickly cleared it up. But the episode left Mark with a much larger problem, one that would cost him more than a decade of contacts, emails and photos, and make him the target of a police investigation. Mark, who asked to be identified only by his first name for fear of potential reputational harm, had been caught in an algorithmic net designed to snare people exchanging child sexual abuse material.
*snip*
usonian
(9,813 posts)
Perhaps the medical office had a regular email address (HIPAA not amused), or the patient was using Gmail, or automatically uploading photos to Google storage.
This is not right in the specifics.
In the general case, algorithms are garbage IMNSHO.
Ask your below-500 San Francisco Giants.
Ms. Toad
(34,074 posts)
to upload.
Not to mention that the problem arises if you allow your photos to be stored in the cloud, since scanning happens at the time of storage.
It wasn't the transmission to the medical office that triggered the action.
usonian
(9,813 posts)
As of iOS 15, Apple is scanning your photos on the device to match signatures of CSAM. But only if you use iCloud Photos.
The algorithm is opaque, as is the database. You can't get it. So the chances of a false positive are a matter of "faith in algorithms."
Probably the same for macOS 15 -- they changed the numbering system there.
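The matching described above is generally done with perceptual hashes, not exact checksums: similar images produce similar hashes, and a distance threshold decides what counts as a "match." Apple's NeuralHash and Microsoft's PhotoDNA are proprietary, so here is only a toy "dHash"-style sketch of the principle -- the image data and threshold are made up, and the point is that the threshold is exactly where false positives come from:

```python
# Toy perceptual-hash matching. Real systems (NeuralHash, PhotoDNA)
# use far more robust, proprietary hashes; this illustrates only the
# idea: similar images -> similar bit strings, compared by distance.

def dhash(pixels):
    """Difference hash: for each row of a grayscale image, record
    whether each pixel is brighter than its right-hand neighbor."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return tuple(bits)

def hamming(h1, h2):
    """Number of bit positions where two hashes differ."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

# Two tiny 4x5 "grayscale images": the second is the first with
# uniform brightness noise added (pixel order is preserved).
original = [
    [10, 20, 30, 40, 50],
    [50, 40, 30, 20, 10],
    [10, 20, 30, 40, 50],
    [50, 40, 30, 20, 10],
]
near_copy = [[p + 1 for p in row] for row in original]

THRESHOLD = 3  # max Hamming distance still treated as a "match"

d = hamming(dhash(original), dhash(near_copy))
print(d, d <= THRESHOLD)  # 0 True -- the altered copy still matches
```

Loosen THRESHOLD and unrelated images start to collide; tighten it and trivially edited copies slip through. Since the database and the real hash function are both secret, an outsider can't audit where that tradeoff was set.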
And of course, cloud providers not only check for pr0n, they also checksum copyrighted material. Don't ask me more, because there's the chance that you actually bought the material. Oh, I recall that you can have it but can't share it, or link someone to it.
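For copyrighted files, an exact cryptographic checksum is enough, since a byte-identical copy is expected. A minimal sketch of that kind of lookup, with an entirely hypothetical blocklist and policy names (no provider publishes theirs):

```python
# Sketch of flagging known files by exact checksum on upload.
# The blocklist contents and policy strings here are invented.
import hashlib

def sha256_of(data: bytes) -> str:
    """Hex SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of checksums of known copyrighted files.
BLOCKED = {sha256_of(b"some copyrighted file contents")}

def check_upload(data: bytes) -> str:
    """Return a (made-up) policy decision for an uploaded file."""
    if sha256_of(data) in BLOCKED:
        # Matches the forum poster's recollection: you can keep the
        # file, but sharing or linking to it is disabled.
        return "store-but-no-sharing"
    return "ok"

print(check_upload(b"some copyrighted file contents"))  # store-but-no-sharing
print(check_upload(b"my vacation photo"))               # ok
```

Unlike the perceptual hashes used for images, a single flipped byte changes a SHA-256 digest completely, which is why exact checksums only work when providers expect identical copies of the same file.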
I've been looking for end-to-end encrypted storage without using Cryptomator or an equivalent. Such services exist. I'm just a bit allergic to spending money on hosting anything. I ran computer rooms, and I like a hands-on computer setup.
Trust nobody. And trust yourself only as much as your skill and diligence level merit.
3catwoman3
(24,006 posts)
...I'm glad I'm retired.