
Nevilledog

(51,120 posts)
Sun Aug 21, 2022, 05:24 PM Aug 2022

A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal.



Tweet text:

Timothy Noah
@TimothyNoah1
Corporations and governments are making erroneous decisions by algorithm with no, or minimal, feedback loop. This extends well beyond child porn allegations. We’ve the same dynamic with people accused erroneously of welfare fraud or mortgage nonpayment.
Mark with his son this month.
nytimes.com

A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal.
Google has an automated tool to detect abusive images of children. But the system can get it wrong, and the consequences are serious.
2:16 PM · Aug 21, 2022


https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html

No paywall
https://archive.ph/0Rj7l

Mark noticed something amiss with his toddler. His son’s penis looked swollen and was hurting him. Mark, a stay-at-home dad in San Francisco, grabbed his Android smartphone and took photos to document the problem so he could track its progression.

It was a Friday night in February 2021. His wife called an advice nurse at their health care provider to schedule an emergency consultation for the next morning, by video because it was a Saturday and there was a pandemic going on. The nurse said to send photos so the doctor could review them in advance.

Mark’s wife grabbed her husband’s phone and texted a few high-quality close-ups of their son’s groin area to her iPhone so she could upload them to the health care provider’s messaging system. In one, Mark’s hand was visible, helping to better display the swelling. Mark and his wife gave no thought to the tech giants that made this quick capture and exchange of digital data possible, or what those giants might think of the images.

With help from the photos, the doctor diagnosed the issue and prescribed antibiotics, which quickly cleared it up. But the episode left Mark with a much larger problem, one that would cost him more than a decade of contacts, emails and photos, and make him the target of a police investigation. Mark, who asked to be identified only by his first name for fear of potential reputational harm, had been caught in an algorithmic net designed to snare people exchanging child sexual abuse material.

*snip*

4 replies
A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal. (Original Post) Nevilledog Aug 2022 OP
All medical stuff should be going through an encrypted portal (via https). usonian Aug 2022 #1
The article indicated that he sent the photos to his wife Ms. Toad Aug 2022 #3
Yes, I should have de-emphasized the hospital portal. They know better. usonian Aug 2022 #4
Just one of the various hazards of telemedicine - one of the big reasons... 3catwoman3 Aug 2022 #2

usonian

(9,813 posts)
1. All medical stuff should be going through an encrypted portal (via https).
Sun Aug 21, 2022, 05:33 PM
Aug 2022

Perhaps the medical office had a regular email address (HIPAA not amused), or the patient was using Gmail, or was automatically uploading photos to Google storage.

This is not right in the specifics.

In the general case, algorithms are garbage IMNSHO.

Ask your below-.500 San Francisco Giants.

Ms. Toad

(34,074 posts)
3. The article indicated that he sent the photos to his wife
Sun Aug 21, 2022, 10:32 PM
Aug 2022

to upload.

Not to mention that the problem arises if you allow your photos to be stored in the cloud, since scanning happens at the time of storage.

It wasn't the transmission to the medical office that triggered the action.

usonian

(9,813 posts)
4. Yes, I should have de-emphasized the hospital portal. They know better.
Sun Aug 21, 2022, 10:53 PM
Aug 2022

As of iOS 15, Apple is scanning your photos on the device to match signatures of CP. But only if you use iCloud Photos.

The algorithm is opaque, as is the database. You can't get either. So the chances of a false positive are a matter of "faith in algorithms."

Probably the same for macOS 15 -- they changed the numbering system there.
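To make the "faith in algorithms" point concrete: Apple's system reportedly uses NeuralHash, a neural-network-based perceptual hash, which is not public. The toy sketch below is NOT that algorithm; it's a much simpler "average hash" over an 8x8 grayscale image (all data made up), but it shows the general idea of perceptual matching: small edits leave the hash unchanged, which is exactly why near-matches, and in principle false positives, can occur.

```python
# Toy "average hash" over an 8x8 grayscale image, given as a flat
# list of 64 pixel values (0-255). This is an illustration only,
# not Apple's NeuralHash.
def average_hash(pixels):
    avg = sum(pixels) / len(pixels)
    # Each bit records whether a pixel is brighter than the mean.
    return [1 if p > avg else 0 for p in pixels]

def hamming(h1, h2):
    # Number of differing bits; small distance = "match".
    return sum(a != b for a, b in zip(h1, h2))

# Fabricated "images": img_b is img_a slightly brightened,
# img_c is an unrelated (inverted) image.
img_a = [10 * (i % 16) for i in range(64)]
img_b = [min(255, p + 5) for p in img_a]
img_c = [255 - p for p in img_a]

ha, hb, hc = average_hash(img_a), average_hash(img_b), average_hash(img_c)
print(hamming(ha, hb))  # → 0 (near-duplicate: would be flagged as a match)
print(hamming(ha, hc))  # → 64 (unrelated image)
```

The brightness shift doesn't change the hash at all, which is the desired robustness; the flip side is that you are trusting an opaque model and an opaque database to decide how close is "too close."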

And of course, cloud providers not only check for pr0n, they also checksum copyrighted material. Don't ask me more, because there's the chance that you actually bought the material. Oh, I recall that you can have it but can't share it, or link someone to it.
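Checksumming known files is the simpler of the two techniques: an exact cryptographic hash is compared against a blocklist. A minimal sketch (the sample bytes and blocklist entries are made up; real services use large curated hash databases):

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known files. One entry is
# the digest of our sample "file" so the demo finds a match.
sample = b"some copyrighted bytes"
blocklist = {
    hashlib.sha256(sample).hexdigest(),
    "0" * 64,  # placeholder digest
}

def is_known(data: bytes) -> bool:
    # Exact match only: changing even one byte produces a different digest.
    return hashlib.sha256(data).hexdigest() in blocklist

print(is_known(sample))         # → True  (byte-for-byte identical)
print(is_known(sample + b"!"))  # → False (one byte appended)
```

Unlike the perceptual hashing used for images, this exact matching has essentially no false positives, but it is trivially defeated by any modification to the file, which is why image scanning uses fuzzier methods.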

I've been looking for end-to-end encrypted storage without using cryptomator or equivalent. They exist. I am just a bit allergic to spending money on hosting of anything. I ran computer rooms, and like a hands-on computer setup.

Trust nobody. And trust yourself only as much as your skill and diligence level merit.
