General Discussion
A Scary Demonstration of What Unchecked Face Recognition Could Look Like
Link to tweet
Tweet text:
ACLU
@ACLU
Surveillance video cameras don't just capture a moment in time anymore.
Combined with face recognition and today's search capabilities, these systems can now be used to track not just one person, but to map out people's associations with each other.
aclu.org
A Scary Demonstration of What Unchecked Face Recognition Could Look Like
Video surveillance is becoming far more powerful than most people realize.
11:19 AM · Feb 19, 2022
https://www.aclu.org/news/privacy-technology/a-scary-demonstration-of-what-unchecked-face-recognition-could-look-like
A face recognition and video analytics company has created a product that provides a stark reminder of the power of these technologies and how they are likely to be used over time by law enforcement, powerful corporations, and others, if we as a society allow it.
The technology in question involves video search, which we described in our 2019 video analytics report. In the past, video operators looking for something would have to manually scroll through many hours of video, but technology is increasingly automating such searches. In a presentation for subscribers of the surveillance research group IPVM, a company called Vintra presented its technology for quickly searching through large stores of video footage.
The relevant three-minute part of the full presentation is worth watching. In it, a company executive searches through a month's worth of video footage captured by around 10 fixed cameras, plus body cameras, in a transit center in San Jose, California. He feeds the system the photograph of a male subject, and the system does a face recognition search through all the stored video from that month and produces 23 snapshots of the man from the center's cameras. Clicking on any of the snapshots plays the video in which he was captured.
Already, that's a demonstration of the stunning new power that surveillance camera systems create when combined with face recognition and today's search capabilities.
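Under the hood, a video search like the one described above typically works by detecting faces in stored frames, converting each face to a numeric embedding vector, and comparing the query photo's embedding against the whole index. Below is a minimal sketch of just that matching step; the frame IDs and random vectors are made-up stand-ins for a real face-embedding model, which the original article does not describe.

```python
import math
import random

def cosine_similarity(a, b):
    # Cosine of the angle between two embedding vectors:
    # near 1.0 for the same face, near 0.0 for unrelated faces.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search_faces(query_embedding, frame_index, threshold=0.8):
    """Return (frame_id, similarity) for every stored face embedding
    similar enough to the query, best match first."""
    hits = [(fid, cosine_similarity(query_embedding, emb))
            for fid, emb in frame_index]
    return sorted([h for h in hits if h[1] >= threshold],
                  key=lambda h: -h[1])

# Synthetic demo: one stored face is a noisy copy of the query
# (the "same person"), the other two are unrelated.
rng = random.Random(0)
query = [rng.gauss(0, 1) for _ in range(128)]
index = [
    ("cam1_frame_0042", [x + rng.gauss(0, 0.05) for x in query]),  # same person
    ("cam3_frame_0913", [rng.gauss(0, 1) for _ in range(128)]),    # stranger
    ("cam7_frame_1201", [rng.gauss(0, 1) for _ in range(128)]),    # stranger
]
# The noisy copy scores near 1.0; strangers fall far below the threshold.
print(search_faces(query, index))
```

Real systems replace this linear scan with an approximate nearest-neighbor index so a month of footage from many cameras can be searched in seconds; the privacy concern is precisely that the lookup has become this cheap.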
*snip*
7 replies, 1067 views
Rec (18)
A Scary Demonstration of What Unchecked Face Recognition Could Look Like (Original Post)
Nevilledog
Feb 2022
OP
dalton99a
(81,513 posts)
1. Kick
For law enforcement customers like the Northern California Regional Intelligence Center, the Sacramento City Police Department, and the NYC Department of Investigations, Vintra provides a tailored solution, FulcrumAI Investigator, that's hosted by Amazon Web Services (AWS) and compliant with Criminal Justice Information Services (CJIS), a U.S. program that outlines the steps law enforcement officers are required to take to protect records like fingerprints and criminal backgrounds. Vintra says that Investigator enabled one detective to analyze over 1,500 videos totaling 743 hours in just four hours, and it helped a separate team of investigators suss out a suspect's car and face immediately.
Vintra's services don't come cheap. FulcrumAI Investigator, which includes 500 hours of AI video processing, 5TB of storage, and quarterly algorithm updates, starts at $2,000 per month.
Then there's the elephant in the room: AI video analysis is a hotly debated topic. Researchers at the Massachusetts Institute of Technology published a contested study late last year that found Rekognition, AWS' object detection API, failed to reliably determine the sex of female and darker-skinned faces in specific scenarios, and that's far from the only example of algorithms gone awry. It was recently revealed that a system deployed by London's Metropolitan Police produces as many as 49 false matches for every hit. And during a House oversight committee hearing on facial recognition technologies last year, the U.S. Federal Bureau of Investigation admitted that the algorithms it uses to identify criminal suspects are wrong about 15 percent of the time.
Even Rick Smith, CEO of Axon, one of the largest suppliers of body cameras in the U.S., was quoted this summer as saying that facial recognition isn't yet accurate enough for law enforcement applications.
https://venturebeat.com/2019/03/11/vintra-raises-4-8-million-to-analyze-camera-footage-with-ai/
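The error rates quoted above are worse than they sound because of the base-rate problem: when the person being searched for appears in only a tiny fraction of the footage scanned, even a small false-positive rate swamps the genuine hits. A quick back-of-the-envelope calculation, with all numbers purely illustrative rather than taken from either article:

```python
# Illustrative base-rate sketch: a matcher with a 0.1% false-positive
# rate scanning 1,000,000 stored faces for one person who actually
# appears in only 20 of them.
gallery_size = 1_000_000
true_appearances = 20
false_positive_rate = 0.001   # 1 in 1,000 non-matches flagged anyway
true_positive_rate = 0.95     # 95% of the real appearances are found

false_hits = (gallery_size - true_appearances) * false_positive_rate
true_hits = true_appearances * true_positive_rate
precision = true_hits / (true_hits + false_hits)

print(f"false hits: {false_hits:.0f}, true hits: {true_hits:.0f}")
print(f"precision: {precision:.1%}")
# Roughly 1,000 false hits for 19 real ones: under 2% of flagged
# matches are the actual person, despite "99.9% accuracy".
```

This is the same arithmetic behind the Metropolitan Police figure of 49 false matches per hit: a low per-comparison error rate multiplied by a huge gallery still buries investigators, or worse, innocent bystanders, in false alarms.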
patphil
(6,180 posts)
2. This technology can effectively be used to turn the nation into an open air prison.
Every public move you make could be tracked, and all your contacts could be identified and traced.
Buffalo Springfield sang about this back in the '60s.
keithbvadu2
(36,816 posts)
3. Jan 6?
keithbvadu2
(36,816 posts)
5. Eagle Eye - movie
https://en.wikipedia.org/wiki/Eagle_Eye
mwooldri
(10,303 posts)
4. Could look like? Check out the PRC.
In the PRC, facial recognition is apparently used extensively. They want to track everyone and everything.
Fiendish Thingy
(15,619 posts)
6. Just one more reason to wear a mask when in public...
With sunglasses and a hat.
Duppers
(28,125 posts)
7. China. They do this.