Why Stanford Researchers Tried to Create a Gaydar Machine
The New York Times:
An Israeli start-up had started hawking a service that predicted terrorist proclivities based on facial analysis. Chinese companies were developing facial recognition software not only to catch known criminals but also to help the government predict who might break the law next.
And all around Silicon Valley, where Dr. Kosinski works as a professor at Stanford Graduate School of Business, entrepreneurs were talking about faces as if they were gold waiting to be mined.
Few seemed concerned. So to call attention to the privacy risks, he decided to show that it was possible to use facial recognition analysis to detect something intimate, something people should have full rights to keep private.
After considering atheism, he settled on sexual orientation.
Whether he has now created A.I. gaydar, and whether that's even an ethical line of inquiry, has been hotly debated over the past several weeks, ever since a draft of his study was posted online.
Presented with photos of gay men and straight men, a computer program was able to determine which of the two was gay with 81 percent accuracy, according to the paper by Dr. Kosinski and his co-author, Yilun Wang.