Artificial Intelligence (AI) can now accurately identify a person’s sexual orientation by analyzing photos of their face, according to new research.
The Stanford University study, which is set to be published in the Journal of Personality and Social Psychology and was first reported in The Economist, found that machines had a far superior “gaydar” when compared to humans.
The machine intelligence tested in the research could correctly distinguish between gay and straight men 81 percent of the time, and between gay and straight women 74 percent of the time. In contrast, human judges performed far worse than the software, correctly identifying the orientation of men 61 percent of the time and of women 54 percent of the time.
The research has prompted critics to question the possible use of this type of machine intelligence, both in terms of the ethics of facial-detection technology and whether it could be used to violate a person’s privacy.
Michal Kosinski and Yilun Wang, the lead researchers of the study, suggested the software was able to detect subtle differences in facial structure between gay and straight people and could therefore infer their sexual orientation with accuracy.
‘Threat to safety and privacy’
The Stanford University researchers found that gay men and women typically had “gender-atypical” features and expressions. A person’s “grooming style” also factored into the computer algorithm, essentially suggesting that gay women appeared more masculine and vice versa.
When the AI reviewed five images of a person’s face, rather than one, the results were even more convincing – 91 percent of the time with men and 83 percent of the time with women.
The paper indicated its findings showed “strong support” for the theory that a person’s sexual orientation stems from exposure to certain hormones before birth. The AI’s success rate in comparison to human judges also appeared to back the notion that female sexual orientation is more fluid.
The researchers behind the study argued that, with the appropriate data sets, similar AI tests could spot other personal traits such as an individual’s IQ or even their political views. However, Kosinski and Wang also warned of the potentially dangerous ramifications such systems could have for the LGBT community.
“Given that companies and governments are increasingly using computer vision algorithms to detect people’s intimate traits, our findings expose a threat to the privacy and safety of gay men and women,” Kosinski and Wang said in the report.