The dark side of AI: Your sexuality can be gleaned from your picture
Researchers have created a machine learning system that can detect people's sexual orientation from photographs.
In a bid to flag a major threat to privacy, researchers at Stanford University have demonstrated how AI can be used to tell whether people are gay or straight from their faces.
As tech leaders continue to warn about the risks of AI-powered "killer robots", the Stanford study reveals a darker side of machine intelligence, showing how it can be used to infer an individual's sexuality simply by analysing facial images.
The research, due to be published in the Journal of Personality and Social Psychology, details a deep neural network, a system that learns to analyse visual features (in this case facial features) from large datasets. The study's authors, Michal Kosinski and Yilun Wang, fed the network a sample of more than 35,000 images posted by men and women on a public dating site.
The system extracted and quantified features from those images, then searched for patterns in the features that correlated with sexual orientation. Although the researchers gave it no preconceived notions about how gay and straight people look, the system proved highly accurate when tested with pairs of pictures, one of a gay person and one of a straight person, and asked to identify which was which.
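As a rough illustration of this kind of pipeline (a sketch, not the authors' actual method), face embeddings produced by a pretrained deep network can be fed to a simple linear classifier. The snippet below substitutes synthetic vectors for real embeddings; every name and parameter in it is an assumption for demonstration only:

```python
# Purely illustrative sketch of the kind of pipeline described above; not the
# study's code. Real face embeddings from a pretrained network are replaced
# here by synthetic vectors with a deliberately weak injected signal.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-ins for deep-network face embeddings: 2,000 vectors of 128 features
# with binary labels.
X = rng.normal(size=(2000, 128))
y = rng.integers(0, 2, size=2000)
X[y == 1] += 0.1  # weak signal so the demo has something to learn

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A simple linear classifier trained on top of the fixed features.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```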
To be specific, it delivered 91% accuracy for men and 83% for women, far better than human judges, who were right only 64% of the time for men and 54% for women. Accuracy fell, however, in a more realistic test in which the pool was expanded to 1,000 men, only 7% of whom were gay.
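That pairwise test, showing the system one face from each group and asking it to pick the gay one, corresponds to a standard metric known as AUC: the probability that a randomly chosen positive example is ranked above a randomly chosen negative one. A minimal sketch with synthetic labels and scores (assumptions, not the study's data) illustrates the equivalence:

```python
# Minimal sketch showing why a "pick the right face out of a pair" test is
# the same measurement as AUC; labels and scores here are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
labels = rng.integers(0, 2, size=500)              # 1 = positive class
scores = labels + rng.normal(scale=1.2, size=500)  # noisy classifier scores

# Pairwise accuracy: fraction of (positive, negative) pairs in which the
# positive example receives the higher score.
pos, neg = scores[labels == 1], scores[labels == 0]
pairwise = (pos[:, None] > neg[None, :]).mean()

print(f"pairwise accuracy: {pairwise:.3f}")
print(f"ROC AUC:           {roc_auc_score(labels, scores):.3f}")  # ~identical
```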
According to a report in the Guardian, the general trend the research found was that gay men appeared more feminine than straight men, and gay women more masculine than straight women. It also noted that gay men tended to have narrower jaws and larger foreheads, while gay women had larger jaws and smaller foreheads.
Many studies have highlighted the bright side of AI, but this one points to a serious threat to individual privacy, especially at a time when social media and other databases store a plethora of images. The implications of such a system are far-reaching; at worst, it could hypothetically be used to compromise the privacy of LGBTQ (lesbian, gay, bisexual, trans and queer) people, who are still oppressed in many parts of the world.
Many would consider documenting such technology wrong, but the researchers have argued that the underlying methods already exist and that it was essential to "make policymakers and LGBTQ communities aware of the risks that they are facing".
"We did not create a privacy-invading tool, but rather showed that basic and widely used methods pose serious privacy threats," they said in an extensive set of authors' notes.