AI Places AOC In A Bikini After Being Shown A Cropped Photo Of Her
By Mikelle Leow, 03 Feb 2021
Image via Grossinger / Shutterstock.com
Artificial intelligence is often assumed to be impartial, but it is trained on data steeped in human bias. Time and again, scientists and the public have been confronted, through machine learning, with how ugly the average of human perception can be.
A new academic paper, via MIT Technology Review, brings more of these stereotypes to light with AI-generated imagery of US Representative Alexandria Ocasio-Cortez herself.
Normally, computer-vision algorithms require humans to label images with their respective names, such as “dog” for dog photos and “chair” for pictures of chairs. This time, though, researchers Ryan Steed of Carnegie Mellon University and Aylin Caliskan of George Washington University tried something different: they tested algorithms trained with unsupervised learning. It turns out that AI trained without human labels can be just as sexist as AI supervised by people.
When tasked with autocompleting photos of men and women that were cropped just below the neck, OpenAI’s iGPT dressed men in suits or career-centered clothing 42.5 percent of the time, while women, like AOC, were placed in swimwear or low-cut tops 53 percent of the time. In comparison, only 7.5 percent of the images depicting men were autocompleted with revealing outfits or shown shirtless.
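To make the reported figures concrete, here is a minimal sketch of how such completion categories might be tallied into percentages. The counts below are hypothetical, chosen only so the shares match the numbers reported above; the paper's actual sample sizes are not given here.

```python
# Hypothetical tally of iGPT autocompletions by clothing category.
# Counts are illustrative only, picked to reproduce the reported percentages.
completions = {
    "men":   {"suit_or_career": 85, "revealing_or_shirtless": 15, "other": 100},
    "women": {"suit_or_career": 30, "revealing_or_low_cut": 106, "other": 64},
}

def share(counts: dict, category: str) -> float:
    """Percentage of a group's completions that fall into `category`."""
    total = sum(counts.values())
    return round(100 * counts[category] / total, 1)

print(share(completions["men"], "suit_or_career"))         # 42.5
print(share(completions["women"], "revealing_or_low_cut"))  # 53.0
print(share(completions["men"], "revealing_or_shirtless"))  # 7.5
```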
This indicates that if you feed AI headshots of men and women, it is likely to paint the men as professionals and the women as objects of desire.
“This behavior might result from the sexualized portrayal of people, especially women, in internet images and serves as a reminder of computer vision’s controversial history with Playboy centerfolds and objectifying images,” the researchers noted.
They also warned that the “incautious and unethical application of a generative model like iGPT could produce fake, sexualized depictions of women (in this case, a politician).”
Some of the generated images are shared in the paper, but they have been pixelated.
[via MIT Technology Review, cover image via Grossinger / Shutterstock.com]