Lensa AI Selfie Editing App Is ‘Undressing’ Users Against Their Will
By Mikelle Leow, 14 Dec 2022
You’ve probably noticed that more people are “commissioning” portraits of themselves. Most of those pictures were generated, not painted, by an AI photo editor called Lensa. The app, owned by developer Prisma Labs, topped Apple’s App Store charts after it introduced its new ‘Magic Avatars’ AI art maker, and its results have exploded around the internet.
Since going viral, Lensa has gotten its fair share of flak, too. There's the concern that the model is trained on human artists' work without their permission. Plus, since the app requires 10 to 20 photos of the same subject to learn what a person looks like before it can generate digital "selfies," there's no ignoring the question of where those images are stored. Prisma Labs, however, notes in its privacy policy that all photos "are automatically deleted within 24 hours after being processed by Prisma."
Now, the app is being called out for digitally stripping off women's clothes, a side effect of the biases baked into the datasets its model is trained on.
When MIT Technology Review’s Melissa Heikkilä tested the popular new tool, she expected to receive cool results like the ones depicting her male coworkers. The author had the artificial intelligence create 100 digital portraits; but instead of being cast as a warrior or an astronaut like the others, she was sent nudes. Sixteen of the avatars portrayed her topless, while another 14 placed her in skimpy clothing and “overtly sexualized” poses. In some images, her avatar appeared to be crying.
Heikkilä, who has Asian heritage, noted that the resulting images were clearly inspired by anime and video game characters, and even by women in pornography.
I tried the viral Lensa AI portrait app, and got lots and lots of nudes. I know AI image generation models are full of sexist and racist biases, but this one really hit home. My latest story for @techreview https://t.co/AA1cyFFnU1
— Melissa Heikkilä (@Melissahei) December 12, 2022
It wasn’t too long ago that Heikkilä gave the first version of the open-source Stable Diffusion model a try. Back in September, the machine-learning model churned out “almost exclusively porn” when she entered keywords such as “Asian” into its system. So it wasn’t too surprising to her that Lensa, which is built on Stable Diffusion, retains these signs of Asian fetishization.
According to the reporter, another colleague of Chinese descent who ran her selfies through the app was shown similarly sexualized depictions. Heikkilä says the app’s fetishization of Asian women was so pronounced that it produced female nudes even when she instructed it to render her as male.
In contrast, a white coworker received “significantly fewer” NSFW images, though she too wasn’t spared from unsolicited naked pictures.
Separately, the startup has also been lambasted for inadvertently enabling the creation of sexually explicit images of both adults and children, even though it openly prohibits the generation of nudes. Prisma Labs’ CEO and co-founder Andrey Usoltsev told TechCrunch that such results can only appear if a user intentionally manipulates the app into generating them, which would violate its guidelines. Usoltsev asserted that “any tool” can be turned into a weapon in the wrong hands.
Although Prisma Labs promises to keep more stereotypes from being perpetuated in its avatars moving forward, it hasn’t explained how it will do so.
“Please note, occasional sexualization is observed across all gender categories, although in different ways,” Prisma Labs stresses.
“[Stable Diffusion] was trained on unfiltered Internet content. So it reflects the biases humans incorporate into the images they produce. Creators acknowledge the possibility of societal biases. So do we.”
Why is this AI portrait app so eager to remove my clothes?!
— Catherine Allen (@_CatherineAllen) December 10, 2022
With Lensa, you upload 20 selfies and it generates dozens of unique artistic portraits of you. I was fully clothed in all the pics I uploaded (obv!) but some pictures it generated of me were nudes pic.twitter.com/rwOvPmrpFd
I was fully clothed in all of my photos. And I have arms pic.twitter.com/Uzmn3O46zj
— Butter Bumps (@Butter_bumps) December 9, 2022
Is it just me or are these AI selfie generator apps perpetuating misogyny? Here’s a few I got just based on my photos of my face. pic.twitter.com/rUtRVRtRvG
— Brandee Barker (@brandee) December 3, 2022
Did a little experiment with Lensa’s ai generated portraits, which are giving me and every woman I know massive chests. I generated selecting “female” multiple times and got variations of these
— sophia goldberg (@sophgoldb) December 6, 2022
Some are fully nude (sans nipples??) and NSFW, it’s pretty uncomfortable pic.twitter.com/J6lR5HfbfU
[via MIT Technology Review and The Guardian, images via various sources]