Though often assumed to be impartial, artificial intelligence learns from humans, who aren’t perfect role models themselves. This has spawned numerous incidents where AI has seemingly replicated biases like the “male gaze.”
Last September, Twitter’s so-called “intelligent” photo-cropping algorithm was criticized for prioritizing white people over Black people. The AI’s “bias” was brought to users’ attention through tweets featuring both former US president Barack Obama and former Senate Majority Leader Mitch McConnell. Time and again, the algorithm cropped the images to reveal only McConnell’s face.
To creators’ glee, Twitter recently ditched the cropping feature in its mobile feeds, showcasing full-size images instead. It has now admitted that part of the reason cropped image previews were removed is that, in its research, the algorithm had indeed discriminated based on race and gender.
Twitter’s software engineering director Rumman Chowdhury explained in a blog post that the photo-cropping AI was trained on “human eye-tracking data” indicating “how the human eye looks at a picture as a method of prioritizing what’s likely to be most important to the most people.” It would then crop each image around whatever it predicted to be a visual “priority.”
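At a high level, saliency-based cropping picks a window around the point the model scores as most likely to draw the eye. The sketch below assumes a saliency map already computed by such a model; the function and its signature are illustrative, not Twitter’s actual code:

```python
import numpy as np

def crop_around_max_saliency(image: np.ndarray,
                             saliency: np.ndarray,
                             crop_h: int,
                             crop_w: int) -> np.ndarray:
    """Return a (crop_h, crop_w) window centered on the most salient pixel."""
    # Locate the highest-scoring point in the (assumed precomputed) saliency map.
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    # Clamp the window so it stays fully inside the image bounds.
    top = min(max(y - crop_h // 2, 0), image.shape[0] - crop_h)
    left = min(max(x - crop_w // 2, 0), image.shape[1] - crop_w)
    return image[top:top + crop_h, left:left + crop_w]
```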
To check for bias quantitatively, the social network then experimented with “randomly linked images” portraying people of different races and genders.
It found that the AI showed:

- an “8% difference from demographic parity in favor of women” when shown images of men and women
- a “4% difference from demographic parity in favor of white individuals” in comparison with Black people
- a “7% difference from demographic parity in favor of white women” across images of Black and white women
- a “2% difference from demographic parity in favor of white men” in pictures of Black and white men
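Read plainly, a “difference from demographic parity” means the crop landed on one group more often than the 50% expected if the algorithm were indifferent between the two. A minimal sketch of that measurement, assuming parity is defined as a 50/50 selection rate across paired images (the data and exact formula here are illustrative, not Twitter’s published methodology):

```python
def parity_gap(winners: list[str], group: str) -> float:
    """Deviation of a group's crop-selection rate from the 50%
    expected under demographic parity, in percentage points."""
    rate = winners.count(group) / len(winners)
    return (rate - 0.5) * 100

# Illustrative only: if, across 1,000 paired images of men and women,
# the crop centered on the woman 580 times, the gap is 8 points.
winners = ["woman"] * 580 + ["man"] * 420
print(round(parity_gap(winners, "woman"), 1))  # 8.0
```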
Twitter also tested whether the photo-cropping algorithm was susceptible to the “male gaze,” zeroing in on a woman’s chest or legs instead of her face. This experiment, however, didn’t turn up significant results.
The company noted that “for every 100 images per group, about three cropped at a location other than the head,” and “when images weren’t cropped at the head, they were cropped to non-physical aspects of the image, such as a number on a sports jersey.”
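A check like that presumably boils down to comparing each crop point against an annotated head region. A rough sketch of such a tally, with the box format and annotations assumed purely for illustration:

```python
# Assumed box format: (top, left, bottom, right) in pixel coordinates.
def non_head_crop_rate(crop_centers, head_boxes) -> float:
    """Fraction of crop centers that fall outside the annotated head box."""
    misses = sum(
        not (top <= y <= bottom and left <= x <= right)
        for (y, x), (top, left, bottom, right) in zip(crop_centers, head_boxes)
    )
    return misses / len(crop_centers)

# A result near 0.03 would match the reported "about three per 100 images".
```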
Nevertheless, the overall statistics were compelling enough for Twitter to remove the photo-cropping feature and display full-size images instead.
Now, prior to posting their tweets, users will also be shown a preview of how their images might appear when published.
[via PetaPixel, images via various sources]