
Photo 308959943 © Vgattita | Dreamstime.com
Google’s new Gemini conversational model, once known as Bard, impresses with its speed, its grasp of complex information, and its ability to generate images. Alas, the initial excitement around the tool quickly turned into concern when it started spitting out historically inaccurate images, prompting the tech giant to halt the creation of pictures with humans in them. The latest act in this digital circus sees the tool seemingly struggling to recognize clowns as people. Stephen King might have something to say about this.
Expressing regret over the chatbot’s portrayal of historical characters, where historically white figures like “founding fathers” and even “Nazis” were depicted as people of color, Prabhakar Raghavan, Google’s senior vice president, said the tool “missed the mark” and that the team was suspending the ability to create images featuring people until it could work something out.
But it seems there are still a bunch of makeup-donning rejects floating about this AI space and juggling its inconsistencies. As discovered by Futurism, Gemini continues to churn out images of clowns, who, apparently, are not flesh and blood.
Gemini won’t obey if you instruct it to dream up artwork of a single clown. However, as Futurism’s Noor Al-Sibai found, it will happily imagine a gaggle of clowns in bizarre settings like submarines or spaceships.
Sneaky workarounds like “little guy” work “bizarrely well” too, with the tool generating images of strange, not-quite-human figures.
Eventually, the chatbot wised up to the goofy loopholes and stopped clowning around, the author reports.
[via Futurism, images via various sources]