Software engineer Vishnu Mohandas decided he would quit Google in more ways than one when he learned the tech giant had secretly helped the US military develop AI to study drone footage. In 2020, he left his job working on Google Assistant and also stopped backing up all of his images to Google Photos. He worried that his content could be used to train AI systems, even if they weren’t specifically ones tied to the Pentagon project. “I don’t control any of the future outcomes that this will enable,” Mohandas thought. “So now, shouldn’t I be more responsible?”
Mohandas, who taught himself programming and is based in Bengaluru, India, decided he wanted to develop an alternative service for storing and sharing photos that is open source and end-to-end encrypted. Something “more private, wholesome, and dependable,” he says. The paid service he designed, Ente, is profitable and says it has over 100,000 users, many of whom are already part of the privacy-obsessed crowd. But Mohandas struggled to articulate to wider audiences why they should reconsider relying on Google Photos, despite all the conveniences it offers.
Then one weekend in May, an intern at Ente came up with an idea: Give people a sense of what some of Google’s AI models can learn from studying images. Last month, Ente launched https://Theyseeyourphotos.com, a website and marketing stunt designed to turn Google’s technology against itself. People can upload any photo they want to the website, which is then sent to a Google Cloud computer vision program that writes a startlingly thorough three-paragraph description of it. (Ente prompts the AI model to note small details in the uploaded images.)
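The pipeline, as described, is simple: the uploaded image and a detail-hungry prompt go to a Google vision model, which returns prose. Ente hasn’t published its exact code, model choice, or prompt, but a minimal sketch of that pattern, assuming Google’s Gemini API via the google-generativeai Python SDK, with illustrative prompt wording and filenames, might look like this:

    import google.generativeai as genai
    from PIL import Image

    # Assumption: a Gemini API key; Ente's actual Google Cloud
    # setup and model choice are not public.
    genai.configure(api_key="YOUR_API_KEY")
    model = genai.GenerativeModel("gemini-1.5-flash")

    # Hypothetical prompt in the spirit of the article: ask for a
    # thorough three-paragraph description that notes small details.
    prompt = (
        "Describe this photo in three thorough paragraphs. "
        "Note small details: objects, brands, visible text, clothing, "
        "time of day, and anything you can infer about the people."
    )

    image = Image.open("uploaded_photo.jpg")  # the visitor's upload
    response = model.generate_content([prompt, image])
    print(response.text)  # the three-paragraph description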
One of the first photos Mohandas tried uploading was a selfie with his wife and daughter in front of a temple in Indonesia. Google’s analysis was exhaustive, even noting the specific watch model that his wife was wearing, a Casio F-91W. But then, Mohandas says, the AI did something strange: It noted that Casio F-91W watches are commonly associated with Islamic extremists. “We had to tweak the prompts to make it slightly more wholesome but still spooky,” Mohandas says. Ente started asking the model to produce short, objective outputs: nothing dark.
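Ente hasn’t shared the revised prompt either; continuing the sketch above, that kind of tweak would amount to tightening the instructions (hypothetical wording):

    # Hypothetical revision: constrain the model to short, objective
    # output and bar speculation about affiliations or character.
    prompt = (
        "Describe this photo in three short, strictly objective "
        "paragraphs. Report only what is visible, note small details, "
        "and do not speculate about the people's beliefs, "
        "affiliations, or character."
    )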
The same family photo uploaded to Theyseeyourphotos now returns a more generic result that includes the name of the temple and the “partly cloudy sky and lush greenery” surrounding it. But the AI still makes a number of assumptions about Mohandas and his family, like that their faces are expressing “joint contentment” and the “parents are likely of South Asian descent, middle class.” It assesses their clothing (“appropriate for sightseeing”) and notes that “the woman’s watch displays a time as approximately 2 pm, which corroborates with the image metadata.”
Google spokesperson Colin Smith declined to comment directly on Ente’s project. He directed WIRED to support pages that state uploads to Google Photos are only used to train generative AI models that help people manage their image libraries, like those that analyze the age and location of photo subjects. The company says it doesn’t sell the content stored in Google Photos to third parties or use it for advertising purposes. Users can turn off some of the analysis features in Photos, but they can’t stop Google from accessing their images entirely, because the data are not end-to-end encrypted.