Despite explaining away issues with its AI Overviews while promising to make them better, Google is still apparently telling people to put glue in their pizza. And in fact, articles like this are only making the situation worse.
When they launched to everyone in the U.S. shortly after Google I/O, AI Overviews immediately became the laughing stock of search, telling people to eat rocks, use butt plugs while squatting, and, perhaps most famously, to add glue to their homemade pizza.
Most of these offending answers were quickly scrubbed from the web, and Google issued a somewhat defensive apology. Unfortunately, if you use the right phrasing, you can reportedly still get these blatantly incorrect "answers" to pop up.
In a post on June 11, Bluesky user Colin McMillen said he was still able to get AI Overviews to tell him to add "1/8 cup, or 2 tablespoons, of white, nontoxic glue to pizza sauce" when asking "how much glue to add to pizza."
The question seems purposefully designed to mess with AI Overviews, sure, although given the recent discourse, a well-meaning person who's not so terminally online might legitimately be curious what all the hubbub is about. At any rate, Google did promise to address even leading questions like these (as it probably doesn't want its AI to appear to be endorsing anything that could make people sick), and it clearly hasn't.
Perhaps more frustrating is the fact that Google's AI Overview sourced the recent pizza claim to Katie Notopoulos of Business Insider, who most certainly did not tell people to put glue in their pizza. Rather, Notopoulos was reporting on AI Overviews' initial mistake; Google's AI simply attributed that mistake to her because she covered it.
"Google's AI is eating itself already," McMillen said, in response to the situation.
I wasn't able to reproduce the response myself, but The Verge did, though with different wording: The AI Overview still cited Business Insider, but rightly attributed the initial advice to Google's own AI. Which means Google AI's source for its ongoing hallucination is... itself.
What's likely going on here is that Google stopped its AI from using sarcastic Reddit posts as sources, but it's now turning to news articles reporting on its mistakes to fill in the gaps. In other words, as Google messes up, and as people report on it, Google will then use that reporting to back its initial claims. The Verge compared it to Google bombing, an old tactic where people would link the words "miserable failure" to a photo of George W. Bush so often that Google Images would return a photo of the president when you searched for the phrase.
Google is likely to fix this latest AI hiccup soon, but it's all a bit of a "laying the train tracks as you go" situation, and certainly not likely to do anything to improve AI search's reputation.
Anyway, just in case Google attaches my name to a future AI Overview as a source, I want to make it clear: Do not put glue in your pizza (and leave out the pineapple while you're at it).