Google AI Summaries Will Confidently "Explain" Fabricated Terms


Google’s AI Overviews Exhibit a Comical Tendency to Fabricate Meanings for Nonexistent Idioms

Google’s AI Overviews sometimes act like someone who is hopelessly lost but refuses to ask for directions: confidently wrong rather than admitting they don’t know.

This quirk has become a source of entertainment online, as users have discovered that typing any invented idiom into Google followed by the word “meaning” often prompts the AI to produce a confident, and entirely fictional, explanation. It’s a classic case of AI hallucination, in which the system invents information to fill gaps even when no supporting data exists.

The trend has taken off across social media. People are probing the limits of Google’s AI by feeding it nonsense phrases such as “You can’t lick a badger twice” and watching it produce a plausible-sounding interpretation. SEO specialist Lily Ray has dubbed the behavior “AI-splaining.”

Historian Greg Jenner joined in, sharing his own fabricated idiom along with the AI’s earnest attempt to explain it. Others chimed in with jokes of their own, including Dan Olson, who remarked, “Incredible technology, thrilled society invested a trillion dollars on this instead of sidewalks.”

Another user, Crab Man, turned it into a game: “New game for everyone: ask Google what a fictitious phrase means.”

Inspired by the trend, I gave it a try. When I searched for the meaning of “don’t give me homemade ketchup and tell me it’s the good stuff,” Google’s AI declined to answer, saying the feature wasn’t available. But when I typed “you can’t shake hands with an old bear,” the AI confidently interpreted it as a warning about trusting unreliable people.

In this case, the AI’s tendency to make things up is more funny than harmful. But the same behavior has led to more serious mistakes, such as misstating NFL overtime rules or, in earlier versions, recommending that people eat rocks or put glue on pizza. Some hallucinations, especially in response to health-related queries, can be genuinely dangerous.

Google does include a disclaimer noting that AI Overviews may contain inaccuracies, but that hasn’t stopped them from appearing prominently in search results.

So, as a freshly minted idiom might put it: beware of searching with AI; what you find may well be made up.