Hallucinations are real: Google’s AI search tool is telling users to eat glue and rocks

Are ‘AI Overviews’ Search Results Making Things Up? The “Glue on Pizza” Problem

AI Overviews are AI-generated snapshots that appear at the top of Google Search results. They are meant to help users search faster by summarising baseline information about a topic, along with links for learning more, such as the difference between dark and medium roast coffee.

  • Shortly after launch, users shared bizarre and incorrect answers from Google’s new ‘AI Overviews’ feature.
  • Viral examples included a suggestion to add “non-toxic glue” to pizza sauce and a claim that astronauts had met cats on the moon.
  • The incidents highlighted the ongoing “hallucination” problem with LLMs when they are deployed at massive scale.

Google’s big AI push hit a snag when social media was flooded with screenshots of its AI Overviews feature giving dangerously silly advice.

The model, it turned out, was pulling from satirical content and old forum jokes as if they were facts, a classic LLM hallucination.

Some of the answers appeared to be based on Reddit comments or articles written by the satirical site The Onion.

A Google spokesperson said they were “isolated examples”.

The responses were widely mocked on social media.

But Google insisted the feature was generally working well.

“The examples we’ve seen are generally very uncommon queries, and aren’t representative of most people’s experiences,” it said in a statement.

“The vast majority of AI overviews provide high quality information, with links to dig deeper on the web.”

While Google said these were isolated examples and promised quick fixes, the episode was a major public relations stumble. It served as a reminder that, however capable AI has become, it is not yet reliable enough to be trusted without human oversight.

Source: BBC, Published: May 23, 2024