Google Addresses Its Strange AI Overview Results

Written by Matt Milano

Google is addressing the strange results its AI Overview is returning, saying there are a number of factors in play.

Google’s AI Overview generated headlines for all the wrong reasons when it started giving bizarre results to some queries. For example, one response recommended adding non-toxic glue to pizza sauce to keep the cheese from sliding off the pizza.

In a blog post, Liz Reid, VP and Head of Google Search, says there are a number of factors that have led to the strange results, despite rigorous testing:

But there’s nothing quite like having millions of people using the feature with many novel searches. We’ve also seen nonsensical new searches, seemingly aimed at producing erroneous results.

Reid says there have been quite a few fake screenshots, many of which have been created purely for fun:

Separately, there have been a large number of faked screenshots shared widely. Some of these faked results have been obvious and silly. Others have implied that we returned dangerous results for topics like leaving dogs in cars, smoking while pregnant, and depression. Those AI Overviews never appeared. So we’d encourage anyone encountering these screenshots to do a search themselves to check.

Reid then goes on to highlight a major issue the company is working to address, namely where there is little to no existing information on a query for the AI to pull from:

But some odd, inaccurate or unhelpful AI Overviews certainly did show up. And while these were generally for queries that people don’t commonly do, it highlighted some specific areas that we needed to improve.

One area we identified was our ability to interpret nonsensical queries and satirical content. Let’s take a look at an example: “How many rocks should I eat?” Prior to these screenshots going viral, practically no one asked Google that question. You can see that yourself on Google Trends.

Reid explains that these scenarios represent a “data void” or “information gap”: a lack of high-quality content on a given topic. Because AI Overview merely surfaces existing information, a query whose only source material is scarce or satirical poses a problem.

Google is working to address these issues and limitations, with Reid saying the company is working “on updates that can help broad sets of queries, including new ones that we haven’t seen yet.”
