Google revised a high-profile advertisement before broadcasting it during the Super Bowl, after users online spotted an error involving an AI-generated claim about the popularity of gouda cheese.
The advert shows a Wisconsin cheese vendor using Google’s Gemini artificial intelligence tool to help create product descriptions.
It is part of a series of 50 such adverts showing how small businesses in the 50 US states can use AI.
In the original version, the AI-generated copy claimed gouda accounted for “50 to 60 percent of the world’s consumption” of cheese.

Cheese controversy
A user on social media quickly pointed out that the figure appeared to be “unequivocally false”, adding that cheddar and mozzarella “would like a word”.
At first glance, the gaffe appeared to be an example of “hallucination”, the tendency of generative AI tools to fabricate information when no genuine facts are available.
But Google Cloud executive Jerry Dischler said this was not the case, saying in a social media post that “multiple sites across the web include the 50-60 percent stat”.
Reliable statistics on worldwide cheese consumption are, in fact, hard to come by.
An article on cheese website cheese.com does make the claim referenced in Google’s advert, but it doesn’t cite a source for the figure and the numbers have been contested.
Google posted an updated version of the ad on YouTube, the video platform it owns, which edits out the statistic.
The company said it had asked the cheese vendor featured in the ad, the owner of the Wisconsin Cheese Mart, what he would have done in the situation.
“Following his suggestion to have Gemini rewrite the product description without the stat, we updated the UI to reflect what the business would do,” a Google spokesperson said.
Hallucinations
Generative AI tools’ hallucinative tendencies are one of their biggest drawbacks in a business setting, requiring any facts cited to be rigorously double-checked by a human.
Google scaled back the roll-out of its AI Overviews search feature last year after it suggested bizarre ideas, such as using “non-toxic glue” to make cheese stick to pizza better, or claiming geologists advised people to eat one rock per day.
More recently, Apple was forced last month to suspend an AI feature that summarised notifications, after it generated multiple false headlines.