Google’s AI suggests odd ideas: glue on pizza, eating rocks, and making chlorine gas.

Social media users are widely sharing examples of bizarre and sometimes dangerous advice generated by Google's new AI search feature.

The feature is meant to help users find answers faster, but it has produced unsafe suggestions, such as telling people to eat rocks or add glue to pizza sauce.

Google recently rolled out a search feature called "AI Overviews" in the US. It uses generative AI to produce quick summaries of search results. The catch is that the summaries sometimes draw on unreliable sources, including satirical articles and joke comments posted online, so the answers can be misleading or simply absurd.

The problem isn't limited to safety. The tool also gets basic facts wrong: in widely shared examples, it falsely stated that Barack Obama is Muslim and claimed that a dog had played professional sports. These errors have led many to question whether the feature can be trusted as a source of reliable information.
