Cybercriminals have found a worrying new way to use AI tools to spread malware on computers, and they are doing it by taking advantage of Google search results. According to security firm Huntress, hackers are using publicly shared AI chats to plant harmful instructions that surface at the top of common search queries, tricking people into running dangerous commands on their own computers.
Here’s how the scheme works. Attackers start a conversation with an AI assistant, such as ChatGPT or Grok, about a popular search topic. During this chat, they steer the AI into recommending a specific command to type into a computer’s terminal; that command is actually designed to give the attacker access to the victim’s system. The attacker then makes the AI conversation public and pays to promote it as a sponsored result so that it appears near the top of Google search results. When users search for that same topic, the harmful instructions look like helpful advice.
Huntress explains that this method has already led to a real-world infection involving AMOS (Atomic macOS Stealer), a piece of malware that targets Macs. In that case, a Mac user simply searched “clear disk space on Mac,” clicked a sponsored ChatGPT link in Google, and followed the terminal command shown in the AI chat. Running the command allowed the attackers to silently install AMOS. Notably, the harmful ChatGPT conversation stayed visible in Google search results for at least half a day after Huntress publicly reported the issue.
What makes this technique especially dangerous is that it sidesteps the usual warning signs of online scams. Victims do not have to download a suspicious file or click an obviously strange link; the instructions reach them through a Google search result and a well-known AI assistant.
For now, a simple rule can prevent major damage: never paste a command into your computer’s terminal or browser bar unless you fully understand what it will do. Be especially wary of one-line commands that download and run something from an unfamiliar web address.