State-Backed Groups Using Gemini AI to Supercharge Attacks
In a startling new report, tech giant Google has sounded the alarm on how some of the world’s most dangerous hacking groups are now using Gemini AI as a digital weapon. No longer just a tool for writing emails or planning vacations, artificial intelligence is being repurposed by state-sponsored actors to scout targets, write malicious code, and execute cyberattacks with terrifying speed. This shift marks a new era in digital warfare where the “bad guys” are using the same high-tech assistants we use every day to break into the most secure systems on the planet.
North Korea’s Digital Recruitment Trap
One of the most active groups mentioned in the report is a North Korean unit known as UNC2970, which is often linked to the infamous Lazarus Group. These hackers have become notorious for “Operation Dream Job,” a long-running scam where they pose as recruiters to trick defense and aerospace employees into downloading malware. However, they have recently upgraded their tactics by using Gemini AI to do their homework. Instead of manually searching through profiles, they use the AI to scan public information and build “target dossiers” on high-value employees.
By asking the AI to map out specific job roles and salary data within major defense companies, these hackers can create incredibly convincing fake personas. They use Gemini to polish their phishing messages, making them look like professional, legitimate job offers that are almost impossible to distinguish from the real thing. This use of AI blurs the line between normal professional research and a premeditated digital ambush, making it harder for even seasoned experts to spot the trap.
A Global Surge in AI-Powered Hacking
North Korea isn’t the only player in this dangerous game. Google’s researchers have tracked several other groups—from China to Iran—who are all trying to get an edge using Gemini. For instance, Chinese groups like Mustang Panda and APT31 have been caught using the AI to gather intelligence on political targets and automate the hunt for software vulnerabilities. They often trick the AI by pretending to be “security researchers” or students participating in hacking competitions, a tactic used to bypass safety filters that would otherwise block malicious requests.
In Iran, the group known as APT42 has been using the AI to build custom tools, including a Python-based script for scraping Google Maps data, and to research ways to exploit known software flaws. The report also highlights a particularly sneaky piece of malware called HONESTCUE. This software doesn’t actually carry its “instructions” with it. Instead, it reaches out to Gemini’s API while it’s running, asks the AI to write a specific piece of computer code, and then runs that code directly in the computer’s memory. Because the code is created on the fly and never saved as a file on the hard drive, traditional antivirus programs often fail to see it coming.
Stealing the “Brain” of the AI
Beyond just using AI to attack people, some hackers are now attacking the AI itself. Google revealed that it recently disrupted “model extraction attacks.” This is a high-tech form of industrial espionage where hackers send hundreds of thousands of questions to an AI model to see how it thinks. By recording the answers, they can essentially “reverse-engineer” the AI and build a cheap copycat version that behaves much like the original. One research group even demonstrated that they could clone an AI’s behavior with 80% accuracy after only 1,000 questions.
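To make the idea concrete, here is a deliberately toy sketch in Python of the principle behind query-based extraction. The “model” here is just a hidden linear function standing in for any black box, not a real AI system or API; the point is only that an attacker who can ask questions and record answers can fit a copycat without ever seeing the original’s internals.

```python
# Toy illustration of "model extraction": the attacker can only query
# the black box, never inspect it. The hidden model is a simple linear
# function standing in for a far more complex real system.

def black_box_model(x):
    # Secret parameters the attacker never sees directly.
    return 3.0 * x + 7.0

# Step 1: send many queries and record the answers.
queries = [float(i) for i in range(100)]
answers = [black_box_model(x) for x in queries]

# Step 2: fit a copycat to the recorded question/answer pairs
# (ordinary least squares, written out by hand to stay dependency-free).
n = len(queries)
mean_x = sum(queries) / n
mean_y = sum(answers) / n
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(queries, answers))
    / sum((x - mean_x) ** 2 for x in queries)
)
intercept = mean_y - slope * mean_x

# The copycat now mimics the original without ever seeing its parameters.
def copycat_model(x):
    return slope * x + intercept
```

Real attacks target models with billions of parameters rather than two, but the shape of the problem is the same: enough input/output pairs leak enough of the behavior to train a substitute.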
Security experts warn that simply keeping the “secret sauce” of an AI hidden isn’t enough anymore. If a hacker can talk to the AI, they can copy it. This has created a “Defender’s Dilemma” where security teams are constantly playing catch-up. Google is fighting back by constantly updating its safety filters and building its own AI-based defenses that can react at “machine speed.” The message from the tech world is clear: as hackers get faster and smarter thanks to AI, the people protecting our data will have to do the same just to keep the lights on.
