TRM Labs has identified nine emerging ransomware groups that make extensive use of AI in their attacks.
Summary
- TRM Labs looked at how nine emerging ransomware groups leverage AI
- AI enables ransomware groups to massively scale up their operations
- Groups use AI tools to automate social engineering attacks, which are on the rise
Since the latest AI tools emerged, cybercriminals have been incorporating them into their attacks. On Monday, Oct. 6, blockchain analytics platform TRM Labs published a report on nine emerging ransomware groups and how they leverage AI.
The groups, which include Arkana Security, Dire Wolf, Frag, and Sarcoma, among others, deploy different tactics and target different victims. What they have in common, however, is the growing use of AI in their ransomware operations.
How ransomware groups use AI in their attacks
The report notes that AI is becoming an integral part of ransomware operations, enabling these groups to scale their activities massively. The technology is also opening up new types of tactics, particularly ones that exploit the human element of security.
“Artificial intelligence is transforming the ransomware ecosystem — not just by making attacks more scalable, but by changing the playbook entirely,” said Ari Redbord, Global Head of Policy at TRM Labs. “We’re seeing faster operations, more sophisticated social engineering, and new tactics that rely on regulatory and reputational pressure instead of encryption. The line between financially motivated groups and state-linked actors is also becoming increasingly blurred.”
This applies in particular to social engineering attacks, which used to be time-intensive, requiring extensive research and preparation. Now, ransomware attackers can use AI to write convincing messages and create increasingly believable deepfake videos.
Attackers also use large language models (LLMs) to automate code generation, lowering the barrier to entry. In addition, AI enables the creation of polymorphic malware, which changes its code with each infection, making signature-based detection much more difficult.