Hacker chatbot WormGPT is sharpened for malware development and fraud

Author: Liam Miller

WormGPT, a chatbot built on the GPT-J language model, is trained and designed to write malicious code and help deploy it. The bot lowers the barrier to entry for cybercrime: it lets users create malware with far less money and expertise than before.

The tool was tested by the cybersecurity firm SlashNext. The company warns that attackers are now building their own ChatGPT-like tools that are easier to exploit for malicious purposes. On the black market, WormGPT is sold as a service for €60 per month or €550 per year. According to its developer:

WormGPT works like an unrestricted version of ChatGPT: it does not try to shut down a conversation at the slightest sign of risk. It can generate malware in Python and offers tips on deploying it, along with advice on other fraudulent activities.

When SlashNext tasked it with composing an email pressuring a victim into paying a fraudulent invoice, WormGPT produced remarkably convincing and cunning text, demonstrating its potential for sophisticated phishing.

In theory, WormGPT could also be a trap for amateurs looking to harm others: it could identify its customers by leaving telltale tags in the generated code. There is no evidence this actually happens with WormGPT, but it is possible. It is also worth noting that while such bots probably lack ChatGPT-level funding, their highly specialized training and freedom from legal and safety restrictions can still make them effective.
