Hacker chatbot WormGPT is tailored for malware development and fraud
WormGPT, a chatbot based on the GPT-J language model, is trained and designed to write malicious code and help deploy it. The bot lowers the barrier to entry for cybercrime: it lets users create malware with less money and expertise than before.
The tool has been tested by the cybersecurity firm SlashNext. The company warns that attackers are now building their own ChatGPT-like models that are far easier to exploit for malicious purposes. On the black market, WormGPT is sold as a service for €60 per month or €550 per year. Here is how the developer of WormGPT pitches it:
WormGPT works like an unprotected version of ChatGPT that does not try to shut down a conversation at the first sign of risk. It can write malware in Python and offers advice on deploying it, as well as on other fraudulent activities.
In a SlashNext test that asked it to craft an email pressuring a victim into paying a fraudulent invoice, WormGPT produced text that was remarkably convincing and cunning, demonstrating its potential for sophisticated phishing.
In theory, WormGPT could itself be a trap for amateurs looking to harm others: it could identify its customers and leave identifying tags in the code it generates. There is no evidence that this actually happens with WormGPT, but it is possible. It is also worth noting that while such bots likely lack ChatGPT's level of funding, their highly specialized training and freedom from legal and regulatory restrictions can still make them effective.