The cybercriminal version of ChatGPT

When ChatGPT was made available to the public on November 30, 2022, the AI chatbot took the world by storm.

The software was developed by OpenAI, an AI research company. ChatGPT is a natural language processing tool able to answer queries and provide information based on data gleaned from sources including books and web pages, and it has since become a valued tool for on-the-fly information gathering, analysis, and writing tasks for millions of users worldwide.

While some experts believe the technology could prove as disruptive as the internet itself, others note that ChatGPT demonstrates ‘confident inaccuracy.’ Students have been caught in droves plagiarizing coursework with the tool, and unless datasets are verified, tools such as ChatGPT could become unwitting vehicles for spreading misinformation and propaganda.

Indeed, the US Federal Trade Commission (FTC) is investigating OpenAI over its handling of personal information and the data used to create its language model.

Beyond data protection concerns, however, every new technological innovation also opens new pathways for abuse. It was only a matter of time before the AI chatbot was emulated for malicious purposes — and one such tool is now on the market, known as WormGPT.

What is WormGPT?

On July 13, researchers from cybersecurity firm SlashNext published a blog post revealing the discovery of WormGPT, a tool being promoted for sale on a hacker forum.

According to the forum user, the WormGPT project aims to be a blackhat “alternative” to ChatGPT, “one that lets you do all sorts of illegal stuff and easily sell it online in the future.”

SlashNext gained access to the tool, described as an AI module based on the GPT-J language model. WormGPT has allegedly been trained with data sources including malware-related information — but the specific datasets remain known only to WormGPT’s author.

It may be possible for WormGPT to generate malicious code, for example, or convincing phishing emails.

What is WormGPT being used for?

WormGPT is described as “similar to ChatGPT but has no ethical boundaries or limitations.”

ChatGPT has a set of rules in place to try to stop users from abusing the chatbot. These include refusing to complete tasks related to criminality and malware. However, users are constantly finding ways to circumvent these limitations.

The researchers were able to use WormGPT to “generate an email intended to pressure an unsuspecting account manager into paying a fraudulent invoice.” The team was surprised at how well the language model managed the task, branding the result “remarkably persuasive [and] also strategically cunning.”

July 19, 2023

Written by Charlie Osborne

Read the full article on ZDNet.
