Cybercrims hop geofences, clamor for stolen ChatGPT Plus accounts

Where there's a will…

The market for stolen ChatGPT accounts, and especially Plus subscriptions, is on the rise as miscreants in countries blocked by OpenAI try to hop the chatbot's geofences.

This uptick began in March, according to Check Point bods who say they've noticed an "increase in the chatter in underground forums related to leaking or selling compromised ChatGPT premium accounts."

By "premium" accounts, they mean ChatGPT Plus: the subscription service that costs $20 per month and gives users access to new features and faster response times, compared to those using the free service.

While most of the stolen accounts are offered for sale, some criminals will share compromised premium accounts "to advertise their own services or tools to steal the accounts," the security shop said.

Russia, China, and Iran are among the handful of countries where OpenAI blocks access, but that hasn't stopped miscreants in those blacklisted nations from looking for ways to skirt the rules and use the AI technology powering ChatGPT to advance their operations.

The chatbot can be used to produce text for phishing and other online scams, helping criminals craft emails and other messages to trick their victims into handing over their usernames and passwords. 

It can also be used to generate trivial malware that manages to infect naive or poorly defended networks, thus making hacking more cost-efficient, Sergey Shykevich, threat intelligence group manager at Check Point, told The Register in an earlier interview.

"It allows people that have zero knowledge in development to code malicious tools and easily to become an alleged developer," Shykevich said. "It simply lowers the bar to become a cybercriminal."

In addition to advancing these types of criminal pursuits, stolen ChatGPT accounts present another potential privacy risk, according to the research. Namely: the accounts store the recent queries generated by the account owner.

This means when a criminal accesses someone else's account, they can see these queries, which may include personal information and corporate details — despite companies' warnings to employees not to feed sensitive info to the chatbot.

One of the ways crooks are stealing and selling ChatGPT accounts is by using account checkers and bruteforcing tools, the security team found. In one example, they found a configuration file for SilverBullet for sale.

SilverBullet is yet another software tool that has both legitimate and criminal uses: it's a web-testing suite that allows users to scrape data and automate penetration testing on a target web app. But it's also a favorite among criminals for credential stuffing and account attacks to steal login details.  

In this specific case, the researchers spotted someone selling a SilverBullet configuration file that automates credential checks against ChatGPT. The tool can run between 50 and 200 checks per minute, and supports proxies, which helps attackers sidestep anti-bruteforce protections.

Another criminal who goes by "gpt4" on cybercrime forums not only sells ChatGPT accounts, but also claims to have a configuration for an automated tool that checks credentials, the researchers said.

And in a third example, they spotted an ad for "ChatGPT Plus lifetime account service," where the seller guarantees the buyers "100 percent satisfaction."

The lifetime upgrade of a regular ChatGPT Plus account costs $59.00 (as a reminder: the legitimate service via OpenAI costs $20 per month). But for criminals who want to cut costs, there's also the option to share access to a ChatGPT account with another miscreant for the bargain lifetime price of $24.99.

"A number of underground users have already left positive feedback for this service, and have vouched for it," according to Check Point's crew. 

This, apparently, proves that even in the criminal underground, reviews matter. ®
