Over 100k ChatGPT accounts stolen by hackers and sold on the dark web
Over 100,000 ChatGPT accounts have been stolen over the past year, with the credentials put up for sale on the dark web.
Cybersecurity researchers have revealed that over 100,000 ChatGPT accounts have been stolen by malicious hackers, who then sold the credentials on the dark web. According to cybersecurity research firm Group-IB, the credentials were harvested from malware-infected devices.
Over 101,000 accounts were compromised over the course of one year. The attacks primarily affected users in the Asia-Pacific region, which accounted for more than 41,000 of the compromised accounts, while just under 3,000 affected accounts were based in the US.
How were the accounts compromised?
According to Group-IB, the attack vector was info-stealing malware, which scrapes a web browser for sensitive information such as saved login credentials and can decrypt the stored passwords. This means attackers were able to steal ChatGPT logins without breaching OpenAI itself, which would have been a far trickier task.
The malware families involved included Raccoon, Vidar, and RedLine, which all use similar methods to steal information. ChatGPT's default behaviour of storing chat history poses a further risk to users who input sensitive information into the tool.
Group-IB recommends that users change their passwords regularly and enable two-factor authentication on their accounts, according to TechRadar.
Companies such as Google and Samsung have banned engineers from inputting code into generative AI tools like ChatGPT for this very reason: storing data on these platforms remains risky, especially if someone else can access your account or if ChatGPT itself is compromised. Should attackers gain access to a chat history, they could view code for internal tools, which could help them find a way into key systems once those tools are live.
This is why it's good practice to ensure that any sensitive data is scrubbed from ChatGPT and that the chat history feature is disabled wherever possible. While saved conversations help the tool learn from your queries, they also pose a potential security risk.
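For developers who send text or code to ChatGPT programmatically, a simple pre-filter can reduce what leaves the machine in the first place. The snippet below is a minimal illustrative sketch, not something from the Group-IB report: the regex patterns and the redact_sensitive helper are hypothetical choices you would tune to your own environment.

```python
import re

# Hypothetical patterns for common secrets; extend these to match your own environment.
SENSITIVE_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9]{20,}"),                       # OpenAI-style API keys
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),                   # email addresses
    re.compile(r"(?i)(password|secret|token)\s*[=:]\s*\S+"),  # key=value style credentials
]

def redact_sensitive(text: str, placeholder: str = "[REDACTED]") -> str:
    """Mask anything matching the patterns above before the text is sent to ChatGPT."""
    for pattern in SENSITIVE_PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

if __name__ == "__main__":
    prompt = "Debug this: password=hunter2, contact admin@example.com"
    print(redact_sensitive(prompt))
    # -> Debug this: [REDACTED] contact [REDACTED]
```

A filter like this is no substitute for disabling chat history or keeping secrets out of prompts altogether, but it adds a cheap safety net if an account is later compromised.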
Be sure to read our guide on ChatGPT privacy to ensure that your information is not being stored by the tool.