Edgar Cervantes / Android Authority
TL;DR
- A report suggests that ChatGPT has been leaking private conversations to unrelated individuals.
- These conversations include many sensitive details, such as usernames and passwords, unpublished works, and more.
- OpenAI’s investigation suggests this is not a “leak” of data.
Update, January 30, 2024 (02:20 PM ET): After publishing this article, OpenAI reached out to Android Authority with a statement explaining the situation. The entire statement is posted here, unedited:
“ArsTechnica published before our fraud and security teams were able to finish their investigation, and their reporting is unfortunately inaccurate. Based on our findings, the users’ account login credentials were compromised and a bad actor then used the account. The chat history and files being displayed are conversations from misuse of this account, and was not a case of ChatGPT showing another users’ history.”
Although this seems like an adequate explanation of the situation, we are leaving the original article unedited below for context. Ars has also updated its article to reflect OpenAI's explanation, but the publication stresses that the ChatGPT site offers no protections such as two-factor authentication (2FA) or the ability to review and track recent logins, protections that are now standard on most platforms across the web. The affected user also does not believe their account was compromised.
Original article, January 30, 2024 (07:56 AM ET): ChatGPT has become an important part of our workflow, often replacing even Google Search for many queries. Many of us use it for simpler queries, but with the help of ChatGPT plugins and ChatGPT extensions, you can use AI for more complex tasks. But we'd advise being careful about what you use ChatGPT for and what data you share with it, as users have reported that ChatGPT has leaked private conversations.
According to a report from ArsTechnica, citing screenshots sent in by one of its readers, ChatGPT is leaking private conversations, including details like usernames and passwords. The reader had used ChatGPT for an unrelated query and later noticed additional conversations in their chat history that did not belong to them.
These outsider conversations exposed a range of sensitive details. One set was from someone troubleshooting problems with a support system used by employees of a pharmacy prescription drug portal; it included the name of the app being troubleshot, the store number where the problem occurred, and additional login credentials.
Another leaked conversation included the name of the presentation that someone was working on alongside details of an unpublished research proposal.
This is not the first time ChatGPT has leaked information. ArsTechnica notes that a ChatGPT bug in March 2023 leaked chat titles, while in November 2023, researchers were able to use queries to prompt the AI bot into divulging private data used in training the LLM.
OpenAI told ArsTechnica that the company was investigating the report. Irrespective of the results of that investigation, we would advise against sharing sensitive information with an AI bot, especially one that you did not create.