ChatGPT reportedly leaked private conversations, including details such as usernames and passwords, according to an Ars Technica report. One user made a query to ChatGPT and noticed additional chats in his history that did not belong to him. OpenAI, however, disputes that this was a data leak.

The stranger's conversations contained several revealing details. One set of chats came from someone trying to resolve issues through a pharmacy's employee support system for its prescription portal. It included the name of the app the person was trying to troubleshoot, the store number where the problem occurred, and additional login credentials.

Another leaked conversation included the name of a presentation someone was working on, along with details of an unpublished research proposal.

This is not the first time ChatGPT information has been leaked. Ars Technica notes that in March 2023, a ChatGPT bug leaked chat titles, and in November 2023, researchers were able to use queries to get the chatbot to reveal private data used to train the underlying LLM.

OpenAI issued the following statement about the incident: “ArsTechnica published the incident before our fraud and security teams could complete their investigation, and the publication’s report is unfortunately inaccurate. Based on our findings, the user’s account login credentials were compromised and the account was accessed by a malicious user. The chat history and files displayed are conversations resulting from misuse of this account, and ChatGPT does not show history of other users.”

Source: Android Authority