Edgar Cervantes/Android Authority
TL;DR
- A report suggests that ChatGPT has been leaking private conversations to unrelated individuals.
- These conversations include many sensitive details such as usernames and passwords, unpublished work, and more.
- OpenAI’s investigation revealed that this was not a data “breach.”
Update, January 30, 2024 (02:20 PM ET): After this article was published, OpenAI contacted Android Authority and issued a statement explaining the situation. The full statement is published here:
Ars Technica published before our fraud and security team completed an investigation, and unfortunately its report was inaccurate. Based on our findings, a user’s account login credentials were compromised and a bad actor then used the account. The chat history and files shown are conversations from misuse of that account, rather than ChatGPT showing the history of other users.
While this appears to be an adequate explanation of the situation, we’ll leave the original article unedited below for the sake of context. We will update again if Ars Technica retracts or otherwise edits its article.
Original article, January 30, 2024 (07:56 AM ET): ChatGPT has become an important part of our workflows, often even replacing Google Search for many queries. Many of us use it for simpler questions, but with the help of ChatGPT plugins and extensions, you can use the AI to perform more complex tasks. Still, we recommend being cautious about what you use ChatGPT for and what data you share with it, as users have reported that ChatGPT leaked some private conversations.
According to a report from Ars Technica, citing screenshots sent in by one of its readers, ChatGPT is leaking private conversations, including details like usernames and passwords. The reader had used ChatGPT for an unrelated query and was surprised to discover additional conversations in their chat history that did not belong to them.
These outside conversations included several sensitive details. In one set, someone appeared to be troubleshooting problems through a support system used by employees of a pharmacy prescription drug portal; it included the name of the app the person was trying to fix, the store number where the problem occurred, and additional login credentials.
Another leaked conversation included the name of a presentation someone was working on and details of an unpublished research proposal.
This is not the first time ChatGPT has leaked information. Ars Technica pointed out that a vulnerability in ChatGPT in March 2023 leaked chat titles, and in November 2023, researchers were able to use queries to prompt the AI bot into divulging large amounts of the private data used to train the LLM.
OpenAI told Ars Technica that the company was investigating the report. Regardless of the outcome of that investigation, we recommend not sharing sensitive information with AI bots, especially bots you did not create.