Artificial Intelligence: A technical glitch in the chatbot ChatGPT exposes users’ conversation titles
OpenAI, the maker of the chatbot ChatGPT, said that a technical glitch in the AI-powered bot allowed some users to see the titles of other users’ conversations.
It comes after users on social media sites Reddit and Twitter shared images of chat logs that they said did not belong to them.
OpenAI CEO Sam Altman said the company felt “awful” about the incident, but the “significant” bug has now been fixed.
Despite this, many users remain concerned about privacy while using ChatGPT.
Each chat conducted on ChatGPT is stored in the user’s chat history sidebar, where it can be revisited later. But early Monday, users started seeing conversations in their history that they said they had never had with ChatGPT.
A user on the news and social networking site Reddit shared a picture of his chat logs, including titles such as “Developing Chinese Socialism” as well as conversations in Mandarin.
On Tuesday, OpenAI told Bloomberg that it had briefly disabled the chatbot late Monday to fix the glitch.
The company also said that users were unable to access the contents of the affected conversations themselves. OpenAI’s CEO also tweeted that a “technical post-mortem” would follow soon. Even so, the flaw has worried users who fear that their private information could be exposed through chat history.
The glitch appears to confirm that OpenAI has access to users’ conversations. The company’s privacy policy states that user data, consisting of prompts and responses, may be used to further train the ChatGPT model. However, that data is used only after personally identifiable information has been removed.
The blunder also comes just a day after Google unveiled its “Bard” chatbot to a group of beta testers and journalists.
Google and Microsoft, OpenAI’s biggest investor, are vying for dominance of the booming market for artificial intelligence tools. But the pace of new product updates and releases raises concerns about missteps that could be harmful or have unintended consequences.