ChatGPT Plus remembers everything you forgot you told it to remember
Unless you live in Europe or Korea
OpenAI's Memory feature is now broadly available for ChatGPT Plus users, meaning many more can feel vaguely uncomfortable about how much the chatbot is "remembering" about their preferences.
The feature was first introduced in February to a small portion of ChatGPT free and Plus users. According to OpenAI, it was designed to "remember" things discussed in chats to save the user from having to repeat information in future conversations with the chatbot.
It has now been made more broadly available, with some notable exceptions.
Memory is not available to users in Europe or Korea. We can certainly imagine regulators taking a very dim view of the data captured, where it is stored, and to what use it might be put. The description for the Memory functionality includes the text: "We may use content that you provide to ChatGPT, including memories, to improve our models for everyone."
OpenAI did not give a reason for the omission of Europe and Korea. We will update this article if the company responds to our query.
Examples given included ChatGPT remembering that a user prefers meeting notes to have headlines, or that a user's toddler loves jellyfish, prompting it to suggest a jellyfish wearing a party hat when creating a birthday card.
The limited test has gone well enough for the company to roll out the functionality more broadly, and, based on feedback from that test, ChatGPT will now let users know when "memories" are updated. It is also easier to review all memories when an update occurs and to "forget" anything unwanted.
It is also possible to turn off the functionality entirely if required.
The company will not use content from ChatGPT Team and Enterprise customers for training purposes, although it said those customers would eventually get access to memory functionality. It also said it would "steer" ChatGPT away from proactively remembering sensitive information, such as health details, unless the chatbot is explicitly asked to remember them.
Interestingly, while the plan is for GPTs to have their own memories too, those memories won't be shared. If ChatGPT knows a user's significant other is a big fan of kittens, for example, that information won't find its way into the Artful Greeting Card GPT, or vice versa. ®