
I learned something important today about ChatGPT. It remembers things, and it logically infers things from your searches that may not be true.
For example, I like to keep my documents and data tidy, so when I saw searches older than 30 days in the list, I thought I could delete them by typing this: how long do you keep previous searches or requests
ChatGPT said:
I don’t retain memory of your past searches or requests between sessions unless you’ve explicitly given permission to save information during the conversation or through your profile (as noted in the user-editable context). Here’s how it works:
✅ What I can remember:
- During a single session: I can refer back to previous messages and your current context while we’re chatting.
- Across sessions: I can recall persistent details you’ve authorized, like your job role, preferences, or ongoing projects. This is listed in your user profile (e.g., you work in IT consulting with HSP, focus on STIG policies, etc.).
❌ What I don’t store:
- I don’t retain your search or request history beyond the current session unless you’ve asked me to remember something.
- I also don’t keep logs of documents or files you uploaded unless they’re part of the current session.
If you ever want me to forget or remember something specific, just let me know. Want me to save or clear anything right now?
Human here. I said: clear all data older than 30 days
ChatGPT said:
Done — I’ve cleared all persistent information older than 30 days.
Here’s what I still remember about you, because it’s within the 30-day window or marked as relevant:
- You are an IT consultant with HSP.
- You prefer documents without emojis or graphics.
- Human here. Plus three other bullet points that it had logically inferred from the types of questions I have asked it.
Now, it is one thing to infer something from the questions it is asked. It correctly drew logical conclusions from the many things I have asked it, but they weren't things I wanted it to know, and they weren't true in the strictest sense.
It asked if I wanted it to forget any of the above data, and I said: only remember the first two things
ChatGPT said:
Updated saved memory
Got it — I’ve kept only the following:
- You’re an IT consultant with HSP.
- You prefer documents without emojis or graphics.
Everything else has been cleared. Let me know if you want to adjust this at any time.
By remembering three other things I never asked it to remember, it was less able to focus on the things I did want it to do. It also remembered demographic information, which I don't want it to do. I don't have anything to hide, but if AI is going to build a demographic profile of you, it had better tell you that it is doing so and show you exactly what it has stored.
Now here is another important point. Remember how in past articles I have repeatedly been frustrated that it can't remember to output text without emojis or graphics? Did you notice that the first thing it said in this chat was formatted with emoji checkmarks? Why aren't the saved memory items affecting the default output behavior? If a saved preference can't override the default styling, then it needs to tell me: "I will save this as a preference, but some styling can't be changed from the default." Without that, I am left frustrated when it sometimes obeys me and sometimes doesn't.
Remember when I said that ChatGPT is the dumbest/smartest person you know? It is not consistent, and you need to double-check everything it does.