I’ve been using ChatGPT Plus for a while now, and I really like its memory feature.
I tried Gemini Advanced, and after a lot of back-and-forth, it finally admitted it doesn’t have memory. That was frustrating, especially since it feels like one of the weaker models out there, and the censorship is ridiculous.
It looks like Claude Pro doesn’t have memory either. While it’s very smart, even comparable to ChatGPT Plus, the lack of memory, combined with the usage limits, is making me rethink keeping my subscription.
Are there any other AI models—free or paid—that actually offer memory? I’d love to know if there’s something better out there or if ChatGPT Plus is still the only one.
AI models don’t actually have memory. Each response is generated only from the text that’s sent to the model in that request. The app you’re using, like ChatGPT’s interface, passes your past messages back to the model, but there’s a hard limit (the context window) on how much it can include at once.
GPT-4’s context window, on the order of 128K tokens, works out to roughly 200 A4 pages of text in one go. Gemini and other models are in a broadly similar range.
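To make that concrete, here’s a rough back-of-the-envelope sketch in Python. It assumes OpenAI’s tiktoken tokenizer library, a 128K-token limit, and roughly 500 words per A4 page; the exact figures vary by model, so treat it as an estimate rather than a spec.

```python
# Rough sketch: estimating how much text fits in a context window.
# The 128K limit and the "500 words per A4 page" figure are
# ballpark assumptions, not official numbers.
import tiktoken

CONTEXT_LIMIT = 128_000      # tokens (GPT-4 Turbo class models, assumed)
WORDS_PER_PAGE = 500         # rough A4 page of plain text
TOKENS_PER_WORD = 1.33       # ~0.75 words per token on average

def pages_that_fit(context_limit: int = CONTEXT_LIMIT) -> float:
    """Estimate how many A4 pages of plain text fit in one request."""
    tokens_per_page = WORDS_PER_PAGE * TOKENS_PER_WORD
    return context_limit / tokens_per_page

def count_tokens(text: str, model: str = "gpt-4") -> int:
    """Count tokens the way the model's tokenizer would."""
    enc = tiktoken.encoding_for_model(model)
    return len(enc.encode(text))

print(f"~{pages_that_fit():.0f} A4 pages fit in one request")
print(count_tokens("Hello, how are you?"), "tokens in a short greeting")
```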
Whether a model can reference past conversations depends on how the app is designed. Some apps analyze your chats to pick out key details like your name or important facts and use those in future sessions to create the appearance of memory. Others, like Claude, store your messages in a database and search for relevant ones to include in a response. It’s not real memory, but it gets the job done for practical purposes.
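For illustration only, here’s a minimal Python sketch of that retrieve-and-include idea. It ranks stored messages by plain word overlap instead of real embeddings or a vector database, and it isn’t how any particular vendor actually implements it.

```python
# Minimal sketch of "memory" via retrieval: store past messages,
# score them against the new question, and prepend the best matches.
# Real systems use embeddings and a vector store; word overlap is
# used here only to keep the example self-contained.

def word_overlap(a: str, b: str) -> int:
    return len(set(a.lower().split()) & set(b.lower().split()))

class ChatMemory:
    def __init__(self) -> None:
        self.messages: list[str] = []

    def add(self, message: str) -> None:
        self.messages.append(message)

    def relevant(self, query: str, k: int = 3) -> list[str]:
        ranked = sorted(self.messages,
                        key=lambda m: word_overlap(m, query),
                        reverse=True)
        return ranked[:k]

memory = ChatMemory()
memory.add("My name is Sam and I work as a nurse.")
memory.add("I prefer short answers without bullet points.")
memory.add("We discussed my trip to Lisbon last week.")

question = "Can you remind me what I said about my job?"
context = "\n".join(memory.relevant(question, k=2))
prompt = f"Known facts from earlier chats:\n{context}\n\nUser: {question}"
print(prompt)
```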
There are experimental architectures, like Meta’s Megalodon, that aim for effectively unlimited context length, but they’re still a work in progress and tend to lose accuracy when recalling older details.
For now, ChatGPT’s app seems to handle this better than others because of the way it simulates memory. It’s not perfect, but it works better than most alternatives.
If you go to Settings > Personalization > Manage Memory, you can see what ChatGPT has “remembered.” It just sends those details along with every request, probably because most users never come close to the context limit.
This isn’t much different from manually copying and pasting those same details into a fresh chat or even into a Claude project. It’s more of a convenience than anything else (and can be problematic if incorrect details are remembered).
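If it helps to see what that copy-and-paste amounts to, here’s a hedged sketch: the remembered facts (made up for the example) are simply prepended as context before the user’s message, in the message format most chat-completion style APIs accept. This is an illustration of the idea, not ChatGPT’s actual implementation.

```python
# Sketch of what "sending remembered details with every request"
# amounts to: the stored facts are prepended as context.
# This mirrors pasting the same notes into a fresh chat by hand.

remembered_facts = [          # hypothetical stored details
    "User's name is Alex.",
    "User is learning Spanish.",
    "User prefers metric units.",
]

def build_messages(user_message: str) -> list[dict]:
    memory_block = "Things you know about the user:\n" + \
        "\n".join(f"- {fact}" for fact in remembered_facts)
    return [
        {"role": "system", "content": memory_block},
        {"role": "user", "content": user_message},
    ]

# These messages could be sent to any chat-completion style API.
for m in build_messages("What's 70 degrees Fahrenheit in my units?"):
    print(m["role"].upper(), "->", m["content"], sep="\n", end="\n\n")
```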
In a way, Claude’s project setup is just a more hands-on version of ChatGPT’s memory feature.
If you’re waiting for an alternative, you might need to hold on until future updates or new models come out. Right now, it’s still a pretty rare feature.
Juno said:
If you’re waiting for an alternative, you might need to hold on until future updates or new models come out. Right now, it’s still a pretty rare feature.
That’s what I thought. If you had to guess, which model do you think will add memory next, and when?
I’ve been working on something that could help with this. My project, quizmydata.com, lets you store your data in a database and retrieve it based on relevance. It sends the most relevant pieces to the AI model to help answer your questions.
Would saving your previous conversations and responses for ongoing use solve the problem for you? It could essentially act like long-term memory.
As far as I know, ChatGPT is the only one with this feature right now. But here’s a workaround: at the end of a conversation, have Claude summarize any important points or facts. Save that summary in a text file. Then, at the start of your next conversation, upload that file to Claude.
This gives you control over what it ‘remembers,’ and it’s more private too.
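If you want to automate that workaround, something like the sketch below could work. It assumes the official anthropic Python SDK, an API key in the ANTHROPIC_API_KEY environment variable, and a current model name; adjust those to whatever you actually use.

```python
# Sketch of the manual workaround: ask the model to summarize the
# conversation, save the summary to a file, and load it next time.
from pathlib import Path
import anthropic

MEMORY_FILE = Path("claude_memory.txt")
client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

def end_of_session_summary(transcript: str) -> str:
    """Ask the model for a reusable summary of the conversation."""
    response = client.messages.create(
        model="claude-3-5-sonnet-20241022",   # assumed model name; swap as needed
        max_tokens=500,
        messages=[{
            "role": "user",
            "content": "Summarize the key facts and preferences from this "
                       "conversation so they can be reused later:\n\n" + transcript,
        }],
    )
    return response.content[0].text

def save_memory(summary: str) -> None:
    MEMORY_FILE.write_text(summary)

def load_memory() -> str:
    return MEMORY_FILE.read_text() if MEMORY_FILE.exists() else ""

# Next session: prepend load_memory() to your first message so the
# model starts with the saved context.
```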
Any chatbot with a large enough context window can do this, but you might have to handle it manually. Just save the key details you want the bot to use and paste them into your prompts.