I’ve been working on Memoripy, a Python library that adds real memory capabilities to AI applications. If you’re working on conversational AI, virtual assistants, or anything that needs to remember things over time, Memoripy is designed to help with both short-term and long-term memory storage.
Memoripy organizes memory into two types: short-term and long-term. Recent interactions stay front and center, while important details are preserved for the long run. This way, the AI isn’t overwhelmed by its full history and can keep the conversation relevant.
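To make the idea concrete, here’s a toy sketch of a two-tier store: a fixed-size short-term buffer plus a long-term list that important items get promoted into. The class and method names here are illustrative only, not Memoripy’s actual API.

```python
from collections import deque

class TwoTierMemory:
    """Toy illustration of a short-term/long-term memory split
    (illustrative sketch, not Memoripy's actual API)."""

    def __init__(self, short_term_size=5, importance_threshold=0.7):
        # Short-term memory keeps only the most recent interactions.
        self.short_term = deque(maxlen=short_term_size)
        # Long-term memory keeps anything deemed important enough to persist.
        self.long_term = []
        self.importance_threshold = importance_threshold

    def remember(self, text, importance):
        """Store an interaction; promote important ones to long-term."""
        self.short_term.append(text)
        if importance >= self.importance_threshold:
            self.long_term.append(text)

    def context(self):
        """Everything promoted to long-term, plus the recent buffer."""
        return list(self.long_term) + list(self.short_term)

mem = TwoTierMemory(short_term_size=3)
mem.remember("User likes Python", importance=0.9)  # promoted to long-term
mem.remember("Said hello", importance=0.1)
mem.remember("Asked about weather", importance=0.2)
mem.remember("Asked about lunch", importance=0.3)
# "User likes Python" has rolled out of the short-term buffer,
# but it survives in long-term memory and still appears in context().
```

The point of the split is exactly this last step: the recent buffer stays small, so prompts don’t bloat, while the facts worth keeping outlive it.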
Memoripy uses semantic clustering, grouping similar memories together so the AI can quickly find the right context. It also has memory decay and reinforcement features: old or unused memories fade, while frequently reinforced ones stick around.
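The decay/reinforcement idea can be sketched with a simple scoring rule: each access boosts a memory’s strength, and strength decays exponentially with time since the last access. This is my own minimal illustration of the concept, not Memoripy’s actual implementation or parameters.

```python
import time

class DecayingMemory:
    """Toy sketch of memory decay and reinforcement
    (illustrative only, not Memoripy's actual implementation)."""

    def __init__(self, half_life=3600.0):
        self.half_life = half_life      # seconds for strength to halve
        self.items = {}                 # text -> (strength, last_access_time)

    def store(self, text, now=None):
        now = time.time() if now is None else now
        self.items[text] = (1.0, now)

    def reinforce(self, text, now=None):
        """Accessing a memory bumps its strength and resets its clock."""
        now = time.time() if now is None else now
        strength, _ = self.items[text]
        self.items[text] = (strength + 1.0, now)

    def strength(self, text, now=None):
        """Current strength: halves every `half_life` seconds of disuse."""
        now = time.time() if now is None else now
        s, last = self.items[text]
        return s * 0.5 ** ((now - last) / self.half_life)

mem = DecayingMemory(half_life=3600)
mem.store("favorite color is blue", now=0)
mem.reinforce("favorite color is blue", now=1800)  # strength is now 2.0
# One half-life after the last access, strength has decayed back to 1.0:
print(mem.strength("favorite color is blue", now=1800 + 3600))
```

Ranking retrieval by a score like this is what lets unused memories fade out of context gradually instead of being hard-deleted.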
A cool thing about Memoripy is that it focuses on local storage, so your data stays on your machine, which makes it a great option for developers who want to keep things private. It works well with locally hosted LLMs, but also integrates with OpenAI and Ollama.
If this sounds interesting, check it out on GitHub! It’s open-source, and I’d love to hear your thoughts or any ideas on how you might use it.