Mem0 ("mem-zero") enhances AI assistants and agents with an intelligent memory layer, enabling personalized AI interactions. It remembers user preferences, adapts to individual needs, and continuously learns over time, making it ideal for customer support chatbots, AI assistants, and autonomous systems.
Key Features & Use Cases
Core Capabilities:
Multi-Level Memory: Seamlessly retains User, Session, and Agent state with adaptive personalization
Developer-Friendly: Intuitive API, cross-platform SDKs, and a fully managed service option
Applications:
AI Assistants: Consistent, context-rich conversations
Customer Support: Recall past tickets and user history for tailored help
Healthcare: Track patient preferences and history for personalized care
Productivity & Gaming: Adaptive workflows and environments based on user behavior
Quickstart Guide
Choose between our hosted platform or self-hosted package:
Hosted Platform
Get up and running in minutes with automatic updates, analytics, and enterprise security.
Mem0 requires an LLM to function, with gpt-4o-mini from OpenAI as the default. However, it supports a variety of LLMs; for details, refer to our Supported LLMs documentation.
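As a sketch of how a non-default LLM is typically selected, Mem0 accepts a configuration dict (the exact provider names and config keys should be checked against the Supported LLMs documentation; the values below are illustrative):

```python
# Illustrative configuration dict for choosing the LLM backing Mem0.
# The "provider"/"config" shape follows Mem0's documented config style,
# but verify the exact keys against the Supported LLMs docs.
config = {
    "llm": {
        "provider": "openai",          # assumed provider name
        "config": {
            "model": "gpt-4o-mini",    # the default model named above
            "temperature": 0.1,        # illustrative setting
        },
    }
}

# The dict would then be passed when building the memory layer, e.g.:
# from mem0 import Memory
# memory = Memory.from_config(config)
```

Passing no config keeps the gpt-4o-mini default described above.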
The first step is to instantiate the memory:
```python
from openai import OpenAI
from mem0 import Memory

openai_client = OpenAI()
memory = Memory()

def chat_with_memories(message: str, user_id: str = "default_user") -> str:
    # Retrieve relevant memories
    relevant_memories = memory.search(query=message, user_id=user_id, limit=3)
    memories_str = "\n".join(f"- {entry['memory']}" for entry in relevant_memories["results"])

    # Generate assistant response
    system_prompt = f"You are a helpful AI. Answer the question based on query and memories.\nUser Memories:\n{memories_str}"
    messages = [{"role": "system", "content": system_prompt}, {"role": "user", "content": message}]
    response = openai_client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    assistant_response = response.choices[0].message.content

    # Create new memories from the conversation
    messages.append({"role": "assistant", "content": assistant_response})
    memory.add(messages, user_id=user_id)

    return assistant_response

def main():
    print("Chat with AI (type 'exit' to quit)")
    while True:
        user_input = input("You: ").strip()
        if user_input.lower() == 'exit':
            print("Goodbye!")
            break
        print(f"AI: {chat_with_memories(user_input)}")

if __name__ == "__main__":
    main()
```
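To see what the memory-formatting step in `chat_with_memories` does in isolation: `memory.search(...)` returns a dict with a `results` list of memory entries, and the join builds the bullet list injected into the system prompt. The sample entries below are made up for illustration:

```python
# Illustrative shape of memory.search() output; the two entries are
# invented examples, not real Mem0 data.
relevant_memories = {
    "results": [
        {"memory": "User prefers vegetarian food"},
        {"memory": "User lives in Berlin"},
    ]
}

# Same formatting expression as in chat_with_memories above
memories_str = "\n".join(f"- {entry['memory']}" for entry in relevant_memories["results"])
print(memories_str)
# - User prefers vegetarian food
# - User lives in Berlin
```

The resulting bullet list is what the system prompt labels "User Memories", giving the model explicit context from past conversations.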
```bibtex
@article{mem0,
  title={Mem0: Building Production-Ready AI Agents with Scalable Long-Term Memory},
  author={Chhikara, Prateek and Khant, Dev and Aryan, Saket and Singh, Taranjeet and Yadav, Deshraj},
  journal={arXiv preprint arXiv:2504.19413},
  year={2025}
}
```