Mem0 + CrewAI User Personalization

I’m building a chatbot with CrewAI, using Mem0 as an external memory layer to personalize user interactions. My goal is to remember factual information about the user.

However, I believe that User Memory recently got deprecated. After reading through the code, I found that only agents’ outputs are saved into external memory.

I want human inputs to be processed into the Mem0 facts collection as well. How do I do that? Can I manually push them into Mem0 somewhere? And how would I then add them to contextual memory?

First off, welcome to the community, Anya! I’ll take this opportunity to document a new use case involving Mem0 integration with CrewAI.

The Error:

The earlier memory implementation using Mem0 had a bug, which we discussed in this thread. So, yesterday, version 0.114.0 was released, along with an example showcasing the new ExternalMemory usage. When I ran their example code, I got the following:

ValidationError: 1 validation error for Crew
  Value error, Please provide an OpenAI API key.

Great, we went from a buggy implementation (which required tricking the system) to a weird one (that has hidden dependencies). That’s progress, right? :smirking_face:

Where the Error Comes From:

Digging into the crewai/crew.py file, particularly the create_crew_memory validator, I noticed that this flag initializes several internal memory components like ShortTermMemory, LongTermMemory, and EntityMemory — even if you only want to use ExternalMemory.

And these other memory components rely on embeddings. So, you’re forced to provide an embedder configuration in your Crew setup to keep those components happy. Disclaimer: Honestly, I haven’t double-checked if that embedder gets configured automatically when the OPENAI_API_KEY environment variable is set. Since I ran into the error above, I’m documenting it here just in case.

Solution (Embedder Required for Other Memory Modules):

Below you’ll find a working example using Gemini, which also remembers user information.

from crewai import Agent, Task, Crew, LLM, Process
from crewai.memory.external.external_memory import ExternalMemory
import os

os.environ["MEM0_API_KEY"] = "YOUR-KEY"
os.environ["GEMINI_API_KEY"] = "YOUR-KEY"

gemini_llm = LLM(
    model='gemini/gemini-2.0-flash',
    temperature=0.5,
    max_tokens=1024
)

chatbot_agent = Agent(
    role="Friendly Chatbot",
    goal="Respond kindly to the user in a single paragraph.",
    backstory="A helpful AI assistant for brief interactions.",
    llm=gemini_llm,
    verbose=False,
    allow_delegation=False
)

chat_task = Task(
    description=(
        "Process the user query: '{user_question}' "
        "and provide a friendly response."
    ),
    expected_output="A single paragraph, friendly response.",
    agent=chatbot_agent
)

crew = Crew(
    agents=[chatbot_agent],
    tasks=[chat_task],
    verbose=False,
    process=Process.sequential,
    memory=True,
    embedder={
        "provider": "google",
        "config": {
            "model": "models/text-embedding-004",
            "api_key": os.environ["GEMINI_API_KEY"]
        }
    },
    external_memory=ExternalMemory(
        embedder_config={
            "provider": "mem0",
            "config": {
                "user_id": "MadMax"
            }
        }
    )
)

user_inputs = [
    "Hi, my name is Max Moura!",
    "I'm a fisherman.",
    "I really like soccer.",
    "Hey, what do you know about me?",
]

for user_input in user_inputs:
    result = crew.kickoff(
        inputs={
            "user_question": user_input
        }
    )
    print(f'\n👽 User:  {user_input}')
    print(f'🤖 Chatbot:  {result.raw}')

Hope this example helps you get closer to a solution!

Hello Max! Sorry if I wasn’t clear enough.

If we look at the code, we can see that CrewAI saves the task output, the task description, and some metadata into external memory, but not the user input. However, since I am developing a chatbot, I would like to save the user’s input into external memory, so that Mem0 can extract factual information from it, rather than storing each agent’s output (only the final one). Would that be possible?

Anya, when I run the code above, take a look at what happens in Mem0 (pay close attention to the highlighted part):

You’ll notice that it saved information I provided through my Crew (chat) input, linked that data to the example user in Mem0, and then pulled up those personal details in my last interaction.

Of course, you can also add information manually, just like it was done in that other thread I mentioned earlier:

from mem0 import MemoryClient
import os

os.environ['MEM0_API_KEY'] = 'YOUR-KEY'

client = MemoryClient()
messages = [
    {
        'role': 'user', 
        'content': 'Hi! I\'m planning a vacation and could use some advice.'
    },
    {
        'role': 'assistant', 
        'content': 'Hello! I\'d be happy to help with your vacation planning. '
                   'What kind of destination do you prefer?'
    },
    {
        'role': 'user', 
        'content': 'I am more of a beach person than a mountain person.'
    },
    {
        'role': 'assistant', 
        'content': 'That\'s interesting. Do you like hotels or Airbnb?'
    },
    {
        'role': 'user', 
        'content': 'I like Airbnb more.'
    }
]
client.add(messages, user_id='John Doe', output_format='v1.1')
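Once memories are stored, you can also query them back to check what Mem0 actually extracted. Here's a minimal sketch; `fetch_user_facts` is a hypothetical helper I'm introducing, and it assumes the hosted `MemoryClient`'s `search()` returns a list of dicts with a `'memory'` key. The client is passed in as a parameter so the helper can be exercised without network access.

```python
import os

def fetch_user_facts(client, user_id, query="user preferences"):
    """Return the plain-text memories Mem0 holds for a user.

    `client` is anything exposing search(query, user_id=...) that yields
    dicts with a 'memory' key (the shape the hosted MemoryClient uses).
    """
    return [hit["memory"] for hit in client.search(query, user_id=user_id)]

# Only hit the real API when a key is configured.
if os.environ.get("MEM0_API_KEY"):
    from mem0 import MemoryClient
    for fact in fetch_user_facts(MemoryClient(), "John Doe"):
        print(fact)
```

For the conversation above, you'd expect Mem0 to have distilled facts like the beach and Airbnb preferences, not the raw message transcript.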

Maybe if you could share a snippet of your use case, it would be easier for us to collaborate and help out.


One thing to note is that Mem0 doesn’t save everything a user sends to it. For example, sending “hello” won’t create a Mem0 memory.

You need to send messages that are actually memory-worthy, such as preferences, e.g., “I enjoy watching football.” Not saying that’s what’s happening here, but it might be related.


Hello Max, thanks for reporting this issue related to embedding.
Honestly, we didn’t notice this issue. The OpenAI API key is required because we need at least one LLM running, and OpenAI is the default.
Regarding the embedder issue, I will take a look to ensure this is the desired behavior.


Well, we have a couple of key points here:

  1. Currently, memories cannot be defined in an isolated way.
  2. You got an error saying the OpenAI API key is not defined because it is required for long-term memory (embedding).

The second point is a side effect of the first, right? Given that, I’m planning to push a PR to support easier memory definitions by next Monday.

Feel free to report those issues on the GitHub repo; really appreciate that!

ty!