The Problem and Some Theory
Tracing the error, I noticed that when running a `Task`, the `Agent` tries to build context using `ContextualMemory`. This involves fetching user context via `_fetch_user_context`, which in turn calls `self.um.search(query)`. However, `self.um` (which should be an instance of `UserMemory`) is `None`, leading to the `AttributeError: 'NoneType' object has no attribute 'search'` error we're seeing.
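To make that failure concrete, here's a minimal, self-contained sketch of the call path. The stub classes below are hypothetical stand-ins for crewai's real `ContextualMemory` and `UserMemory`; only the attribute and method names mirror the traceback.

```python
# Hypothetical stand-ins, kept just detailed enough to reproduce the failure.
class UserMemoryStub:
    def search(self, query):
        return [f"memory hit for: {query}"]

class ContextualMemorySketch:
    def __init__(self, um):
        self.um = um  # crewai stores its UserMemory instance in self.um

    def _fetch_user_context(self, query):
        # The call that blows up when self.um is None:
        return self.um.search(query)

# With a real memory object, the lookup works:
ok = ContextualMemorySketch(UserMemoryStub())._fetch_user_context("beaches")

# With self.um = None, we get exactly the reported AttributeError:
try:
    ContextualMemorySketch(None)._fetch_user_context("beaches")
except AttributeError as e:
    error_message = str(e)  # "'NoneType' object has no attribute 'search'"
```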
The problem originates during the initialization of the `Crew` class. When the `create_crew_memory` method is called, it does the following:

- It checks `self.memory`. Since we're passing the parameter `memory=True`, this evaluates to `True`.
- It then initializes `_long_term_memory`, `_short_term_memory`, and `_entity_memory`. From what I've seen, this part works correctly and doesn't affect our issue, so let's move on.
- Crucially, it then checks `if self.memory_config and "user_memory" in self.memory_config`, and here lies the bug. If the condition is NOT met (which is the case with the Mem0 example config), it explicitly sets `self._user_memory = None`. This is the direct cause of `self.um` being `None` later on.
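Condensed into runnable form, that decision looks roughly like this. This is a hedged paraphrase of the logic described above, not the actual crewai source:

```python
def create_user_memory(memory, memory_config):
    # Paraphrase of Crew's initialization branch: user memory is only
    # built when the 'user_memory' key is literally present in the config.
    if memory and memory_config and "user_memory" in memory_config:
        return "UserMemory instance"  # placeholder for the real object
    return None  # <- what happens with the plain Mem0 example config

mem0_config = {"provider": "mem0", "config": {"user_id": "John Doe"}}
broken = create_user_memory(True, mem0_config)   # None -> AttributeError later

patched = {**mem0_config, "user_memory": {}}
fixed = create_user_memory(True, patched)        # user memory gets built
```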
After this point, a `ContextualMemory` object is created with `self.crew._short_term_memory`, `self.crew._long_term_memory`, `self.crew._entity_memory`, and `self.crew._user_memory`.
And that, in a nutshell, is how `self.um` becomes `None`, causing the `self.um.search(query)` call to throw the error (it goes without saying this call should probably be wrapped in a try block, right?).
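On that note, here's one way such a defensive guard could look. The function name and fallback behavior are my own sketch, not actual crewai code:

```python
def fetch_user_context(um, query):
    """Return user-memory context, degrading gracefully when memory is absent."""
    if um is None:
        return ""  # no UserMemory configured: skip instead of crashing
    try:
        results = um.search(query)
    except Exception:
        return ""  # a failing memory backend shouldn't kill the whole task
    return "\n".join(str(r) for r in results)
```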
A Workaround for Mem0 (Includes Functional Code)
As you saw above, we need to force the check `if self.memory_config and "user_memory" in self.memory_config` to evaluate to `True`. Therefore, we need to ensure the `user_memory` key is present in the dictionary passed to the `memory_config` parameter.
Here’s the functional code. Hope it helps you guys out.
```python
from crewai import Agent, Task, Crew, LLM, Process
from mem0 import MemoryClient
import os

os.environ['MEM0_API_KEY'] = ''
os.environ['GEMINI_API_KEY'] = ''

# Step 1: Record preferences based on user input
client = MemoryClient()
messages = [
    {
        'role': 'user',
        'content': 'Hi! I\'m planning a vacation and could use some advice.'
    },
    {
        'role': 'assistant',
        'content': 'Hello! I\'d be happy to help with your vacation planning. '
                   'What kind of destination do you prefer?'
    },
    {
        'role': 'user',
        'content': 'I am more of a beach person than a mountain person.'
    },
    {
        'role': 'assistant',
        'content': 'That\'s interesting. Do you like hotels or Airbnb?'
    },
    {
        'role': 'user',
        'content': 'I like Airbnb more.'
    }
]
client.add(messages, user_id='John Doe', output_format='v1.1')

# Step 2: Create a Crew with User Memory
gemini_llm = LLM(
    model='gemini/gemini-2.0-flash',
    temperature=0.7,
    max_tokens=1024
)

travel_planner_agent = Agent(
    role='Travel Planner',
    goal='Create a detailed travel itinerary',
    backstory='Expert in travel logistics',
    llm=gemini_llm,
    verbose=True,
    allow_delegation=False
)

planning_task = Task(
    description=(
        'Plan a weekend trip to {destination} focusing on beaches, '
        'considering the user preference for Airbnb.'
    ),
    expected_output=(
        'A day-by-day itinerary in markdown format, including beach '
        'and activity suggestions.'
    ),
    agent=travel_planner_agent
)

# --- Workaround Explanation ---
#
# The Crew class currently has a bug where it doesn't automatically
# create the UserMemory component when 'provider': 'mem0' is set.
# It incorrectly looks for a 'user_memory' key in the config.
# To trigger the correct initialization path *within* UserMemory,
# we add the 'user_memory': {} key manually. This satisfies the
# faulty check in Crew, causing it to call UserMemory(crew=self),
# which then correctly uses the 'provider' and 'config' details.
crew = Crew(
    agents=[travel_planner_agent],
    tasks=[planning_task],
    verbose=True,
    process=Process.sequential,
    memory=True,
    memory_config={
        'provider': 'mem0',
        'config': {
            'user_id': 'John Doe'
        },
        'user_memory': {}  # Workaround for buggy memory initialization
    },
)

result = crew.kickoff(
    inputs={
        'destination': 'Cancun'
    }
)

print(f'\n🤖 Your Travel Plan:\n\n{result.raw}\n')
```