CrewAI memory and chatbot

Hey all.

I have a chat app. I’m using crews to respond to user inquiries in the chat.

I would like the crews to provide answers that rely on context from previous questions.

My expectation is that the Memory feature can help with that.

I configured my crew in the following way:

# Imports assumed for this snippet; user_id comes from the chat session
import os

from crewai import Crew
from crewai.memory import EntityMemory, LongTermMemory, ShortTermMemory
from crewai.memory.storage.ltm_sqlite_storage import LTMSQLiteStorage
from crewai.memory.storage.rag_storage import RAGStorage

dynamic_crew = Crew(
    agents=crew_agents,
    tasks=crew_tasks,
    verbose=True,
    memory=True,
    embedder={
        "provider": "google",
        "config": {
            "api_key": os.getenv("GEMINI_API_KEY"),
            "model": "models/text-embedding-004",
        },
    },
    # Long-term memory: one SQLite file per user
    long_term_memory=LongTermMemory(
        storage=LTMSQLiteStorage(
            db_path=f"./memorydb/dynamic_crew_{user_id}/long_term_memory_storage.db"
        )
    ),
    # Short-term memory: RAG storage in the same per-user folder
    short_term_memory=ShortTermMemory(
        storage=RAGStorage(
            type="short_term",
            embedder_config={
                "provider": "google",
                "config": {
                    "api_key": os.getenv("GEMINI_API_KEY"),
                    "model": "models/text-embedding-004",
                },
            },
            path=f"./memorydb/dynamic_crew_{user_id}/",
        )
    ),
    # Entity memory: stored as its own "entities" collection so it does not
    # share a collection with short-term memory
    entity_memory=EntityMemory(
        storage=RAGStorage(
            type="entities",
            embedder_config={
                "provider": "google",
                "config": {
                    "api_key": os.getenv("GEMINI_API_KEY"),
                    "model": "models/text-embedding-004",
                },
            },
            path=f"./memorydb/dynamic_crew_{user_id}/",
        )
    ),
)

As you can see, each user's memory is stored in a separate folder.
The files are created in that folder and memories are written, but I don't see that context being used in the responses.

For example, I ask a simple question: "How old is the Nike brand?" and get a response.
The next question is "What about Columbia?", but the crew's response is not about brand age at all.

Are my expectations wrong? I saw this article from @zinyando that makes sense for my use case, but I'm curious whether I can get the built-in memory to work for it.


I am also interested in this use case

This is a bit difficult to achieve with just the built-in memory; you need a combination of short-term memory (conversation history) and long-term memories to have a proper chatbot that understands the context of the conversation.
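For what it's worth, the usual workaround is to keep the chat history yourself and interpolate it into the task description on every turn, so a follow-up like "What about Columbia?" arrives together with the earlier question and answer. A rough sketch, assuming a build_dynamic_crew helper that wraps your Crew(...) config with per-request tasks; the chat_histories dict and the answer function are just illustrative names, not CrewAI features:

from crewai import Task

# Hypothetical in-process store of per-user conversation turns
chat_histories: dict[str, list[str]] = {}

def answer(user_id: str, question: str) -> str:
    history = chat_histories.setdefault(user_id, [])

    # Put the previous turns directly into the task description,
    # so the follow-up question is interpreted in context.
    task = Task(
        description=(
            "Conversation so far:\n"
            + "\n".join(history[-10:])  # keep only the last few turns
            + f"\n\nAnswer the user's latest question: {question}"
        ),
        expected_output="A concise answer that takes the earlier turns into account.",
        agent=crew_agents[0],
    )

    # build_dynamic_crew is assumed to wrap the Crew(...) config from the
    # first post, but with the tasks passed in per request.
    crew = build_dynamic_crew(user_id, tasks=[task])
    result = crew.kickoff()

    history.append(f"User: {question}")
    history.append(f"Assistant: {result}")
    return str(result)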

This thread is a bit old; did you manage to achieve what you wanted?


@zinyando Frankly, no. I made multiple attempts, but memory is still not working as I would expect.

I just want some clarity about memory in CrewAI. What are the intended use cases, if not enriching context from previous requests?

I'm not sure at what level your solution operates, but there are a couple of technologies for mid-size and enterprise-grade systems that provide long-term memory for specific cases: one works like a cognitive memory within the chat, another like a business-intent memory in a specific product context.

@Serge_B Yeah, I get it, like mem0 or something.
I can use those.
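For reference, wiring something like mem0 around the crew call would look roughly like this. The Memory().add / .search calls follow mem0's quickstart-style API; run_crew is just a placeholder for my own crew invocation, and the result shape may differ between mem0 versions:

from mem0 import Memory

memory = Memory()

def remember_and_ask(user_id: str, question: str) -> str:
    # Pull memories related to the new question for this user
    related = memory.search(question, user_id=user_id)
    # The result shape differs between mem0 versions (list vs {"results": [...]}),
    # so normalize before reading the "memory" field.
    hits = related.get("results", []) if isinstance(related, dict) else related
    context = "\n".join(hit["memory"] for hit in hits)

    # Prepend the retrieved context to whatever prompt/task goes to the crew
    prompt = f"Known context about this conversation:\n{context}\n\nQuestion: {question}"
    answer = run_crew(user_id, prompt)  # placeholder for the crew kickoff

    # Store this turn so the next question can build on it
    memory.add(
        f"User asked: {question}\nAssistant answered: {answer}",
        user_id=user_id,
    )
    return answer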

But I'm still interested in what the goal of the memory feature in CrewAI is. Is it not working as intended, or am I trying to misuse it?