Hi CrewAI community,
I recently released memweave, an open-source Python library for AI agent memory, and wanted to share it here and discuss how it could integrate with CrewAI agents.
What it does:
- Stores agent memories as plain Markdown files — readable, editable, git diff-able
- Hybrid search: BM25 keyword search combined with vector search (sqlite-vec)
- Zero servers, zero setup — a single SQLite file on disk
- Zero LLM calls on write/search operations
- Works fully offline — falls back to keyword search if the embedding API is down
- Pluggable search strategies and post-processors
Quick example (the import path is assumed here; run it inside an async function):

```python
from memweave import MemWeave, MemoryConfig

async with MemWeave(MemoryConfig(workspace_dir=".")) as mem:
    await mem.index()
    results = await mem.search("user preferences", min_score=0.0)
```
Why it’s relevant for CrewAI:
CrewAI agents could use memweave as a persistent memory backend — each agent writing to its own namespace (memory/agents/researcher/) while sharing a common workspace, with the coordinator searching across all agents’ memories.
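To make that layout concrete, here is a minimal sketch of the on-disk structure only — the directory names beyond memory/agents/researcher/ are hypothetical, and memweave's indexing/search API is replaced with plain stdlib file operations for illustration:

```python
from pathlib import Path

WORKSPACE = Path("workspace")

def write_memory(agent: str, name: str, text: str) -> Path:
    # Each agent writes to its own namespace, memory/agents/<agent>/,
    # as plain Markdown files (readable, editable, git diff-able).
    note = WORKSPACE / "memory" / "agents" / agent / f"{name}.md"
    note.parent.mkdir(parents=True, exist_ok=True)
    note.write_text(text)
    return note

# Two agents persist notes into their own namespaces...
write_memory("researcher", "findings", "# Findings\n- sqlite-vec handles vectors\n")
write_memory("writer", "style", "# Style notes\n- Prefer short sentences\n")

# ...while a coordinator can see every agent's memory in the shared workspace
# (in memweave itself this would be a search over the whole workspace).
all_notes = sorted((WORKSPACE / "memory" / "agents").glob("*/*.md"))
print([str(p) for p in all_notes])
```

The point of the plain-file layout is that the coordinator needs no special protocol to read across agents: a workspace-wide search (or even a glob, as above) covers all namespaces.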
Looking for:
- Feedback from the community on the approach
- Thoughts from the CrewAI team on potential integration patterns
- Anyone already building multi-agent memory systems who wants to collaborate
Happy to discuss and answer any questions!