I’m having a problem using memory in my agent system. My agent returns its response correctly, but right after that I receive this error:
Failed to add to long term memory: Failed to convert text into a Pydantic model due to the following error: litellm.AuthenticationError: geminiException - {
  "error": {
    "code": 400,
    "message": "API key not valid. Please pass a valid API key.",
    "status": "INVALID_ARGUMENT",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.ErrorInfo",
        "reason": "API_KEY_INVALID",
        "domain": "googleapis.com",
        "metadata": {
          "service": "generativelanguage.googleapis.com"
        }
      },
      {
        "@type": "type.googleapis.com/google.rpc.LocalizedMessage",
        "locale": "en-US",
        "message": "API key not valid. Please pass a valid API key."
      }
    ]
  }
}
When I swap the Gemini key out for an OpenAI one, this problem does not occur.
To use memory I had to configure an embedder model; I chose an Ollama model with an embedding dimension of 1536, because without it I got a different error.
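For context, this is roughly how my crew is set up (a simplified sketch: the agent/task definitions and the Ollama embedding model name are placeholders, not my exact project code):

```python
import os
from crewai import Agent, Task, Crew

# LiteLLM's Gemini (AI Studio) provider reads GEMINI_API_KEY;
# the same key is also what the memory/conversion step seems to need.
os.environ["GEMINI_API_KEY"] = "<your-google-ai-studio-key>"

researcher = Agent(
    role="Researcher",
    goal="Answer the user's question",
    backstory="Placeholder agent used to reproduce the memory issue.",
    llm="gemini/gemini-1.5-flash",  # LiteLLM-style model string
)

task = Task(
    description="Answer: {question}",
    expected_output="A short answer.",
    agent=researcher,
)

crew = Crew(
    agents=[researcher],
    tasks=[task],
    memory=True,  # enables short-term and long-term memory
    # Embedder config passed to the memory layer; the model name is a
    # placeholder -- pick one whose embedding dimension matches your
    # vector store (1536 in my case).
    embedder={
        "provider": "ollama",
        "config": {"model": "<ollama-embedding-model>"},
    },
)

result = crew.kickoff(inputs={"question": "What is long-term memory in CrewAI?"})
print(result)
```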
Same issue here:
Failed to add to long term memory: Failed to convert text into a Pydantic model due to error: litellm.AuthenticationError: geminiException - {
  "error": {
    "code": 400,
    "message": "API key not valid. Please pass a valid API key.",
    "status": "INVALID_ARGUMENT",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.ErrorInfo",
        "reason": "API_KEY_INVALID",
        "domain": "googleapis.com",
        "metadata": {
          "service": "generativelanguage.googleapis.com"
        }
      },
      {
        "@type": "type.googleapis.com/google.rpc.LocalizedMessage",
        "locale": "en-US",
        "message": "API key not valid. Please pass a valid API key."
      }
    ]
  }
}
This situation (and other related ones) happens because CrewAI:
- uses the LiteLLM library for agent communication with LLMs, and
- uses the EmbedChain library for its RAG tools.

That is where the conflicting points arise, and it becomes clear when you try to customize the RAG tools. So, according to each library's documentation, we have to: