The error occurs because the `LLM` class in the `crewai` package expects a `BASE_URL` key for Ollama's base URL, but the `ENV_VARS` in `constants.py` currently provides `API_BASE` instead. This mismatch is likely due to a recent update to the `LLM` class, which changed the expected key from `API_BASE` to `BASE_URL`.
**Cause:**
In `crewai/cli/constants.py` (line 49), the `ollama` configuration specifies an `API_BASE` key for the base URL. However, the `LLM` class instantiated in `agent.py` (line 195) expects `BASE_URL` instead, so a `TypeError` is raised when `LLM` receives the incorrect keyword argument.
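To make the failure mode concrete, here is a minimal, self-contained sketch. The `LLM` class below is a hypothetical stand-in for crewai's real one (its actual signature is assumed, not copied from the library); the point is only that a constructor which accepts `base_url` will raise `TypeError` when handed `API_BASE`:

```python
# Hypothetical stand-in for crewai's LLM class: the constructor
# accepts base_url but has no API_BASE parameter.
class LLM:
    def __init__(self, model, base_url=None):
        self.model = model
        self.base_url = base_url

try:
    # The config dict supplies API_BASE, so it arrives as an
    # unexpected keyword argument and the constructor raises TypeError.
    LLM(model="ollama/llama3", **{"API_BASE": "http://localhost:11434"})
except TypeError as exc:
    error = exc

print(error)  # ... got an unexpected keyword argument 'API_BASE'
```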
**Temporary Workaround:**
To fix this without editing `constants.py` (which would be overwritten the next time the package updates), you can rename the key dynamically in your project by adding the following code to your `crew.py` file, before the `@CrewBase` class definition:
```python
from crewai.cli.constants import ENV_VARS

# Rename the key so it matches what the LLM class now expects
for entry in ENV_VARS.get("ollama", []):
    if "API_BASE" in entry:
        entry["BASE_URL"] = entry.pop("API_BASE")
```
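If you want to verify the rename without importing crewai, the same loop can be run against a stand-in `ENV_VARS` that mirrors the assumed structure (a dict mapping provider names to lists of config dicts; the entry contents here are illustrative, not copied from the library):

```python
# Stand-in for crewai.cli.constants.ENV_VARS; the real dict has more
# fields per entry, but only the API_BASE key matters for this fix.
ENV_VARS = {
    "ollama": [
        {"API_BASE": "http://localhost:11434"},
    ]
}

# Same rename logic as the workaround above
for entry in ENV_VARS.get("ollama", []):
    if "API_BASE" in entry:
        entry["BASE_URL"] = entry.pop("API_BASE")

print(ENV_VARS["ollama"][0])  # {'BASE_URL': 'http://localhost:11434'}
```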
Let me know if this works for you.