CrewAI not Working with Ollama

I am trying to follow the steps that are mentioned here:

but somehow it is not working with Ollama. I have already tried different Python versions, different Ollama models, and different environments (pip and uv), but I still get the same error message and cannot fix it.

raise Exception(f"An error occurred while running the crew: {e}")

Exception: An error occurred while running the crew: Fallback to LiteLLM is not available

An error occurred while running the crew: Command '['uv', 'run', 'run_crew']' returned non-zero exit status 1.

I get the same error if I simply run this:

from crewai import LLM

llm = LLM(
    model="ollama/llama3.1",
    base_url="http://localhost:11434"
)

There’s been a change with respect to LiteLLM as noted in this issue.

You probably need to add it explicitly like:

uv add litellm
uv sync
uv run run_crew

and you can verify that it is available with:

uv run python -c "import litellm; print('litellm ok')"
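Once litellm is installed, a quick way to confirm the original snippet works end to end is to make a single call through the LLM object (a minimal sketch; the llm.call(...) helper and the prompt string are assumptions, adjust to the crewai version you have installed):

from crewai import LLM

llm = LLM(
    model="ollama/llama3.1",
    base_url="http://localhost:11434"
)

# Assumption: LLM.call accepts a plain prompt string and returns the model's reply.
# If this prints a response, the crew itself should no longer hit the LiteLLM fallback error.
print(llm.call("Reply with the word 'pong'."))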

Thanks a lot, with this the issue is resolved.
