Ollama is dying with latest module update

I updated crewai today (my last update was over the weekend), and CrewAI is now failing on every Ollama call.

 Error during LLM call: litellm.APIConnectionError: list index out of range
Traceback (most recent call last):
  File "PycharmProjects/myproject/.venv/lib/python3.11/site-packages/litellm/main.py", line 2870, in completion
    response = base_llm_http_handler.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "PycharmProjects/myproject/.venv/lib/python3.11/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 270, in completion
    data = provider_config.transform_request(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "PycharmProjects/myproject/.venv/lib/python3.11/site-packages/litellm/llms/ollama/completion/transformation.py", line 322, in transform_request
    modified_prompt = ollama_pt(model=model, messages=messages)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "PycharmProjects/myproject/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py", line 229, in ollama_pt
    tool_calls = messages[msg_i].get("tool_calls")
                 ~~~~~~~~^^^^^^^
IndexError: list index out of range
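The traceback points at `ollama_pt` indexing `messages[msg_i]` after its message-merging loop has advanced `msg_i` past the end of the list. A minimal sketch of that failure pattern (my simplification, not litellm's actual code):

```python
# Sketch of the bug pattern: a loop advances an index while consuming a run
# of messages, then the index is used again without a bounds check.
messages = [{"role": "user", "content": "hi"}]

msg_i = 0
while msg_i < len(messages):
    # consume a run of same-role messages (simplified away here)
    msg_i += 1

# msg_i is now len(messages); indexing without a bounds check blows up,
# exactly like factory.py line 229 in the traceback above.
try:
    tool_calls = messages[msg_i].get("tool_calls")
except IndexError as e:
    print(e)  # prints: list index out of range
```

A `msg_i < len(messages)` guard before that access would avoid the crash, which matches the error surfacing only after the update.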

I ran with litellm debug logging enabled and confirmed that my messages list is not empty.

I also hit the same error while testing the new Qwen3 for RAG pre-processing.

Fixed it by manually installing version 0.6.7-rc0.
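For anyone else pinning the same workaround: assuming the 0.6.7-rc0 above refers to an Ollama server pre-release (my assumption — that version string matches Ollama's release numbering, not crewai's or litellm's), Ollama's official install script supports pinning a version on Linux:

```shell
# Pin a specific Ollama release via the official install script.
# OLLAMA_VERSION is documented in Ollama's Linux install instructions.
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.6.7-rc0 sh
```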