I encountered a similar problem using Ollama 0.6.8, CrewAI 0.119.0, with the qwen3:8b model.
I checked the buggy part of factory.py and found that the “assistant” messages are handled differently from the “user” and “system” messages.
This is the “user” part:
```python
while msg_i < len(messages) and messages[msg_i]["role"] in user_message_types:
    msg_content = messages[msg_i].get("content")
    if msg_content:
        if isinstance(msg_content, list):
            for m in msg_content:
                if m.get("type", "") == "image_url":
                    if isinstance(m["image_url"], str):
                        images.append(m["image_url"])
                    elif isinstance(m["image_url"], dict):
                        images.append(m["image_url"]["url"])
                elif m.get("type", "") == "text":
                    user_content_str += m["text"]
        else:
            # Tool message content will always be a string
            user_content_str += msg_content

    msg_i += 1
```
Notice where the `msg_i += 1` is located. And this is the “assistant” part:
```python
while msg_i < len(messages) and messages[msg_i]["role"] == "assistant":
    assistant_content_str += convert_content_list_to_str(messages[msg_i])
    msg_i += 1

    tool_calls = messages[msg_i].get("tool_calls")
    ollama_tool_calls = []
    if tool_calls:
        for call in tool_calls:
            call_id: str = call["id"]
            function_name: str = call["function"]["name"]
            arguments = json.loads(call["function"]["arguments"])
            ollama_tool_calls.append(
                {
                    "id": call_id,
                    "type": "function",
                    "function": {
                        "name": function_name,
                        "arguments": arguments,
                    },
                }
            )
    if ollama_tool_calls:
        assistant_content_str += (
            f"Tool Calls: {json.dumps(ollama_tool_calls, indent=2)}"
        )

    msg_i += 1
```
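To make the difference concrete, here is a minimal, self-contained sketch (a toy messages list I made up, not the actual litellm call path) of what the early `msg_i += 1` does when the assistant message carrying the tool calls is the last one in the list:

```python
# Toy illustration only: shows why reading tool_calls *after* the early increment
# goes wrong. The message layout below is hypothetical, not taken from CrewAI logs.
messages = [
    {"role": "user", "content": "What is the weather in Paris?"},
    {
        "role": "assistant",
        "content": "",
        "tool_calls": [
            {
                "id": "call_1",
                "function": {"name": "get_weather", "arguments": '{"city": "Paris"}'},
            }
        ],
    },
]

msg_i = 1   # pointing at the assistant message that carries the tool_calls
msg_i += 1  # the early increment, as in the buggy branch above
try:
    tool_calls = messages[msg_i].get("tool_calls")
except IndexError as err:
    # msg_i now equals len(messages), so the lookup fails (or, if another
    # message follows, it silently reads tool_calls from the wrong message).
    print("buggy order:", err)
```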
So I changed the “assistant” part so that it aligns with the “user” part:
```python
while msg_i < len(messages) and messages[msg_i]["role"] == "assistant":
    assistant_content_str += convert_content_list_to_str(messages[msg_i])
    # msg_i += 1
    tool_calls = messages[msg_i].get("tool_calls")
    ollama_tool_calls = []
    ...
    if ollama_tool_calls:
        assistant_content_str += (
            f"Tool Calls: {json.dumps(ollama_tool_calls, indent=2)}"
        )

    msg_i += 1
```
And it has worked well ever since. This appears to be a bug in the factory.py file, i.e. a litellm bug that still needs to be fixed.
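For completeness, here is the same toy message list run with the corrected order (increment at the end of the loop body). This is a simplified, illustrative sketch, not the real litellm loop, so it skips `convert_content_list_to_str` and the argument parsing:

```python
# Toy illustration only: with the increment moved to the end of the loop body,
# tool_calls is read from the assistant message itself.
import json

messages = [
    {"role": "user", "content": "What is the weather in Paris?"},
    {
        "role": "assistant",
        "content": "",
        "tool_calls": [
            {
                "id": "call_1",
                "function": {"name": "get_weather", "arguments": '{"city": "Paris"}'},
            }
        ],
    },
]

msg_i = 1  # pointing at the assistant message
assistant_content_str = ""
while msg_i < len(messages) and messages[msg_i]["role"] == "assistant":
    tool_calls = messages[msg_i].get("tool_calls")  # read before incrementing
    if tool_calls:
        assistant_content_str += f"Tool Calls: {json.dumps(tool_calls, indent=2)}"
    msg_i += 1  # increment only once we are done with this message

print(assistant_content_str)  # now contains the get_weather call
```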