Ollama stopped working since update to CrewAI 0.60.0

With Ollama you can still use LiteLLM; you just need to point the base URL at your local Ollama server, for example via the `api_base` setting:

```
api_base="http://localhost:11434"
```
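Since CrewAI 0.60.0 routes model calls through LiteLLM, you can verify the connection works with a direct LiteLLM call. Here is a minimal sketch; the model name `ollama/llama3` is just an example, substitute whatever model you have pulled locally:

```python
# Minimal sketch: call a local Ollama server through LiteLLM.
import litellm

response = litellm.completion(
    model="ollama/llama3",              # "ollama/" prefix routes to the Ollama provider
    messages=[{"role": "user", "content": "Hello from CrewAI!"}],
    api_base="http://localhost:11434",  # default Ollama port
)
print(response.choices[0].message.content)
```

If this works but your crew still fails, the issue is in how the base URL is passed to CrewAI rather than in Ollama itself.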
