I am getting this error: litellm.exceptions.APIConnectionError: litellm.APIConnectionError: OllamaException - litellm.Timeout: Connection timed out after 600.0 seconds. How do I resolve it? The solution suggested by Cursor does not seem to work at all.
Is the model you want to use loaded? It seems like you might be getting that error because litellm can’t establish a connection with the ollama model.
Just a guess here: try ollama run your_model first and make sure it's working. Then try running your agent again.
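If the model responds there, a quick way to confirm LiteLLM itself can reach Ollama is a direct completion call. A minimal sketch, assuming the default local endpoint and a model named llama3 (swap in whatever ollama list shows):

```python
# Minimal connectivity check: call the local Ollama server through LiteLLM directly.
# "ollama/llama3" and the base URL are assumptions; use your own model name.
from litellm import completion

response = completion(
    model="ollama/llama3",              # the "ollama/" prefix routes the call to Ollama
    messages=[{"role": "user", "content": "Say hello"}],
    api_base="http://localhost:11434",  # Ollama's default endpoint
)
print(response.choices[0].message.content)
```

If this times out as well, the problem is between LiteLLM and Ollama rather than in your CrewAI setup.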
How complex is your project and what are you trying to do? Also, which CrewAI features do you need? LiteLLM was introduced in CrewAI 0.6x, so if you don't need all the new bells and whistles, maybe downgrade. Like the other comment says, I would also make sure your endpoint is healthy and has the model available.
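For that endpoint check, here is a rough sketch of wiring a CrewAI agent to a local Ollama model explicitly, assuming a recent CrewAI version that ships the LLM class; the model name, base URL, and timeout are placeholders to adjust:

```python
# Sketch of pointing a CrewAI agent at a local Ollama model explicitly.
# Model name, base_url, and timeout are illustrative; match your own setup.
from crewai import Agent, LLM

llm = LLM(
    model="ollama/llama3",              # must match a model from `ollama list`
    base_url="http://localhost:11434",  # Ollama's default endpoint
    timeout=600,                        # raise this if the first call is slow while the model loads
)

agent = Agent(
    role="Test agent",
    goal="Confirm the local model responds",
    backstory="A throwaway agent used only to check connectivity",
    llm=llm,
)
```

If this works but your real crew still times out, the issue is more likely in how the agents' LLM is configured than in the Ollama server itself.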