I am trying to follow the steps that are mentioned here:
but somehow it is not working with Ollama. I have already tried different Python versions, different Ollama models, and different environments (both pip and uv), but I still get the same error message and cannot fix it:
raise Exception(f"An error occurred while running the crew: {e}")
Exception: An error occurred while running the crew: Fallback to LiteLLM is not available
An error occurred while running the crew: Command '['uv', 'run', 'run_crew']' returned non-zero exit status 1.
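For context, this is roughly how I am configuring the local model (a minimal sketch; the model name, host, and port are assumptions based on Ollama's defaults, not my exact project files):

```python
# Minimal CrewAI + Ollama configuration sketch.
# Assumptions: Ollama is running locally on its default port (11434)
# and the model "llama3.1" has already been pulled with `ollama pull`.
from crewai import Agent, LLM

# CrewAI routes model calls through LiteLLM, so the provider prefix
# "ollama/" tells it to talk to a local Ollama server.
llm = LLM(
    model="ollama/llama3.1",          # assumed model name
    base_url="http://localhost:11434",  # Ollama's default endpoint
)

agent = Agent(
    role="Researcher",
    goal="Answer questions using a local model",
    backstory="Runs fully offline against Ollama.",
    llm=llm,
)
```

The "Fallback to LiteLLM is not available" message makes me suspect a missing or incompatible litellm installation rather than the model itself, but I have not been able to confirm that.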