LiteLLM console messages when not using LiteLLM

I'm using the latest versions of crewAI, duckduckgo_search, ollama (local), and langchain. I'd like to understand how to resolve or suppress this output.

I'm getting the following output in the console:

```
Give Feedback / Get Help: Sign in to GitHub · GitHub
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.

Provider List: Providers | liteLLM
```

Added feedback from further testing:

This was tied to the LLM set on the task and agent, even if it was not used. After commenting out the local LLM (Ollama) on those tasks and agents, the only remaining error I get for the LLM on my local Ollama instance (which worked in 0.83.0) is this text in the agent console output:

```
Give Feedback / Get Help: Sign in to GitHub · GitHub
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.

Provider List: Providers | liteLLM
```

Is this because I'm using a local LLM instead of an external service, or is something else going on? I have no configuration for BerriAI or any other hosted AI service, only the local Ollama instance.
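For anyone hitting the same banner: one possible mitigation, purely as a sketch, is to raise the log level of LiteLLM's logger before kicking off the crew. This assumes LiteLLM emits its info messages through a logger named `"LiteLLM"`; the "Give Feedback / Provider List" banner itself may instead be a direct print controlled by the `litellm.suppress_debug_info` flag, so that line is noted in a comment rather than asserted as the fix.

```python
import logging

# Sketch: silence LiteLLM's informational logging before running the crew.
# Assumption: LiteLLM logs under the logger name "LiteLLM".
logging.getLogger("LiteLLM").setLevel(logging.ERROR)

# If the banner still appears, it may be printed directly rather than logged;
# in that case, after `import litellm`, setting
#     litellm.suppress_debug_info = True
# is reported to suppress the "Give Feedback / Provider List" lines.
```

This only quiets the console output; it does not address why crewAI routes calls through LiteLLM even when only a local Ollama model is configured.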
