CrewAI Chat LiteLLM error

When I use the CrewAI chat feature I get the following error. I have the OpenAI API key in a `.env` file, so I do not understand why this error appears. Any help is highly appreciated. `crewai run` works fine, but `crewai chat` does not.

`crewai chat`

Starting a conversation with the Crew
Type 'exit' or Ctrl+C to quit.

Analyzing crew and required inputs - this may take 3 to 30 seconds depending on the complexity of your crew.
.

LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.

ERROR:root:LiteLLM call failed: litellm.AuthenticationError: AuthenticationError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

AuthenticationError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

Hello there,

I went through your issue.

It looks like the problem is arising because the API key is not actually set in the environment.

You mentioned that you placed the API key in the `.env` file, but it also needs to be loaded into the process environment, either by reading it with the `os` module (together with something like `python-dotenv`) inside the crew file, or by setting the variable directly in the CLI terminal. Make sure you have checked both of these; see the sketch below.
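For example, here is a minimal sketch of loading the key inside your crew file, assuming the `.env` file sits in the project root and the `python-dotenv` package is installed (it usually is in CrewAI projects):

```python
import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

# Load variables from the project's .env file into the process environment
load_dotenv()

# Fail fast with a clear message if the key still is not visible
if not os.getenv("OPENAI_API_KEY"):
    raise RuntimeError("OPENAI_API_KEY is not set; check your .env file")
```

Alternatively, exporting the variable in the same terminal session before running `crewai chat` (for example `export OPENAI_API_KEY=...` on macOS/Linux) avoids relying on the `.env` file being picked up at all.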

Hope this gives you better clarity.

Thank you for the response. The setup works when I use the `crewai run` command, with no error message and no problem there, but when I use `crewai chat` I get this error, which is puzzling.
