Hey everyone,
I set up CrewAI with some basic tools and agents using a local LLM via the Docker Model Runner, and everything like "crewai run" works fine.
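For context, the agents' LLM is wired up roughly like this in crew.py (the endpoint, port, and dummy api_key are just what I use for the Docker Model Runner's OpenAI-compatible API on my machine, so adjust for your own setup):

```python
from crewai import Agent, LLM

# Local model served by the Docker Model Runner through its OpenAI-compatible API.
# base_url/port reflect my local setup; api_key is a placeholder because the
# local endpoint doesn't validate it.
local_llm = LLM(
    model="openai/ai/gemma3-qat:latest",
    base_url="http://localhost:12434/engines/v1",
    api_key="not-needed",
)

researcher = Agent(
    role="Researcher",
    goal="Dig into the latest AI developments",
    backstory="An analyst who summarizes technical topics",
    llm=local_llm,
)
```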
But when using crewai chat, I get an error:
PS C:\Users\user\Documents\crewai\latest_ai_development> crewai chat
Starting a conversation with the Crew
Type 'exit' or Ctrl+C to quit.
Analyzing crew and required inputs - this may take 3 to 30 seconds depending on the complexity of your crew.
┌──────────────────── LLM Error ────────────────────┐
│ LLM Call Failed                                   │
│ Error:                                            │
└───────────────────────────────────────────────────┘
Traceback (most recent call last):
  File "C:\Users\user\AppData\Roaming\uv\tools\crewai\Lib\site-packages\openai\_client.py", line 130, in __init__
    raise OpenAIError(
        "The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable"
    )
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

During handling of the above exception, another exception occurred:
Traceback (most recent call last):
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\user\.local\bin\crewai.exe\__main__.py", line 10, in <module>
    sys.exit(crewai())
    ~~~~~~^^
  File "C:\Users\user\AppData\Roaming\uv\tools\crewai\Lib\site-packages\click\core.py", line 1442, in __call__
    return self.main(*args, **kwargs)
           ~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "C:\Users\user\AppData\Roaming\uv\tools\crewai\Lib\site-packages\click\core.py", line 1363, in main
    rv = self.invoke(ctx)
  File "C:\Users\user\AppData\Roaming\uv\tools\crewai\Lib\site-packages\click\core.py", line 1830, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
           ~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^
  File "C:\Users\user\AppData\Roaming\uv\tools\crewai\Lib\site-packages\click\core.py", line 1226, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\user\AppData\Roaming\uv\tools\crewai\Lib\site-packages\click\core.py", line 794, in invoke
    return callback(*args, **kwargs)
  File "C:\Users\user\AppData\Roaming\uv\tools\crewai\Lib\site-packages\crewai\cli\cli.py", line 363, in chat
    run_chat()
    ~~~~~~~~^^
  File "C:\Users\user\AppData\Roaming\uv\tools\crewai\Lib\site-packages\crewai\cli\crew_chat.py", line 81, in run_chat
    crew_chat_inputs = generate_crew_chat_inputs(crew, crew_name, chat_llm)
  File "C:\Users\user\AppData\Roaming\uv\tools\crewai\Lib\site-packages\crewai\cli\crew_chat.py", line 387, in generate_crew_chat_inputs
    description = generate_input_description_with_ai(input_name, crew, chat_llm)
  File "C:\Users\user\AppData\Roaming\uv\tools\crewai\Lib\site-packages\crewai\cli\crew_chat.py", line 481, in generate_input_description_with_ai
    response = chat_llm.call(messages=[{"role": "user", "content": prompt}])
  File "C:\Users\user\AppData\Roaming\uv\tools\crewai\Lib\site-packages\crewai\llm.py", line 977, in call
    return self._handle_non_streaming_response(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        params, callbacks, available_functions, from_task, from_agent
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
I followed the docs by putting chat_llm="openai/ai/gemma3-qat:latest" in my @crew, but it won't work.
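The relevant part of my crew.py looks roughly like this (simplified from the default project scaffold; the @agent / @task methods are omitted here):

```python
from crewai import Crew, Process
from crewai.project import CrewBase, crew


@CrewBase
class LatestAiDevelopment:
    # ... @agent and @task definitions omitted ...

    @crew
    def crew(self) -> Crew:
        return Crew(
            agents=self.agents,
            tasks=self.tasks,
            process=Process.sequential,
            verbose=True,
            chat_llm="openai/ai/gemma3-qat:latest",  # set as described in the crewai chat docs
        )
```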
Thanks in advance for the help!