I’ve been working on a small project that needs multiple LLM calls. I’ve been using the Together API, but my credits run out within a day, and I want an alternative. I have tried OpenRouter with deepseek-r1 following the CrewAI docs; it does not work. Hugging Face the same way, no luck. I’ve also been trying LM Studio for a local LLM, and that isn’t working either.
Please let me know if any of you have a solution.
This is how I’ve been defining the LLM that I pass as llm=llm in the agent definition:
# OpenRouter attempt
llm = LLM(
    model="openrouter/deepseek/deepseek-r1",
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ.get("OPENROUTER_API_KEY")
)

# LM Studio (local) attempt
llm = LLM(
    model="google/gemma-3-12b",
    base_url="http://127.0.0.1:1234"
)
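For reference, here is a sketch of the LM Studio variant I understand is sometimes suggested. It assumes CrewAI routes calls through LiteLLM, which generally expects a provider prefix on the model name and an OpenAI-compatible `/v1` suffix on the base URL; the exact prefix and key value here are assumptions on my part, not something I've confirmed:

```python
import os
from crewai import LLM

# Sketch only: LM Studio exposes an OpenAI-compatible server, so the model
# name may need an "openai/" prefix and the base_url a "/v1" suffix.
llm = LLM(
    model="openai/google/gemma-3-12b",    # provider prefix is an assumption
    base_url="http://127.0.0.1:1234/v1",  # "/v1" suffix is an assumption
    api_key="lm-studio",                  # placeholder; LM Studio ignores the key
)
```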
I’ve been getting the error below:
An unknown error occurred. Please check the details below.
Traceback (most recent call last):
File "/Users/dummyuser/crewai2/hugging_face_trial_local.py", line 57, in <module>
result = crew.kickoff(inputs={"user_input": user_input})
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/crew.py", line 659, in kickoff
result = self._run_sequential_process()
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/crew.py", line 768, in _run_sequential_process
return self._execute_tasks(self.tasks)
~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/crew.py", line 871, in _execute_tasks
task_output = task.execute_sync(
agent=agent_to_use,
context=context,
tools=cast(List[BaseTool], tools_for_task),
)
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/task.py", line 351, in execute_sync
return self._execute_core(agent, context, tools)
~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/task.py", line 499, in _execute_core
raise e # Re-raise the exception after emitting the event
^^^^^^^
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/task.py", line 415, in _execute_core
result = agent.execute_task(
task=self,
context=context,
tools=tools,
)
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/agent.py", line 435, in execute_task
raise e
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/agent.py", line 411, in execute_task
result = self._execute_without_timeout(task_prompt, task)
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/agent.py", line 507, in _execute_without_timeout
return self.agent_executor.invoke(
~~~~~~~~~~~~~~~~~~~~~~~~~~^
{
^
...<4 lines>...
}
^
)["output"]
^
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/agents/crew_agent_executor.py", line 121, in invoke
raise e
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/agents/crew_agent_executor.py", line 110, in invoke
formatted_answer = self._invoke_loop()
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/agents/crew_agent_executor.py", line 206, in _invoke_loop
raise e
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/agents/crew_agent_executor.py", line 153, in _invoke_loop
answer = get_llm_response(
llm=self.llm,
...<2 lines>...
printer=self._printer,
)
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/utilities/agent_utils.py", line 160, in get_llm_response
raise e
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/utilities/agent_utils.py", line 151, in get_llm_response
answer = llm.call(
messages,
callbacks=callbacks,
)
File "/Users/dummyuser/crewai2/venv/lib/python3.13/site-packages/crewai/llm.py", line 956, in call
return self._handle_non_streaming_response(
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
params, callbacks, available_functions
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
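Since the traceback ends inside crewai/llm.py without showing the provider's actual response, one thing I'm trying is turning on debug logging for LiteLLM (assuming that's what CrewAI uses under the hood) to surface the underlying HTTP error hidden behind "An unknown error occurred":

```python
import os

# Assumption: CrewAI dispatches provider calls through LiteLLM, which reads
# this environment variable. Set it before importing crewai/litellm so the
# raw request/response details get logged instead of being swallowed.
os.environ["LITELLM_LOG"] = "DEBUG"
```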