Hi there, what if I am using a local LLM such as:
llm_deepseek = Ollama(
    model="deepseek-r1:1.5b",
    base_url="http://localhost:11434",
    temperature=0.7,
)
… and got this error when assigning the LLM to the agent:
ERROR:root:Failed to get supported params: argument of type 'NoneType' is not iterable
Do you have more details?
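Before digging into that traceback, it may be worth ruling out the obvious: that the Ollama server is actually reachable at the `base_url` you configured. A minimal stdlib-only check, assuming Ollama's `/api/tags` endpoint (its model-listing route); the helper name is mine, not part of any library:

```python
import json
import urllib.request
from urllib.error import URLError

def ollama_reachable(base_url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers at base_url.

    Queries /api/tags, Ollama's endpoint for listing pulled models,
    and treats any connection failure as "not reachable".
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=3) as resp:
            data = json.load(resp)
            # A healthy server responds with {"models": [...]}.
            return "models" in data
    except (URLError, OSError, ValueError):
        return False

print(ollama_reachable())  # False unless a local Ollama server is up
```

If this prints False, the error is a connectivity/config problem rather than anything in the agent code.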
I also have problems. It worked only once, and then started giving me an error. Like this:
from crewai import Agent, Crew, Process, Task, LLM
…
agent_1 = Agent(
    role="Trump",
    goal="Analyze America, make it great again.",
    backstory="""
    You are an expert in everything, like a lot
    """,
    tools=[MyCustomDuckDuckGoTool(), ScrapeWebsiteTool()],
    👉🏻 llm=LLM(model="ollama/deepseek-r1:7b", temperature=0.3),
)
Is this working? It never worked for me.
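One thing worth knowing about that model string: as far as I can tell, crewai's LLM class routes through LiteLLM, which expects a "provider/model" prefix, so "ollama/deepseek-r1:7b" means provider ollama, model deepseek-r1:7b. A small sketch of that naming convention (this parser is my own illustration, not LiteLLM's actual code):

```python
def split_model_string(model: str) -> tuple[str, str]:
    """Split a LiteLLM-style model string into (provider, model name).

    'ollama/deepseek-r1:7b' -> ('ollama', 'deepseek-r1:7b').
    A bare name like 'gpt-4o' has no provider prefix.
    """
    provider, sep, name = model.partition("/")
    if not sep:
        # No '/' present: treat the whole string as the model name.
        return "", model
    return provider, name

print(split_model_string("ollama/deepseek-r1:7b"))  # ('ollama', 'deepseek-r1:7b')
```

Dropping the "ollama/" prefix is a common cause of the router failing to resolve provider parameters, so it's worth double-checking.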
Are you using a single-file crew (MyAwesomeCrew.py)?
OR
Are you using a multi-file setup (peomcrew.py, main.py, agents.yaml, tasks.yaml, etc.)?
I am now integrating the LLM into the agent, and it is like this:
sales_rep_agent = Agent(
    role="Sales Representative",
    goal="Identify high-value leads that match "
         "our ideal customer profile",
    backstory=(
        "As a part of the dynamic sales team at CrewAI, "
        "your mission is to scour "
        "the digital landscape for potential leads. "
        "Armed with cutting-edge tools "
        "and a strategic mindset, you analyze data, "
        "trends, and interactions to "
        "unearth opportunities that others might overlook. "
        "Your work is crucial in paving the way "
        "for meaningful engagements and driving the company's growth."
    ),
    allow_delegation=False,
    verbose=True,
    llm=LLM(model="ollama/deepseek-r1:1.5b", base_url="http://localhost:11434"),
)
However, the output is not as accurate as I would like, likely because I am using the smallest available model.
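If you expect to keep swapping model sizes while testing (1.5b vs 7b), it can help to pull the model name and base URL from the environment instead of hard-coding them in the agent. A small sketch; the env-var names OLLAMA_MODEL and OLLAMA_BASE_URL are my own choice here, not a crewai convention:

```python
import os

# Fall back to the smallest model and Ollama's default port
# when the variables are unset.
model = os.environ.get("OLLAMA_MODEL", "ollama/deepseek-r1:1.5b")
base_url = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")

print(model, base_url)
```

Passing these into `LLM(model=model, base_url=base_url)` lets you trade up to deepseek-r1:7b without touching the agent code.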