CrewBase not finding LLM

I am running into this issue with the new version of CrewAI.
I never had it with the version used in some of the tutorials.


error:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/ru/Documents/personal_agents/src/press_office/main.py", line 24, in run
    PressOfficeCrew().crew().kickoff(inputs=inputs)
  File "/home/ru/Documents/personal_agents/.venv/lib/python3.10/site-packages/crewai/project/crew_base.py", line 34, in __init__
    self.map_all_agent_variables()
  File "/home/ru/Documents/personal_agents/.venv/lib/python3.10/site-packages/crewai/project/crew_base.py", line 73, in map_all_agent_variables
    self._map_agent_variables(
  File "/home/ru/Documents/personal_agents/.venv/lib/python3.10/site-packages/crewai/project/crew_base.py", line 93, in _map_agent_variables
    if llm := agent_info.get("llm"):
AttributeError: 'NoneType' object has no attribute 'get'

Any chance you can tell me how I am calling the Agent wrong? My YAML agent file is sometimes also not parsed correctly.
I am wondering if some big change happened and I am just using it the wrong way.

Please let me know; I want to keep learning and using this. It's a great tool.

Many greetings

Others seem to be having similar issues: @moto, can you assist?

What happens when you just use llm=llm in the agent def?
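For instance, something like this: a minimal sketch assuming a local Ollama server, where the model name, URL, and agent fields are all illustrative (the import is guarded so the snippet degrades gracefully if crewai isn't installed):

```python
# Sketch of passing an explicit llm to the agent definition.
# Model name and base_url are illustrative values for a local Ollama server.
try:
    from crewai import Agent, LLM
except ImportError:  # crewai not installed; skip the sketch gracefully
    Agent = LLM = None

if LLM is not None:
    llm = LLM(model="ollama/llama3.1", base_url="http://localhost:11434")
    agent = Agent(
        role="Press Officer",
        goal="Draft press releases",
        backstory="An experienced communicator",
        llm=llm,  # explicit llm=llm in the agent definition
    )
```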

Nothing, exact same behavior.

Hi @Ruben_Casillas,
Let me see if I can help :hand_with_index_finger_and_thumb_crossed:

It ‘smells’ like an issue that people have been having with many LLMs, local & remote, since CrewAI started to use LiteLLM. Check that link and have a ‘quick’ read of the README, list of supported models, etc.
I’ll keep looking to see whether connecting to local models requires a config change, or a change in how you connect.
I’ll update here.
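In the meantime, one thing worth checking: that AttributeError usually means an entry in your agents.yaml parsed to None, e.g. a key with no indented body, so agent_info is None by the time .get("llm") is called. A rough sketch of what CrewBase ends up iterating over (the agent names here are made up):

```python
# Simulates the dict CrewBase gets after loading agents.yaml.
# A YAML key with no indented body parses to None, so agent_info.get("llm")
# raises AttributeError: 'NoneType' object has no attribute 'get'.
agents_config = {
    "press_officer": {          # well-formed entry
        "role": "Press Officer",
        "goal": "Draft press releases",
    },
    "researcher": None,         # key present, but body empty or mis-indented
}

for name, agent_info in agents_config.items():
    if agent_info is None:
        print(f"Broken entry in agents.yaml: {name}")
```

So before anything else, check that every agent name in agents.yaml has a properly indented block under it.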

This thread discusses the same issue & has a solution.

Best of luck; come back to me if you still have issues.

I tried the solution, but it had no effect.
I use a local Ollama with Llama3.1, and it worked perfectly before the update.
The Llama models are supported in LiteLLM, and I changed the name to api_base instead of base_url.
Nothing changed for me.

(The traceback is identical to the one in my first post.)

Another thread with the same/similar issue:

Let’s try again :hand_with_index_finger_and_thumb_crossed:
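Since your traceback shows CrewBase reading an llm key from the agent config, another thing to try is naming the model in LiteLLM's provider/model format directly in agents.yaml. A sketch (the agent name and fields are illustrative, not from your code):

```yaml
press_officer:
  role: Press Officer
  goal: Draft press releases
  backstory: An experienced communicator
  llm: ollama/llama3.1   # LiteLLM provider/model naming
```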

Perhaps we are trying in the wrong place; the behavior is the same.
:frowning:
Any other ideas? I'm happy to try anything.

I’ll see who is about, as I’m new to CrewAI myself (3 months), and see if I can get you some help.

UPDATE: the method is ‘get’! Are you using function calling on the LLM?
Some people still have this issue with function calling on a local Ollama.

2 mins

I am willing to put some hours as well, I am also new in the scene but would like to dive in.
If I can help, let me know

No, I also did not understand why.
I thought it was more related to the API.

I think this is one that @matt will need to look at for you.

FYI: in that code you have llm assigned twice! Maybe rename one to llmC for chat.

I’ll keep asking and looking for you. I know how frustrating it can be just sitting there looking at your code when it’s not working and you have no answers.

We’ll get there!

@Ruben_Casillas Can you do me a favor and zip up your code and email it to me? matt@crewai.com