I am having issues using LLMs other than the default OpenAI models with my crews. I am new to all this and have been following the official documentation (Quickstart - CrewAI) as well as the LiteLLM docs to work out which parameters I need to pass to the LLM instance.
My issue is that when I run `python src/crewai/main.py`, nothing happens; not even an error is thrown. I have also run it with verbose enabled and there are no error messages in the logs.
I have switched to Ollama and I can interact with it manually, but when I run it as part of my crew it behaves the same way as the Cohere LLM did.
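In case it helps with debugging, this is the quick check I have been running to confirm the Ollama server itself is reachable and the model is pulled (a minimal sketch, assuming the default port 11434; `check_ollama` is just an illustrative helper name):

```python
import json
import urllib.request
import urllib.error

OLLAMA_BASE = "http://localhost:11434"  # default Ollama port (assumption)

def check_ollama(base: str = OLLAMA_BASE) -> list[str]:
    """Return the locally pulled model names, or [] if the server is unreachable."""
    try:
        # /api/tags lists models that are available locally
        with urllib.request.urlopen(f"{base}/api/tags", timeout=5) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return []

models = check_ollama()
print("Ollama reachable:", bool(models), "| models:", models)
```

When the server is up this prints the pulled models, so I can confirm `llama3.2:3b` is actually available before blaming the crew config.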
Not sure if it’s my machine (an old MacBook Air). I’m on crewai version 0.86.0 and Python 3.11.9.
I created the crew by running `crewai create crew crewai_firstcrew`.
From the resulting files, I updated crew.py as follows (my changes are marked with `# added` comments):

```python
from crewai import Agent, Crew, Process, Task, LLM
from crewai.project import CrewBase, agent, crew, task
from dotenv import load_dotenv

load_dotenv()


@CrewBase
class CrewaiFirstcrew():
    """CrewaiFirstcrew crew"""

    agents_config = 'config/agents.yaml'
    tasks_config = 'config/tasks.yaml'

    ollama_llm = LLM(                      # added
        model="ollama/llama3.2:3b",        # added
        api_base="http://localhost:11434"  # added
    )

    @agent
    def researcher(self) -> Agent:
        return Agent(
            config=self.agents_config['researcher'],
            verbose=True,
            llm=self.ollama_llm  # added
        )
```
Hoping someone is able to give some direction as to how to get this working or point out any misconfigurations I might have.