Well, I am back to an old topic. I have removed any use of JSON to try and simplify the problem. I use GPT-4o-mini for the general `llm=`, and `function_calling_llm=` points to a model named `function` running on LM Studio.
CrewAI never calls `function`. Here is how it is defined:
```python
@llm
def function_llm(self):
    return LLM(model="openai/function", temperature=0.1, base_url="http://127.0.0.1:1234/v1")
```
Here is how it is used for the agent:
```python
@agent
def researcher(self) -> Agent:
    return Agent(
        config=self.agents_config['researcher'],
        llm=self.gpt4omini(),
        function_calling_llm=self.function_llm(),
    )
```
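For completeness, here is the whole class stripped down to a minimal sketch of how I have it wired. The `api_key` placeholder, the config file paths, and the task/crew names are my assumptions about what the decorators expect; the rest matches the snippets above.

```python
from crewai import Agent, Crew, LLM, Process, Task
from crewai.project import CrewBase, agent, crew, llm, task


@CrewBase
class ResearchCrew:
    """Minimal sketch of the setup described above (names and paths assumed)."""

    agents_config = "config/agents.yaml"  # assumed default location
    tasks_config = "config/tasks.yaml"    # assumed default location

    @llm
    def gpt4omini(self):
        # General-purpose LLM (reads OPENAI_API_KEY from the environment)
        return LLM(model="gpt-4o-mini", temperature=0.1)

    @llm
    def function_llm(self):
        # Local model served by LM Studio's OpenAI-compatible endpoint;
        # the api_key value is a placeholder, since LM Studio ignores it
        return LLM(
            model="openai/function",
            temperature=0.1,
            base_url="http://127.0.0.1:1234/v1",
            api_key="lm-studio",
        )

    @agent
    def researcher(self) -> Agent:
        return Agent(
            config=self.agents_config["researcher"],
            llm=self.gpt4omini(),
            function_calling_llm=self.function_llm(),
        )

    @task
    def research_task(self) -> Task:
        # "research_task" is a placeholder key for whatever is in tasks.yaml
        return Task(config=self.tasks_config["research_task"])

    @crew
    def crew(self) -> Crew:
        return Crew(
            agents=self.agents,
            tasks=self.tasks,
            process=Process.sequential,
            verbose=True,
        )
```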
LM Studio itself works fine; I tested it previously by using it as the general `llm=`.
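For what it is worth, that earlier test was simply pointing the agent's general `llm=` at the same LM Studio LLM object, and the agent ran and produced output:

```python
@agent
def researcher(self) -> Agent:
    # Sanity check: with the LM Studio model as the *general* LLM,
    # the agent runs fine, so the endpoint itself is reachable
    return Agent(
        config=self.agents_config["researcher"],
        llm=self.function_llm(),
    )
```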
Why is CrewAI not recognizing an open-source function-calling LLM?