Can't Use an Open-Source Model for This Function

Edit: You can read the following, but I discovered the issue is just that CrewAI can't do the JSON conversion I specified in my output formats when using open-source models. I even tried a large one, a Q6 quant of Command R+. If you remove JSON as the output format and instead use the exact same schema that is in the Pydantic class, the open-source model handles it just fine. I don't understand why this is, and the error message claiming the model "chat" does not exist is clearly nonsense.
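To illustrate the workaround, here is a rough sketch of what I mean (the task name and wording are hypothetical, not copied from my project): drop the JSON output format from the task and restate the Pydantic schema's fields directly in expected_output instead.

```yaml
# Hypothetical tasks.yaml entry: the JSON output format is removed.
# The fields from the Pydantic class are spelled out in expected_output,
# which the open-source model follows without the failing conversion step.
reporting_task:
  description: >
    Reformat the 3 PubMed articles from the researcher into a final report.
  expected_output: >
    For each article: title, authors, publication_date, abstract,
    chemicals, url, doi.
  agent: reporting_analyst
```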

Original Post:

I use only one open-source model in LM Studio, named "chat". The first agent uses "chat" with a tool to look up 3 articles on PubMed. This works correctly, and the first agent saves a preliminary JSON report on the 3 articles. The second agent (the reporting agent) is supposed to reformat the 3 articles and save a final report file. That step fails. (By the way, I do not get this error when I use gpt-4o-mini as the LLM.)

for data in context_data:
    report.add_entry(data["title"], data["authors"], data["publication_date"], data["abstract"], chemicals=data["chemicals"], url=data["url"], doi=data["doi"])

# Generate the final report
final_report = report.generate_report()
print(final_report)
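For context, the loop above assumes a report object with add_entry and generate_report methods. A minimal stdlib-only sketch of what such a class might look like (the method names come from my snippet; the implementation itself is hypothetical):

```python
import json

class PubMedReport:
    """Hypothetical report container matching the add_entry /
    generate_report calls used in the snippet above."""

    def __init__(self):
        self.entries = []

    def add_entry(self, title, authors, publication_date, abstract,
                  chemicals=None, url=None, doi=None):
        # Collect one article's fields as a plain dict.
        self.entries.append({
            "title": title,
            "authors": authors,
            "publication_date": publication_date,
            "abstract": abstract,
            "chemicals": chemicals or [],
            "url": url,
            "doi": doi,
        })

    def generate_report(self):
        # Serialize all collected entries as pretty-printed JSON.
        return json.dumps({"articles": self.entries}, indent=2)
```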



Then I get:

Failed to convert text into JSON, error: litellm.NotFoundError: OpenAIException - Error code: 404 - {'error': {'message': 'The model `chat` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}. Using raw output instead.

This is clearly wrong, as the model "chat" does exist; it is what retrieved the 3 articles in the first place. Here is how I create the model:

@llm
def chat(self):
    return LLM(model="openai/chat", temperature=0.1, top_p=0.2, base_url="http://127.0.0.1:1234/v1")


and here is how it is specified in agent 1 and agent 2:

@agent
def researcher(self) -> Agent:
    return Agent(
        config=self.agents_config['researcher'],
        llm=self.chat()
    )

@agent
def reporting_analyst(self) -> Agent:
    return Agent(
        config=self.agents_config['reporting_analyst'],
        llm=self.chat()
    )


The two agent definitions are identical. Any thoughts would be much appreciated!