litellm.exceptions.BadRequestError: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=sabia-3

Hello Crew!

I’m trying to use Maritaca’s sabia-3 model, but I keep getting the 'LLM Provider NOT provided' error.
I can use Maritaca’s model through the OpenAI API directly, so I don’t think the model itself is the problem when using it with crewai.

.env file:

OPENAI_API_KEY="***"
OPENAI_API_BASE="https://chat.maritaca.ai/api/"
OPENAI_MODEL_NAME="sabia-3"

What am I missing?

I was getting the same error.
I used these docs: LiteLLM - Getting Started | liteLLM
First, check that the provider exists in LiteLLM and that it offers that model. If it does, you should indicate it this way:
OPENAI_MODEL_NAME="openai/gpt-3.5-turbo"

Start with the provider’s name
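
Applied to the .env from the original question, the fix would look like this (a sketch, assuming the `openai/` prefix routes sabia-3 through LiteLLM's OpenAI-compatible provider; the key and base URL are unchanged from the question):

OPENAI_API_KEY="***"
OPENAI_API_BASE="https://chat.maritaca.ai/api/"
# Provider prefix tells LiteLLM which backend to use for this model name
OPENAI_MODEL_NAME="openai/sabia-3"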
