Use OpenRouter as default LLM from Flow

Hi
I can’t seem to find documentation on how to change the default LLM to OpenRouter project-wide. I tried the Llama tutorials, but I keep getting errors like this:

ERROR:root:Failed to get supported params: argument of type ‘NoneType’ is not iterable

ERROR:root:LiteLLM call failed: litellm.AuthenticationError: AuthenticationError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

my .env has:

OPENAI_API_BASE=https://openrouter.ai/api/v1
OPENAI_MODEL_NAME=mistralai/mistral-7b-instruct
OPENROUTER_API_KEY=sk-or-v1-8(and some more numbers)0


In case someone comes across the same problem, the solution was:

Put the OpenRouter key in OPENAI_API_KEY (not OPENROUTER_API_KEY), and prepend “openrouter/” to the model name. Like this:

OPENAI_API_BASE=https://openrouter.ai/api/v1
OPENAI_MODEL_NAME=openrouter/mistralai/mistral-7b-instruct
OPENAI_API_KEY=(the OpenRouter API key)
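If you prefer setting these in code rather than a `.env` file, a minimal sketch of the same fix (the key value is a placeholder; the model name is the one from this thread — OpenRouter speaks the OpenAI wire protocol, which is why the OpenRouter key goes into `OPENAI_API_KEY`):

```python
import os

# OpenRouter exposes an OpenAI-compatible endpoint, so LiteLLM reads
# these OPENAI_* variables even though the key comes from OpenRouter.
os.environ["OPENAI_API_BASE"] = "https://openrouter.ai/api/v1"
os.environ["OPENAI_API_KEY"] = "sk-or-v1-..."  # placeholder OpenRouter key

# The "openrouter/" prefix tells LiteLLM to route the request to the
# OpenRouter provider; without it, LiteLLM treats the model as a plain
# OpenAI model and raises the AuthenticationError shown above.
model_name = "openrouter/mistralai/mistral-7b-instruct"
os.environ["OPENAI_MODEL_NAME"] = model_name

# Sanity check: the provider prefix must be present.
assert model_name.startswith("openrouter/")
```

Set these before constructing your Crew/Flow so LiteLLM picks them up when the first call is made.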
