Hi
I can't seem to find documentation on how to change the default LLM to OpenRouter project-wide. I tried the LLaMA tutorials, but I keep getting errors like this:
```
ERROR:root:Failed to get supported params: argument of type 'NoneType' is not iterable
ERROR:root:LiteLLM call failed: litellm.AuthenticationError: AuthenticationError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
```
My `.env` has (the API base URL below got mangled into a link preview when I pasted it, and the key is truncated):

```
OPENAI_API_BASE=https://openrouter.ai/api/v1
OPENAI_MODEL_NAME=mistralai/mistral-7b-instruct
OPENROUTER_API_KEY=sk-or-v1-8(and some more numbers)0
```
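In case it helps, this is a minimal sketch of what I'm trying to get working. I'm assuming LiteLLM's `openrouter/` provider prefix here (so it reads `OPENROUTER_API_KEY` instead of expecting `OPENAI_API_KEY`); the model name is the one from my `.env`, and the key value is a placeholder:

```python
import os

# Placeholder key for illustration; the real one comes from my .env
os.environ["OPENROUTER_API_KEY"] = "sk-or-v1-..."

# LiteLLM routes a call to OpenRouter when the model string
# carries the "openrouter/" provider prefix
model = "openrouter/" + "mistralai/mistral-7b-instruct"
print(model)

# The actual call would then look like this (commented out,
# since it needs a valid key and makes a network request):
# import litellm
# response = litellm.completion(
#     model=model,
#     messages=[{"role": "user", "content": "Hello"}],
# )
```

Is that prefix the right way to point the whole project at OpenRouter, or is there a project-level config I'm missing?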