Using a different LLM instead of gpt-4o-mini

Hi All,

I am playing around with the default flow setup (the poem writer). Everything works when using the default OpenAI gpt-4o-mini as the LLM, but I want to try other providers, like Groq. I have added my API key (which I have verified is working) to my .env as GROQ_API_KEY=xai- and in agent.yaml I have set llm: groq/llama-3.2-90b-text-preview (I have tried other models as well). When I try to run the flow I see: litellm.exceptions.BadRequestError: litellm.BadRequestError: GroqException - {"error":{"message":"Invalid API Key","type":"invalid_request_error","code":"invalid_api_key"}}

I have also tried this with the Anthropic API with similar results.

I have made no changes to the code apart from adding the API key to the .env and adding the llm parameter to the agent.yaml, roughly like this:
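(Sketch of the two additions; the model name is just one of the ones I tried, and the key value is truncated here.)

```
# .env
GROQ_API_KEY=xai-        # my key, verified working on the provider's side

# agent.yaml
llm: groq/llama-3.2-90b-text-preview
```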

Any help would be appreciated.

Thanks

Rob

It was my error: I got my API key from x.ai, not from Groq… 1.5 days trying to work it out :slight_smile:
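In case it saves someone else the 1.5 days: Groq keys start with gsk_, while x.ai keys start with xai- (as mine did). A quick way to sanity-check the key outside the flow is to call litellm directly; a rough sketch, with the model name just as an example:

```python
import os
from litellm import completion

# litellm reads GROQ_API_KEY from the environment for groq/ models
os.environ.setdefault("GROQ_API_KEY", "gsk_...")  # a real Groq key starts with gsk_

response = completion(
    model="groq/llama-3.2-90b-text-preview",  # any Groq-hosted model id works here
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```

If the key is wrong you get the same invalid_api_key error, which confirms whether the problem is the key itself or the flow config.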