To get this working, I have set the following environment variables:
AICORE_CLIENT_ID
AICORE_CLIENT_SECRET
AICORE_BASE_URL
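For reference, this is roughly how I set them before running the code (the values below are placeholders standing in for the credentials from my SAP AI Core service key):

```python
import os

# Placeholder values; the real ones come from the SAP AI Core service key.
os.environ["AICORE_CLIENT_ID"] = "<client-id>"
os.environ["AICORE_CLIENT_SECRET"] = "<client-secret>"
os.environ["AICORE_BASE_URL"] = "https://<ai-core-api-host>/v2"
```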
However, when I run the code, I get an error stating that the OpenAI API key is missing. I do not have any API key since I do not have access to the OpenAI API directly.
Error: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
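As far as I can tell, this is the same message the plain OpenAI Python client raises when it is constructed without a key, so something in the chain still seems to instantiate a regular OpenAI client instead of the proxy client. A minimal sketch that reproduces the error, assuming OPENAI_API_KEY is not set in the environment:

```python
from openai import OpenAI

# With OPENAI_API_KEY unset, this raises the same OpenAIError as above:
# "The api_key client option must be set either by passing api_key to
#  the client or by setting the OPENAI_API_KEY environment variable"
client = OpenAI()
```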
Is there a way to avoid this error while using the proxy client, or am I possibly missing a configuration step? Any guidance would be greatly appreciated!
Hi,
I ran into the same issue. While I wasn't able to fix the root cause (CrewAI expects an OpenAI API key even when you want to use a custom LLM), I did find a workaround.
CrewAI defines an abstract class called BaseLLM, which outlines the methods required for any custom LLM integration. To plug in your own LLM, you only need to implement the call method, making sure it accepts the expected arguments and returns the model's response. Here's a minimal example: