I am new to CrewAI, and I would like to know if setting the API key as an environment variable is mandatory. I am using OpenAI models through a proxy, which requires a client ID and secret. I am generating an LLM instance and passing it to the agents.
However, when I run the app, I encounter the following error:
OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
This is my implementation:
from gen_ai_hub.proxy.langchain.openai import ChatOpenAI
from gen_ai_hub.proxy.core.proxy_clients import get_proxy_client
The problem is that I am accessing the model through a proxy client provided by my company, so I do not have direct access to the API key. I am only using my credentials and specifying the model I want.
For that I am setting these environment variables: AICORE_CLIENT_ID, AICORE_CLIENT_SECRET, AICORE_BASE_URL.
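For completeness, this is roughly how I export them before launching the app (values are placeholders; substitute the credentials from your AI Core service key):

```shell
# Placeholder values -- substitute the credentials from your AI Core service key
export AICORE_CLIENT_ID="<client-id>"
export AICORE_CLIENT_SECRET="<client-secret>"
export AICORE_BASE_URL="<base-url>"
```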
My question is: can I somehow avoid setting the OpenAI API key, since I am not using OpenAI directly?
My suggestion would be to get this handshake between the proxy and OpenAI working in a very small test script before trying to integrate with CrewAI. I use a small test script that passes the credentials and then simply asks the LLM to tell me a joke. Then you know everything is working, and you have a new joke to tell. Good luck!
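Something like this is what I mean (a sketch only; the `proxy_model_name`/`proxy_client` parameters and the `"gen-ai-hub"` client name are assumptions about the gen_ai_hub SDK, so check the docs for your SDK version):

```python
def make_proxy_llm(model_name: str = "gpt-4o"):
    """Build a ChatOpenAI instance routed through the SAP AI Core proxy.

    Assumes AICORE_CLIENT_ID, AICORE_CLIENT_SECRET and AICORE_BASE_URL are
    already exported; the gen_ai_hub parameter names used here are
    assumptions -- verify them against your SDK version.
    """
    from gen_ai_hub.proxy.langchain.openai import ChatOpenAI
    from gen_ai_hub.proxy.core.proxy_clients import get_proxy_client

    proxy_client = get_proxy_client("gen-ai-hub")
    return ChatOpenAI(proxy_model_name=model_name, proxy_client=proxy_client)

if __name__ == "__main__":
    # Smoke test: if this prints a joke, the proxy handshake works.
    llm = make_proxy_llm()
    print(llm.invoke("Tell me a joke.").content)
```

Once this runs cleanly on its own, you know any remaining error comes from the CrewAI integration rather than the proxy setup.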
I am already using this in a larger framework, where one agent calls other agents through custom tools. All agents are using the same LLM through the proxy, so it works without CrewAI. However, CrewAI expects the API key, which I unfortunately do not have.
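In case it helps anyone else: the workaround I am going to try is exporting a placeholder key before CrewAI starts, since the key only seems to be checked at client startup and (I am assuming) is never actually sent through my proxy:

```python
import os

# CrewAI's OpenAI client refuses to start without OPENAI_API_KEY, even when
# requests actually go through a custom LLM object. Assumption: the proxy
# never sees this value, so a placeholder satisfies the startup check.
os.environ["OPENAI_API_KEY"] = "placeholder-not-used"

# from crewai import Agent  # agents then receive llm=<the proxy-backed instance>
```

Whether this is safe depends on the proxy really ignoring the key, so it is worth verifying with the small standalone test first.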