Hello, I’m struggling to go through Azure OpenAI while crewai insists on calling OpenAI. Is there a way to avoid this? I didn’t find any useful info on this. Thanks for the support.
I ran into the same problem. The solution is to set the environment variables LiteLLM expects: see Azure OpenAI | liteLLM. Then define the agent using LLM(model='azure/<model-name>').
My example looks as follows:
import os
from crewai import Agent, LLM

# API key is an environment variable
AZURE_OPENAI_API_KEY = os.getenv('AZURE_OPENAI_API_KEY')
# the remaining settings are stored in a config dict
API_VERSION = config['api-version']
AZURE_OPENAI_ENDPOINT = config['openai-endpoint']
AZURE_OPENAI_LLM_DEPLOYMENT_NAME = config['llm-deployment-name']
LLM_MODEL_NAME = config['llm-model-name']

# set the environment variables LiteLLM expects
os.environ['AZURE_API_KEY'] = AZURE_OPENAI_API_KEY
os.environ['AZURE_API_BASE'] = AZURE_OPENAI_ENDPOINT
os.environ['AZURE_API_VERSION'] = API_VERSION

# define the agent, routing it through the azure/ provider
agent = Agent(
    role='...',
    goal='...',
    ... ,
    llm=LLM(model=f'azure/{LLM_MODEL_NAME}')
)
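If it helps, the environment wiring can be wrapped in a small helper so it is easy to reuse and sanity-check before any agent runs. This is just a sketch: the config keys and the sample values (API version, endpoint, model name) are placeholders I made up, not values from your setup.

```python
import os

# Hypothetical placeholder settings -- substitute your own Azure deployment details.
config = {
    'api-version': '2024-02-01',                            # assumed API version
    'openai-endpoint': 'https://my-resource.openai.azure.com/',
    'llm-model-name': 'gpt-4o',
}

def configure_azure_env(cfg, api_key):
    """Export the variables LiteLLM reads for the azure/ provider
    and return the model string to hand to crewai's LLM."""
    os.environ['AZURE_API_KEY'] = api_key
    os.environ['AZURE_API_BASE'] = cfg['openai-endpoint']
    os.environ['AZURE_API_VERSION'] = cfg['api-version']
    return f"azure/{cfg['llm-model-name']}"

model = configure_azure_env(config, os.getenv('AZURE_OPENAI_API_KEY', 'dummy-key'))
# then: llm=LLM(model=model)
```

As far as I can tell, crewai's LLM also accepts api_key, base_url, and api_version keyword arguments that get forwarded to LiteLLM, so passing the settings directly instead of via os.environ may work too; check the CrewAI LLM docs for your version before relying on that.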
Thank you! I will try your suggestion.