@Paarttipaabhalaji If you encounter this error, follow these two steps:
- Use the CrewAI `LLM` class, which leverages LiteLLM in the background.
- Make sure to set the LLM provider before configuring the LLM. For watsonx.ai, use `watsonx/<LLM provider>/<LLM name>`. If you're unsure how to do this for a specific LLM provider, refer to the LiteLLM Providers page for guidance.
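For reference, the model string is just those parts joined with slashes; a tiny helper (hypothetical, purely for illustration — not part of CrewAI or LiteLLM) makes the pattern explicit:

```python
def watsonx_model_string(provider: str, name: str) -> str:
    # Compose the LiteLLM-style model id: watsonx/<LLM provider>/<LLM name>
    return f"watsonx/{provider}/{name}"

print(watsonx_model_string("meta-llama", "llama-3-8b-instruct"))
# watsonx/meta-llama/llama-3-8b-instruct
```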
```python
import os

from crewai import Agent, LLM

# Configure the watsonx.ai model through CrewAI's LLM class
# (LiteLLM handles the provider routing in the background).
my_llm = LLM(
    api_key=os.getenv("WATSONX_API_KEY"),
    model="watsonx/meta-llama/llama-3-8b-instruct",
)

my_agent = Agent(
    ...,
    llm=my_llm,
)
```