Enablement of WatsonxLLM as the provider in crewai project setup

@Paarttipaabhalaji If you encounter this error, follow these two steps:

  1. Use the CrewAI LLM class, which uses LiteLLM under the hood.
  2. Set the provider prefix when configuring the model. For watsonx.ai, the model string takes the form watsonx/&lt;model provider&gt;/&lt;model name&gt;. If you're unsure of the exact string for a given model, refer to the LiteLLM Providers page for guidance.
import os

from crewai import Agent, LLM

my_llm = LLM(
    api_key=os.getenv("WATSONX_API_KEY"),
    model="watsonx/meta-llama/llama-3-8b-instruct",
)

my_agent = Agent(
    ...,
    llm=my_llm,
)
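To make the naming convention concrete, here is a minimal sketch that composes the LiteLLM model string from its parts. The helper name `watsonx_model_id` is hypothetical, not part of CrewAI or LiteLLM; the `watsonx/` prefix is what routes the request to the watsonx.ai provider.

import os

# Hypothetical helper: builds a LiteLLM-style model string for watsonx.ai.
# Format: watsonx/<model provider>/<model name>.
def watsonx_model_id(provider: str, model: str) -> str:
    return f"watsonx/{provider}/{model}"

model_id = watsonx_model_id("meta-llama", "llama-3-8b-instruct")
# → "watsonx/meta-llama/llama-3-8b-instruct"

The resulting string can then be passed as the `model` argument to the `LLM` constructor, as in the snippet above.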