litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call

2024-10-20 10:13:45,927 - 6216495104 - llm.py-llm:161 - ERROR: LiteLLM call failed: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=model='claude-3-haiku-20240307' anthropic_api_url='https://api.anthropic.com' anthropic_api_key=SecretStr('**********') model_kwargs={}
Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..) Learn more: Providers | liteLLM

from langchain_anthropic import ChatAnthropic

self.llm = ChatAnthropic(
    model="claude-3-haiku-20240307"
)

@bhavya_giri If you encounter this error, follow these two steps:

  1. Use the CrewAI LLM class, which uses LiteLLM under the hood.
  2. Prefix the model name with the LLM provider when configuring the LLM. For Anthropic, use anthropic/<model name>. If you're unsure of the correct prefix for a specific provider, refer to the LiteLLM Providers page.
import os

from crewai import Agent, LLM

my_llm = LLM(
    api_key=os.getenv("ANTHROPIC_API_KEY"),
    model="anthropic/claude-3-5-sonnet-20240620",
)

my_agent = Agent(
    ...,
    llm=my_llm,
)
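For reference, the provider prefix works because LiteLLM splits the provider off the front of the model string before routing the request. A minimal illustrative sketch of that convention (not LiteLLM's actual implementation) shows why a bare model name fails:

```python
def split_provider(model: str):
    """Illustrative only: split an optional "provider/" prefix off a model
    string, mirroring the "anthropic/<model name>" convention LiteLLM expects.
    """
    provider, sep, name = model.partition("/")
    if not sep:
        # No "provider/" prefix found -- this is the situation where
        # LiteLLM raises BadRequestError: "LLM Provider NOT provided".
        return None, model
    return provider, name

# Prefixed model string: provider is recovered cleanly.
print(split_provider("anthropic/claude-3-5-sonnet-20240620"))

# Bare model name (as in the ChatAnthropic snippet above): no provider.
print(split_provider("claude-3-haiku-20240307"))
```

This is why passing model="claude-3-haiku-20240307" on its own triggers the error, while "anthropic/claude-3-haiku-20240307" works.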
