```
2024-10-20 10:13:45,927 - 6216495104 - llm.py-llm:161 - ERROR: LiteLLM call failed: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=model='claude-3-haiku-20240307' anthropic_api_url='https://api.anthropic.com' anthropic_api_key=SecretStr('**********') model_kwargs={}
Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..) Learn more: https://docs.litellm.ai/docs/providers
```
```python
from langchain_anthropic import ChatAnthropic

# This triggers the error above when used with CrewAI: the model string
# carries no provider prefix, so LiteLLM cannot resolve the provider.
self.llm = ChatAnthropic(
    model="claude-3-haiku-20240307"
)
```
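The error comes from LiteLLM's routing layer: without a provider prefix it cannot tell which API to call. A minimal sketch of the format it expects, assuming the `litellm` package is installed and `ANTHROPIC_API_KEY` is set in your environment:

```python
import litellm

# The "anthropic/" prefix tells LiteLLM which provider to route the
# request to, which is exactly what the error above says is missing.
response = litellm.completion(
    model="anthropic/claude-3-haiku-20240307",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```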
@bhavya_giri If you encounter this error, follow these two steps:

1. Use the CrewAI `LLM` class, which leverages LiteLLM in the background.
2. Make sure to set the LLM provider prefix before configuring the LLM. For Anthropic, use `anthropic/<model name>` (see the sketch below). If you're unsure how to do this for a specific LLM provider, refer to the LiteLLM Providers page for guidance.
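Putting the two steps together, a minimal sketch, assuming `claude-3-haiku-20240307` is the model you want and that your Anthropic key is available:

```python
import os

from crewai import LLM, Agent

# Assumption: replace the placeholder with your real key,
# or export ANTHROPIC_API_KEY in your shell instead.
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."

# The "anthropic/" prefix lets LiteLLM resolve the provider,
# so the BadRequestError above no longer occurs.
llm = LLM(model="anthropic/claude-3-haiku-20240307")

# Attach the configured LLM to an agent as usual.
researcher = Agent(
    role="Researcher",
    goal="Answer questions concisely",
    backstory="A focused research assistant.",
    llm=llm,
)
```

The same pattern applies to other providers; only the prefix changes (e.g. `openai/...`, `huggingface/...`).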