researcher = Agent(
    role="{topic} Senior Data Researcher",
    goal="Uncover cutting-edge developments in {topic}",
    backstory=(
        """You're a seasoned researcher with a knack for uncovering the latest
        developments in {topic}. Known for your ability to find the most relevant
        information and present it in a clear and concise manner."""
    ),
    tools=[get_research],
    llm=llm,
)
Error:
BadRequestError: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=my-model
Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..) Learn more: Providers | liteLLM
@Yash The error occurs because of how you set your LLM. The CrewAI LLM class uses LiteLLM under the hood, so the model string must also name the LLM provider.
For example, if you use the Anthropic Claude 3.5 Sonnet LLM, you need to put anthropic/ before the model name when setting the LLM.
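As a concrete sketch of the fix (the `ensure_provider_prefix` helper below is hypothetical, not part of CrewAI or LiteLLM — it just illustrates the model-string format LiteLLM expects):

```python
def ensure_provider_prefix(model: str, provider: str = "anthropic") -> str:
    """Return a LiteLLM-style model string with an explicit provider prefix.

    A bare model name like "my-model" triggers the BadRequestError above;
    "anthropic/my-model" tells LiteLLM which provider to route the call to.
    """
    return model if "/" in model else f"{provider}/{model}"

print(ensure_provider_prefix("claude-3-5-sonnet-20240620"))
# -> anthropic/claude-3-5-sonnet-20240620

# With the CrewAI LLM class this would look like:
# from crewai import LLM
# llm = LLM(model="anthropic/claude-3-5-sonnet-20240620")
```

The same pattern applies to any provider, e.g. `openai/gpt-4o` or `huggingface/starcoder`, as the error message suggests.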