ChatOllama not working with CrewAI

I am using the code below:

from langchain_community.chat_models import ChatOllama

llm = ChatOllama(
    base_url="http://localhost:11434",
    model="qwen:7b-chat",  # or "llama3.2:latest"
    temperature=0.7,
)

Then I pass the real LLM object directly:

city_selection_agent = Agent(
    role='City Selection Expert',
    goal='Select the best city based on weather, season, and prices',
    backstory='An expert in analyzing travel data to pick ideal destinations.',
    tools=[SerperDevTool(), SeleniumScrapingTool()],
    verbose=True,
    llm=llm,  # this is a ChatOllama object, not a string
)

The agent's llm attribute has the class I expect:

agents.llm : <class 'langchain_community.chat_models.ollama.ChatOllama'>

and my project uses crewai version 0.130.0.

But I am getting the error below (traceback abbreviated):
raise e
result = original_function(*args, **kwargs)
raise exception_type(
model, custom_llm_provider, dynamic_api_key, api_base = get_llm_provider(
raise e

.BadRequestError:
)
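For context on where the error comes from: the get_llm_provider call in the traceback is a LiteLLM function, which suggests CrewAI 0.130.0 routes completions through LiteLLM rather than through a LangChain chat model object. LiteLLM identifies the backend from a "provider/model" string, so passing a ChatOllama instance may be why it raises BadRequestError. The sketch below shows the string convention; the commented crewai.LLM usage is an assumption based on CrewAI's own LLM wrapper and should be checked against the docs for your version.

```python
def to_litellm_model(provider: str, model: str) -> str:
    """Build the provider-prefixed model string that LiteLLM parses
    in get_llm_provider to pick a backend."""
    return f"{provider}/{model}"

print(to_litellm_model("ollama", "qwen:7b-chat"))  # -> ollama/qwen:7b-chat

# Hypothetical fix using CrewAI's own wrapper instead of ChatOllama
# (assumed API, verify against the CrewAI docs for 0.130.0):
#   from crewai import LLM
#   llm = LLM(model="ollama/qwen:7b-chat",
#             base_url="http://localhost:11434",
#             temperature=0.7)
```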