This works without issues.
Found a way to fix this: set a random/dummy value for the OPENAI_API_KEY environment variable and use the ChatOpenAI class instead of the LLM class. It worked fine for me locally with the Ollama model.
import os
from langchain_openai import ChatOpenAI

# Any non-empty key satisfies the client's check; Ollama ignores it.
os.environ["OPENAI_API_KEY"] = "testapikey"

llm_local = ChatOpenAI(
    model="ollama/deepseek-r1:latest",
    base_url="http://localhost:11434",
)
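For completeness, here is a minimal sketch of calling the model once it is configured (assuming the langchain_openai package, whose ChatOpenAI exposes invoke() and returns a message with a .content attribute):

# Quick smoke test: send one prompt to the local Ollama model.
response = llm_local.invoke("Say hello in one sentence.")
print(response.content)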