Recommendations for running custom tools with local Ollama models (with function-calling capabilities)

Indeed, qwen2.5:14b-instruct is able to invoke my custom tools.
Here is the LLM initialisation that was used by the agent:

from crewai import LLM

# Point CrewAI's LLM wrapper at the local Ollama server
qwen_llm = LLM(
    model="ollama_chat/qwen2.5:14b-instruct",
    base_url="http://localhost:11434"
)
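
For reference, here is a minimal sketch of how that LLM can be wired to an agent with a custom tool. The tool, agent role, and task below are illustrative placeholders, and it assumes a recent CrewAI release where BaseTool is importable from crewai.tools (older releases expose it via crewai_tools):

from crewai import Agent, Task, Crew, LLM
from crewai.tools import BaseTool


class WordCountTool(BaseTool):
    # Hypothetical example tool: counts the words in a piece of text
    name: str = "word_counter"
    description: str = "Counts the number of words in the given text."

    def _run(self, text: str) -> str:
        return f"The text contains {len(text.split())} words."


qwen_llm = LLM(
    model="ollama_chat/qwen2.5:14b-instruct",
    base_url="http://localhost:11434"
)

# The agent gets the local model plus the custom tool
analyst = Agent(
    role="Text analyst",
    goal="Answer questions about a given text using the available tools",
    backstory="You analyse text with the tools provided.",
    tools=[WordCountTool()],
    llm=qwen_llm,
    verbose=True,
)

task = Task(
    description="How many words are in the sentence 'local models can call tools too'?",
    expected_output="A single sentence stating the word count.",
    agent=analyst,
)

crew = Crew(agents=[analyst], tasks=[task])
result = crew.kickoff()
print(result)

With verbose=True you can watch the agent decide to call the tool and feed its output back into the final answer, which is a quick way to confirm that the local model's function calling is actually being exercised.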