litellm.BadRequestError: OpenAIException - Unsupported parameter: 'stop' is not supported with this model

When I try to run my crew, I get this error:

litellm.BadRequestError: OpenAIException - Unsupported parameter: 'stop' is not supported with this model.

I'm using the o3 model from the OpenAI provider, and I have defined the LLM model in the .env file:

MODEL=o3-2025-04-16
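
For context, a stripped-down version of my setup looks roughly like the sketch below. The agent, task, and explicit LLM object are placeholders for illustration (my real crew picks the model up from .env), but the error is the same either way:

```python
# Minimal reproduction sketch (placeholder agent/task wording, not my real crew).
# With MODEL=o3-2025-04-16 in .env, or an explicit LLM as below, kickoff() raises
# the BadRequestError because a `stop` parameter is sent to a model that rejects it.
from crewai import Agent, Task, Crew, LLM

llm = LLM(model="o3-2025-04-16")  # same model as the .env entry above

agent = Agent(
    role="Researcher",
    goal="Answer a simple question",
    backstory="Placeholder agent used only to reproduce the error",
    llm=llm,
)

task = Task(
    description="Say hello",
    expected_output="A short greeting",
    agent=agent,
)

crew = Crew(agents=[agent], tasks=[task])
crew.kickoff()  # -> litellm.BadRequestError: 'stop' is not supported with this model
```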

I have also tried the workaround suggested by other users: commenting out the "stop": self.stop, line in the crewai/llm.py file.

This kind of monkey patching is not viable once we move our agents into production.
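
A less invasive interim option might be litellm's global drop_params flag, set before the crew starts. I have not confirmed that it actually strips 'stop' for o3 (that depends on litellm's parameter mapping for this model), so treat this as a sketch rather than a fix:

```python
# Hedged sketch of a possible interim workaround: ask litellm to silently drop
# parameters the target model does not support, instead of editing crewai/llm.py.
# Whether this covers 'stop' for o3 depends on litellm's parameter mapping,
# so this is untested guidance, not a confirmed fix.
import litellm

litellm.drop_params = True  # global flag; must be set before the crew runs

# ... build and kick off the crew as usual after this point
```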

Kindly look into this issue and resolve it.