I’ve been trying to use a large-context OpenAI model (e.g. o3-mini), but CrewAI seems to be mishandling the model. Here’s the code:
llm = LLM(
    model="openai/o3-mini",
    temperature=0.5,
    max_completion_tokens=32768,
)
and the error:
Error: litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.", 'type': 'invalid_request_error', 'param': 'max_tokens', 'code': 'unsupported_parameter'}}
I’ve added max_completion_tokens, cleared the cache, and tried running it again 4x, but the error persists — it looks like max_tokens is still being sent upstream despite my setting.
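In case it helps narrow this down, here is a sketch of the remapping I’d expect to happen somewhere between CrewAI and LiteLLM (the helper name is mine, purely for illustration): OpenAI’s o-series reasoning models reject the legacy max_tokens parameter and require max_completion_tokens instead.

```python
def remap_reasoning_params(params):
    """Hypothetical helper: rewrite 'max_tokens' to 'max_completion_tokens'
    for OpenAI o-series models, which reject the legacy parameter."""
    params = dict(params)  # avoid mutating the caller's dict
    if "max_tokens" in params and "max_completion_tokens" not in params:
        params["max_completion_tokens"] = params.pop("max_tokens")
    else:
        # caller already set max_completion_tokens; drop the rejected key
        params.pop("max_tokens", None)
    return params

print(remap_reasoning_params({"model": "openai/o3-mini", "max_tokens": 32768}))
```

As a stopgap, LiteLLM also has a documented global flag, litellm.drop_params = True, which tells it to silently drop parameters the target model rejects — though whether that reaches the call CrewAI makes internally, I’m not sure.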