Is the OpenAI model o3-mini working with CrewAI?

Hello everyone,

I am working on a project with CrewAI and trying to access the o3-mini model through the OpenAI API. I’ve set OPENAI_MODEL_NAME=o3-mini in my .env file, but I keep getting the following error:

OpenAIException - Error code: 404 - {'error': {'message': 'The model `o3-mini` does not exist or you do not have access to it.'}}

However, when I switch to gpt-4 or gpt-3.5-turbo, everything works perfectly.
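For reference, here’s a minimal sketch of what I’m running (the agent and task are placeholders, not my real project):

from crewai import Agent, Task, Crew

# .env contains OPENAI_MODEL_NAME=o3-mini and my OPENAI_API_KEY
researcher = Agent(
    role="Researcher",
    goal="Answer a simple question",
    backstory="A minimal test agent",
)
hello_task = Task(
    description="Say hello.",
    expected_output="A short greeting.",
    agent=researcher,
)
crew = Crew(agents=[researcher], tasks=[hello_task])
crew.kickoff()  # raises the 404 above with o3-mini; fine with gpt-4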

Could anyone share their experience with this?

Thanks in advance for your help!


I had the same issue today. I also couldn’t find anything on the website.

This works for me… I needed to remove the temperature parameter to get it working.

I use the following in my crew:

from crewai import LLM

llm = LLM(model="gpt-4o")
llm_mini = LLM(model="o3-mini")

and set the llm on the agent:

return Agent(
    config=self.agents_config['researcher'],
    verbose=True,
    allow_delegation=False,  # this will drive the rest of the agents
    max_iter=20,
    llm=llm_mini,
)
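For completeness, in the standard CrewAI project scaffold that return Agent(...) lives in a @CrewBase class and the crew picks it up automatically. A sketch of the generated template (class and method names are placeholders):

from crewai import Agent, Crew, Process
from crewai.project import CrewBase, agent, crew

@CrewBase
class ResearchCrew:
    @agent
    def researcher(self) -> Agent:
        # the return Agent(..., llm=llm_mini) shown above goes here
        ...

    @crew
    def crew(self) -> Crew:
        return Crew(
            agents=self.agents,  # gathered from the @agent methods
            tasks=self.tasks,    # gathered from the @task methods
            process=Process.sequential,
            verbose=True,
        )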

I am running crewai version 0.100.0.

We’re adding this in the next few hours.

Save the model as MODEL=o3-mini in .env.
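So your .env would look something like this (the API key value is a placeholder):

MODEL=o3-mini
OPENAI_API_KEY=<your-api-key>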


Thanks for working on this; an update:
I ran
crewai update
and replaced the model name with OPENAI_MODEL_NAME=o3-mini (instead of gpt-4o, which works well).
I restarted the terminal but still see the error:
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.

ERROR:root:LiteLLM call failed: litellm.NotFoundError: OpenAIException - Error code: 404 - {'error': {'message': 'The model `o3-mini` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

Can you guide me? Is there anything I’m doing incorrectly?


Can you first confirm that you have access to the model in your OpenAI API account?

Also, change the variable name from OPENAI_MODEL_NAME=o3-mini to MODEL=o3-mini.

If it’s still not working, please share a link to your codebase repo.


I’m having the same issue today…

Mind sharing your code and the error?

For those of you running into the error:

Unsupported parameter: 'stop' is not supported with this model

This error’s been popping up for both the o3 and o4-mini models. Give this a whirl for declaring your LLM, following this recommendation:

from crewai import LLM, Agent

o4_mini_llm = LLM(
    model="o4-mini",  # assumed here; use the reasoning model you're calling
    # [...] other LLM configuration parameters
    additional_drop_params=["stop"] # 👈
)

test_agent = Agent(
    # [...] other Agent configuration parameters
    llm=o4_mini_llm
)
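If I understand the mechanics right, this helps because the o-series reasoning models reject the stop parameter, and additional_drop_params tells the underlying LiteLLM call to strip it before the request goes out.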

I ran into this… Looks like the bug is being worked on: [BUG] o3 model support (not o3-mini) · Issue #2738 · crewAIInc/crewAI · GitHub

Hey @Tony_Wood, so issue #2738 actually got closed once that native LiteLLM solution we were talking about came up. I’m just waiting on this test with the o3 model to move along, so we can see for sure whether that additional_drop_params=["stop"] tweak is actually enough.

Hi @Max_Moura, I just tested with the 0.119 release and it’s still a problem. Do you get the same?

litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "Unsupported parameter: 'stop' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'stop', 'code': 'unsupported_parameter'}}   
An error occurred while running the flow: Command '['uv', 'run', 'kickoff']' returned non-zero exit status 1.