Is gpt-5.1-codex supported in CrewAI yet?

Hi everyone,

I’m currently trying to use the gpt-5.1-codex model with CrewAI, but I keep running into the following error:

Model gpt-5.1-codex not found: Error code: 404
This is not a chat model and thus not supported in the v1/chat/completions endpoint.
Did you mean to use v1/completions?

Thanks in advance for any guidance :raising_hands:

Doesn’t CrewAI use LiteLLM as the provider? You got a 404, and while I haven’t looked at the Codex endpoint, it sounds like a misconfiguration. As a short-term workaround you could run a LiteLLM proxy, in case something hard-coded in CrewAI is getting in the way.
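For what it’s worth, a LiteLLM proxy setup is usually just a config file plus one command. A minimal sketch, assuming the model is reachable through LiteLLM’s OpenAI provider route (the file name is arbitrary, and the port is LiteLLM’s default):

```yaml
# config.yaml — hypothetical LiteLLM proxy config
model_list:
  - model_name: gpt-5.1-codex            # name your CrewAI LLM will request
    litellm_params:
      model: openai/gpt-5.1-codex        # provider/model route LiteLLM resolves
      api_key: os.environ/OPENAI_API_KEY # read the key from the environment
```

Start it with `litellm --config config.yaml` (it serves on `http://localhost:4000` by default) and point your CrewAI `LLM` `base_url` at that address. Note this only helps if the upstream endpoint actually accepts the model; if the model genuinely isn’t a chat model, the proxy will relay the same 404.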


Thanks @michaelcizmar. Will take a look.

Hey @michaelcizmar, do you have any documentation for configuring the LiteLLM proxy?

There’s a lot of documentation about the LiteLLM proxy online. However, gpt-5.1-codex is not a traditional LLM where you provide a prompt and get a response; sorry for any confusion there. The Codex model has an SDK and an MCP server, which might be more appropriate for what you are trying to do. Have you looked into those? Use Codex with the Agents SDK
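To make the 404 concrete: the error says the model isn’t served on `v1/chat/completions`, so any client that hard-codes the chat endpoint (which is what CrewAI’s default path does via LiteLLM) will fail. A hedged sketch of what a direct call could look like if the model is exposed through OpenAI’s Responses API instead — that routing is an assumption, and actually running the call requires `pip install openai` plus an `OPENAI_API_KEY`:

```python
import os

# Assumed request shape for the Responses API ("input" instead of "messages");
# the model name is taken from the error message in this thread.
request = {
    "model": "gpt-5.1-codex",
    "input": "Write a Python function that reverses a string.",
}

# Only attempt the network call when a key is configured.
if os.environ.get("OPENAI_API_KEY"):
    from openai import OpenAI

    client = OpenAI()
    response = client.responses.create(**request)
    print(response.output_text)
```

The point is just that the endpoint, not the model name, is what the 404 is complaining about, so a proxy or config change has to alter the route, not merely the string.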

Nope. We already have a CrewAI setup, which is why I’m looking for this.