Azure O1 model causes crew.kickoff() error: BadRequestError: litellm.BadRequestError: AzureException - Error code: 400 - "Unsupported value: 'messages[0].role' does not support 'system' with this model."

Hi,
I am using crewai version 0.65.2 and wanted to test the o1 model.
Using Azure AI Studio, I deployed an o1 model called:
o1-preview-standard
and also deployed a gpt-4o model called:
gpt-4o-standard

Here is a complete example:

from __future__ import annotations

# .env variables:
# AZURE_API_BASE=https://<my_endpoint>.openai.azure.com/
# AZURE_API_KEY=<my_key>

from crewai import Agent, Crew, Task
from crewai_tools import CodeInterpreterTool

code_interpreter_tool = CodeInterpreterTool()


def main():
    agent = Agent(
        role="Best Mathematician Agent",
        goal="Calculate the Fibonacci number.",
        backstory="""
        You are an expert mathematician.
        You are trying to calculate the Fibonacci number.
        """,
        tools=[code_interpreter_tool],
        verbose=True,
        llm="azure/o1-preview-standard",
    )

    task = Task(
        name="Calculate Fibonacci Number",
        description="""
        Calculate the Fibonacci number for n={n} by generating Python code for it and running it with your tools.
        """,
        expected_output="The Fibonacci number integer value.",
        agent=agent,
    )

    crew = Crew(
        tasks=[task],
        agents=[agent],
        verbose=True,
    )

    crew_result = crew.kickoff(
        inputs={
            "n": "15",
        }
    )

    print(crew_result.raw)


if __name__ == "__main__":
    main()

When running it, I received the following error:

Exception has occurred: BadRequestError
litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}
httpx.HTTPStatusError: Client error '400 model_error' for url 'https://<my_endpoint>.openai.azure.com//openai/deployments/o1-preview-standard/chat/completions?api-version=2024-08-01-preview'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400

During handling of the above exception, another exception occurred:

openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}

During handling of the above exception, another exception occurred:

litellm.llms.AzureOpenAI.azure.AzureOpenAIError: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}

During handling of the above exception, another exception occurred:

litellm.exceptions.BadRequestError: litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}

During handling of the above exception, another exception occurred:

  File "/Users/Code/test/code_interpreter_tool.py", line 38, in main
    crew_result = crew.kickoff(
                  ^^^^^^^^^^^^^
  File "/Users/Code/test/code_interpreter_tool.py", line 48, in <module>
    main()
litellm.exceptions.BadRequestError: litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}

And the console prints:

2024-11-19 20:27:55,932 - 8390817856 - llm.py-llm:161 - ERROR: LiteLLM call failed: litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}

Things I’ve tried:

  1. Upgrading to crewai version 0.79.4; the same error occurred.

  2. Changing the llm on my agent from "azure/o1-preview-standard" to "azure/gpt-4o-standard". This worked perfectly; I received the answer 610.

This seems to be a problem with how crewai configures litellm. Is it a bug, or am I doing something wrong?


@zorx Please edit the question and add the code.

The OpenAI o1 model doesn’t support the system message. If you didn’t add it and got this error, then it’s a bug in the CrewAI SDK. Otherwise, remove it.

Wrong:

from litellm import completion 

response = completion(
    model="o1-mini",
    messages=[
        { "role": "system", "content": "You're a helpful assistant." },
        { "role": "user", "content": "Hey, how's it going?" },
    ],
)

print(response)

Correct:

from litellm import completion 

response = completion(
    model="o1-mini",
    messages=[
        { "role": "user", "content": "Hey, how's it going?" },
    ],
)

print(response)
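
If a framework injects the system message for you, a generic workaround is to demote system messages to user messages before the call. Here is a minimal sketch of that technique, assuming the model only rejects the role itself (the demote_system_messages helper is hypothetical, not a LiteLLM API):

from litellm import completion

def demote_system_messages(messages):
    # Rewrite any 'system' message as a 'user' message so models that
    # reject the system role still receive the same instructions.
    return [
        {**m, "role": "user"} if m.get("role") == "system" else m
        for m in messages
    ]

messages = [
    { "role": "system", "content": "You're a helpful assistant." },
    { "role": "user", "content": "Hey, how's it going?" },
]

response = completion(
    model="o1-mini",
    messages=demote_system_messages(messages),
)

print(response)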

Sorry, I've now added a complete example that reproduces the error, along with the full error output.
Note that I deployed my models using Azure AI Studio, so to fully reproduce the example you must deploy in Azure as well.

I am not using completion and messages directly; I am using Crew, Agent, and Task.

This is totally expected.

Have you tried using the CrewAI LLM class? Does the error still persist?

Setting

llm="o1-mini"

in the Agent and

# .env
OPENAI_API_KEY=<my_key>

works without the error, returning 610.

But I want to use the Azure AI Studio deployment, not OpenAI directly.

I meant using the LLM class as follows:

from crewai import LLM

my_llm = LLM(
    ...
)

agent = Agent(
    llm=my_llm,
    ...
)

As far as I know, you can use Azure AI Studio because the CrewAI LLM class leverages LiteLLM in the background. See the LiteLLM docs on how to use Azure AI Studio.
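
For example, something along these lines (a sketch based on the LiteLLM Azure docs; the api_version value here is an assumption taken from the error URL, so use whatever your deployment expects):

from crewai import LLM

# Assumes AZURE_API_BASE and AZURE_API_KEY are set in the environment,
# as in the original example. The "azure/<deployment_name>" prefix tells
# LiteLLM to route the call through the Azure OpenAI API.
my_llm = LLM(
    model="azure/o1-preview-standard",
    api_version="2024-08-01-preview",
)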

Maybe @tonykipkemboi can elaborate on that issue?

Thanks, I tried setting the following LLM on my Agent:

from crewai import LLM

my_llm = LLM(model="azure/o1-preview-standard")

agent = Agent(
    llm=my_llm,
    ...
)

But the same error I initially posted occurred.
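
To narrow it down, one could call LiteLLM directly against the same deployment with only a user message; if that succeeds while the Crew run fails, the 'system' message is being injected by CrewAI. A sketch, using the same env vars as the original example:

from litellm import completion

# Uses the same AZURE_API_BASE / AZURE_API_KEY environment variables as
# the original example. Only a 'user' message is sent, which o1 accepts.
response = completion(
    model="azure/o1-preview-standard",
    messages=[{ "role": "user", "content": "What is the 15th Fibonacci number?" }],
)

print(response)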

We are trying to do the same thing and getting the same error. It looks like CrewAI configures the completion call behind the scenes and adds a message role that the o1 models' API doesn't accept.
Will this problem be solved in the future?
Can you officially state that Azure o1-mini and o1-preview are supported?

@zorx @Andrea_Terzani I'll let the CrewAI staff know about the issue and get back to you.
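
In the meantime, an untested stopgap might be to subclass the CrewAI LLM class and demote system messages before the underlying LiteLLM call. This sketch assumes your installed version's LLM.call receives the message list as its first argument, so verify the signature first:

from crewai import LLM

class O1CompatLLM(LLM):
    # Hypothetical workaround, not an official CrewAI API: rewrite
    # 'system' messages as 'user' messages before delegating upward.
    def call(self, messages, *args, **kwargs):
        if isinstance(messages, list):
            messages = [
                {**m, "role": "user"} if m.get("role") == "system" else m
                for m in messages
            ]
        return super().call(messages, *args, **kwargs)

my_llm = O1CompatLLM(model="azure/o1-preview-standard")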

Thanks, please refer to this issue on GitHub: [BUG] Azure O1 model causes crew.kickoff() error: BadRequestError: litellm.BadRequestError: AzureException - Error code: 400 - “Unsupported value: ‘messages[0].role’ does not support ‘system’ with this model.” · Issue #1633 · crewAIInc/crewAI · GitHub