Hi,
I am using crewai version 0.65.2 and wanted to test the o1 model. Using Azure AI Studio, I deployed an o1 model named o1-preview-standard and a gpt-4o model named gpt-4o-standard.
Here is a complete example:
from __future__ import annotations

# .env variables setting:
# AZURE_API_BASE=https://<my_endpoint>.openai.azure.com/
# AZURE_API_KEY=<my_key>

from crewai import Agent, Crew, Task
from crewai_tools import CodeInterpreterTool

code_interpreter_tool = CodeInterpreterTool()


def main():
    agent = Agent(
        role="Best Mathematician Agent",
        goal="Calculate the Fibonacci number.",
        backstory="""
        You are an expert mathematician.
        You are trying to calculate the Fibonacci number.
        """,
        tools=[code_interpreter_tool],
        verbose=True,
        llm="azure/o1-preview-standard",
    )
    task = Task(
        name="Calculate Fibonacci Number",
        description="""
        Calculate the Fibonacci number for n={n} by generating Python code for it and running it with your tools.
        """,
        expected_output="The Fibonacci number integer value.",
        agent=agent,
    )
    crew = Crew(
        tasks=[task],
        agents=[agent],
        verbose=True,
    )
    crew_result = crew.kickoff(
        inputs={
            "n": "15",
        }
    )
    print(crew_result.raw)


if __name__ == "__main__":
    main()
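Note that the script assumes AZURE_API_BASE and AZURE_API_KEY are already exported in the shell; if they only live in the .env file, they have to be loaded first, for example with python-dotenv (my assumption, not part of the original script):

from dotenv import load_dotenv

load_dotenv()  # copies AZURE_API_BASE / AZURE_API_KEY from .env into os.environ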
When running it, I received the following error:
Exception has occurred: BadRequestError
litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}
httpx.HTTPStatusError: Client error '400 model_error' for url 'https://<my_endpoint>.openai.azure.com//openai/deployments/o1-preview-standard/chat/completions?api-version=2024-08-01-preview'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
During handling of the above exception, another exception occurred:
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}
During handling of the above exception, another exception occurred:
litellm.llms.AzureOpenAI.azure.AzureOpenAIError: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}
During handling of the above exception, another exception occurred:
litellm.exceptions.BadRequestError: litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}
[The same chain of httpx, openai, and litellm exceptions repeats twice more.]
During handling of the above exception, another exception occurred:
File "/Users/Code/test/code_interpreter_tool.py", line 38, in main
crew_result = crew.kickoff(
^^^^^^^^^^^^^
File "/Users/Code/test/code_interpreter_tool.py", line 48, in <module>
main()
litellm.exceptions.BadRequestError: litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}
And the console prints:
2024-11-19 20:27:55,932 - 8390817856 - llm.py-llm:161 - ERROR: LiteLLM call failed: litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}
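To check whether crewai is involved at all, the same request can be sent through litellm directly. This is a minimal sketch assuming the same deployment and the AZURE_API_BASE / AZURE_API_KEY variables from above; the expectation is that the user-role call succeeds while the system-role call raises the same 400:

import litellm

# Works: o1-preview accepts user-role messages.
response = litellm.completion(
    model="azure/o1-preview-standard",
    messages=[{"role": "user", "content": "What is the 15th Fibonacci number?"}],
)
print(response.choices[0].message.content)

# Raises the same 400: o1-preview rejects the 'system' role,
# which crewai uses for the agent's instructions.
litellm.completion(
    model="azure/o1-preview-standard",
    messages=[
        {"role": "system", "content": "You are an expert mathematician."},
        {"role": "user", "content": "What is the 15th Fibonacci number?"},
    ],
)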
Things I've tried:
- Upgrading to crewai version 0.79.4: the same error occurred.
- Changing the llm on my agent from "azure/o1-preview-standard" to "azure/gpt-4o-standard": this worked perfectly, and I received the answer 610 (see the message-rewrite sketch below).
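For reference, the kind of fix I would expect somewhere in the crewai/litellm call path is a message rewrite that folds system messages into the user role for models that reject them. This is only an illustrative sketch of the idea, not an existing crewai or litellm API:

def demote_system_messages(messages: list[dict]) -> list[dict]:
    # Hypothetical helper: rewrite 'system' messages as 'user' messages
    # for models (like o1-preview) that reject the system role.
    return [
        {**m, "role": "user"} if m.get("role") == "system" else m
        for m in messages
    ]

# e.g. litellm.completion(model="azure/o1-preview-standard",
#                         messages=demote_system_messages(messages))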
This seems to be a problem with how crewai configures litellm. Is it a bug, or am I doing something wrong?