raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'code': 'invalid_argument', 'message': 'the role of last message must be user or tool', 'type': 'invalid_request_error'}, 'id': 'as-kk6jtebwme'}
This seems to be an issue with the parameters passed when calling the model. Please check whether the parameters are correct, and also how LiteLLM is calling the Baidu Qianfan model.
I think this is an issue with the CrewAI SDK not handling Baidu Qianfan correctly. It's worth investigating the source code.
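Since the error says the last message's role must be "user" or "tool", one possible workaround while this is investigated is to normalize the messages list before the request is sent. A minimal sketch, assuming an OpenAI-style messages list; `normalize_messages_for_qianfan` is a hypothetical helper, not part of CrewAI or LiteLLM:

```python
def normalize_messages_for_qianfan(messages):
    """Return a copy of `messages` whose last entry has role 'user' or 'tool'.

    Baidu Qianfan rejects requests whose final message comes from any
    other role (e.g. 'assistant' or 'system'), so if the list ends with
    such a message, fold its content into a synthetic 'user' turn rather
    than dropping it and losing context.
    """
    msgs = [dict(m) for m in messages]  # shallow copy; don't mutate caller's list
    if msgs and msgs[-1].get("role") not in ("user", "tool"):
        trailing = msgs.pop()
        msgs.append({"role": "user", "content": trailing.get("content", "")})
    return msgs
```

This is only a client-side patch over the symptom; the underlying fix would belong in whichever layer (CrewAI or LiteLLM) builds the final messages list for Qianfan.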
Looks similar to this: