SageMaker endpoint w/ 0.6x+ versions of CrewAI (LiteLLM)

The newer versions of CrewAI no longer ship the LangChain model adapters – they have been replaced with LiteLLM. I have tried to adapt my setup according to the CrewAI and LiteLLM docs, but I get an error back from AWS:

HTTP 424: Dependency failure.
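For reference, here is a minimal sketch of the kind of setup I'm attempting (the endpoint name and region are placeholders; I'm assuming the `crewai.LLM` wrapper from the recent 0.6x releases and LiteLLM's `sagemaker/` model prefix):

```python
import os
from crewai import Agent, LLM

# LiteLLM picks up AWS credentials from the environment
# (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY / AWS_REGION_NAME).
os.environ.setdefault("AWS_REGION_NAME", "eu-west-1")  # placeholder region

# The "sagemaker/" prefix tells LiteLLM to call the SageMaker runtime
# for the named endpoint.
llm = LLM(
    model="sagemaker/my-llm-endpoint",  # placeholder endpoint name
    temperature=0.7,
)

agent = Agent(
    role="Researcher",
    goal="Smoke-test the SageMaker-hosted model",
    backstory="Minimal agent used only to reproduce the 424",
    llm=llm,
)
```

Running a crew with that agent is when the 424 comes back from the endpoint.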

Maybe we could suggest to the core team that they involve the community by giving us advance notice of breaking changes. At least then we can decide whether we want to update or not, @matt?

I understand that CrewAI is relatively new, but if this keeps happening I fear that people will start to abandon CrewAI for other alternatives. In terms of brand image and online visibility, threads like this one on the Discourse are not a good advert!

This is my first time using a pre-1.x library, so I don't know whether breaking changes even in minor releases are par for the course.

The AWS outages are gone but the 424s persist. Reverting to < 0.6.
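For anyone else hitting this, the fallback I'm reverting to is the LangChain adapter path that the pre-0.6 releases accepted. A sketch only – the endpoint name, region, and payload schema are placeholders and depend on the container hosting your model:

```python
import json

from langchain_community.llms import SagemakerEndpoint
from langchain_community.llms.sagemaker_endpoint import LLMContentHandler


class ContentHandler(LLMContentHandler):
    content_type = "application/json"
    accepts = "application/json"

    def transform_input(self, prompt: str, model_kwargs: dict) -> bytes:
        # Request shape depends on the model container (placeholder schema)
        return json.dumps({"inputs": prompt, "parameters": model_kwargs}).encode("utf-8")

    def transform_output(self, output: bytes) -> str:
        # Response shape also depends on the container (placeholder schema)
        return json.loads(output.read().decode("utf-8"))[0]["generated_text"]


llm = SagemakerEndpoint(
    endpoint_name="my-llm-endpoint",  # placeholder endpoint name
    region_name="eu-west-1",          # placeholder region
    content_handler=ContentHandler(),
)

# In CrewAI < 0.6 this LangChain LLM could be passed straight to Agent(llm=llm).
```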