Bug in new version of CrewAI

Using a custom Anthropic model gives the following error; I think there is a parsing bug:

File "/Users/jackdu/Library/Caches/pypoetry/virtualenvs/ppt-hh63XJ59-py3.12/lib/python3.12/site-packages/litellm/utils.py", line 970, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jackdu/Library/Caches/pypoetry/virtualenvs/ppt-hh63XJ59-py3.12/lib/python3.12/site-packages/litellm/main.py", line 2847, in completion
    raise exception_type(
  File "/Users/jackdu/Library/Caches/pypoetry/virtualenvs/ppt-hh63XJ59-py3.12/lib/python3.12/site-packages/litellm/main.py", line 838, in completion
    model, custom_llm_provider, dynamic_api_key, api_base = get_llm_provider(
                                                            ^^^^^^^^^^^^^^^^^
  File "/Users/jackdu/Library/Caches/pypoetry/virtualenvs/ppt-hh63XJ59-py3.12/lib/python3.12/site-packages/litellm/litellm_core_utils/get_llm_provider_logic.py", line 507, in get_llm_provider
    raise litellm.exceptions.BadRequestError(  # type: ignore
litellm.exceptions.BadRequestError: litellm.BadRequestError: GetLLMProvider Exception - 'ChatAnthropic' object has no attribute 'split'

original model: model='claude-3-5-sonnet-20240620' max_tokens=3000 anthropic_api_url='https://api.anthropic.com' anthropic_api_key=SecretStr('**********') _client=<anthropic.Anthropic object at 0x1551012e0> _async_client=<anthropic.AsyncAnthropic object at 0x155d88e00>
An error occurred while running the crew: Command '['poetry', 'run', 'run_crew']' returned non-zero exit status 1.

Does CrewAI have a test/QA pipeline within its deployment process?
What published commitment does CrewAI have to backward compatibility?
Does CrewAI publish a list of breaking changes between versions?

I believe all of these are relevant to anyone who intends to use or deploy CrewAI.


Facing the same issue here!
