Hi, I was running this CrewAI example with the latest versions (crewai==0.76.2, crewai-tools==0.13.2, langchain==0.3.4), and I have an internet search function as a tool that takes a query argument (seen in the image below).
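For reference, here is a rough sketch of how such a tool could be defined (not my exact code; the class names, descriptions, and the placeholder search backend are just illustrative), with query declared explicitly through a pydantic args_schema:

```python
from pydantic import BaseModel, Field
from crewai_tools import BaseTool


class SearchInput(BaseModel):
    """Schema the LLM is supposed to follow when calling the tool."""
    query: str = Field(..., description="The search query to run")


class InternetSearchTool(BaseTool):
    name: str = "Internet Search"
    description: str = "Searches the internet for the given query."
    args_schema: type[BaseModel] = SearchInput

    def _run(self, query: str) -> str:
        # Placeholder: call the actual search backend here.
        return f"Results for: {query}"
```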
The issue is that my LLM (OpenAI's gpt-4o-2024-08-06) always returns the argument as name instead of query in its response (e.g., {name: 'ACTUAL_SEARCH_QUERY_FROM_LLM'}). I am not sure whether the LLM is misinterpreting the name field in the tool arguments section (highlighted in the picture) as the function argument.
Can anyone help me with (1) understanding where this tool definition in the prompt comes from and (2) how to resolve this error (so the LLM outputs QUERY instead of NAME)?
Thanks.
Hi @tonykipkemboi, my setup is almost the same; the only difference is how I set the model. I set the model name through the OPENAI_MODEL_NAME environment variable, which I believe CrewAI looks for, and I am using the latest gpt-4o.
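Concretely, something like this (a minimal sketch, assuming CrewAI picks the model up from the environment; the API key is a placeholder):

```python
import os

# Minimal sketch of my setup; the key below is a placeholder.
os.environ["OPENAI_MODEL_NAME"] = "gpt-4o-2024-08-06"
os.environ["OPENAI_API_KEY"] = "sk-..."
```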
And the crewAI library version (0.76.2) is also the latest (as of writing the original thread).
I think you are right… At least the _render_text_description_and_args method is expected to do so (based on its docstring, seen below). Do you think it's a minor bug in the code, @tonykipkemboi?
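For context, here is a rough, self-contained sketch (not CrewAI's actual implementation; ExampleTool and the sample tool are purely illustrative) of what a helper like _render_text_description_and_args would be expected to emit into the prompt per that docstring: each tool's name, description, and the schema of its arguments. The tool's name sitting right next to the argument schema is where I suspect the confusion comes from:

```python
from dataclasses import dataclass
from pydantic import BaseModel, Field


class SearchInput(BaseModel):
    query: str = Field(..., description="The search query to run")


@dataclass
class ExampleTool:
    name: str
    description: str
    args_schema: type[BaseModel]


def render_text_description_and_args(tools) -> str:
    # One line per tool: its name, description, and argument schema.
    lines = []
    for tool in tools:
        args = tool.args_schema.model_json_schema().get("properties", {})
        lines.append(f"{tool.name}: {tool.description}, args: {args}")
    return "\n".join(lines)


if __name__ == "__main__":
    search = ExampleTool(
        name="Internet Search",
        description="Searches the internet for the given query.",
        args_schema=SearchInput,
    )
    print(render_text_description_and_args([search]))
    # e.g. Internet Search: Searches the internet for the given query., args: {'query': {...}}
```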
Thanks.