Error when using the GithubSearchTool: Arguments validation failed: 1 validation error for FixedGithubSearchToolSchema

I get this when using the GithubSearchTool: it fails to fetch the actual data after a few iterations and then moves on to the next agent. I tried passing search_query into tools=[githubtool(search_query)], where search_query contains the query string, but then I get another error saying the tool isn't callable.
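For context, the difference between the two attempts can be sketched like this (a hypothetical fragment; `githubtool` and `search_query` are the names from the question):

```python
# Agents expect a list of tool *instances*; calling the tool first puts
# its return value (not a callable tool object) into the list.
tools = [githubtool]                  # works: pass the instance itself
# tools = [githubtool(search_query)]  # fails: "tool isn't callable"
```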

Error: I encountered an error while trying to use the tool. This was the error: Arguments validation failed: 1 validation error for FixedGithubSearchToolSchema
search_query
Input should be a valid string [type=string_type, input_value={'description': 'Search t…rmation', 'type': 'str'}, input_type=dict]
For further information visit Redirecting....
Tool "Search a github repo's content" accepts these inputs: Tool Name: Search a github repo's content
Tool Arguments: {'search_query': {'description': "Mandatory search query you want to use to search the github repo's content", 'type': 'str'}}
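Reading the traceback, the model passed the tool's argument *specification* (a dict with description/type keys) where a plain string was expected. A minimal plain-Python sketch of that validation, independent of CrewAI:

```python
# Sketch of the string_type check that the tool schema performs:
# search_query must be a str, not the argument-spec dict.

def validate_search_query(value):
    if not isinstance(value, str):
        raise TypeError(
            f"Input should be a valid string, got {type(value).__name__}"
        )
    return value

validate_search_query("crewai agents")  # OK: a plain string passes

try:
    # What the weak LLM actually emitted: the tool's argument spec
    validate_search_query(
        {"description": "Search the repo", "type": "str"}
    )
except TypeError as e:
    print(e)  # prints: Input should be a valid string, got dict
```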

Code:
githubtool = GithubSearchTool(
    github_repo=github_r,
    gh_token=GITHUB_TOKEN,
    content_types=['code', 'repo'],
    config=dict(
        llm=dict(
            provider="groq",
            config=dict(
                model="llama3-8b-8192",
                base_url="https://api.groq.com/openai/v1",
                api_key=GROQ_API_KEY,
                temperature=0.4,
            )
        ),
        embedder=dict(
            provider="huggingface",
            config=dict(
                model="izhx/udever-bloom-1b1",
            ),
        ),
    )
)

...
@agent
def Retriever(self) -> Agent:
    return Agent(
        config=self.agents_config['Retriever'],
        verbose=True,
        tools=[githubtool],
        llm=self.llm
    )

Try changing the LLM to a more capable one. Small LLMs, like Llama 3 8B, sometimes struggle to work with CrewAI. Try llama3-70b-8192. If the issue persists, switch to a newer model family like Llama 3.3; Llama 3 is quite old.
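A sketch of that suggested change against the snippet from the question; only the model name differs, everything else is assumed unchanged:

```python
githubtool = GithubSearchTool(
    github_repo=github_r,
    gh_token=GITHUB_TOKEN,
    content_types=['code', 'repo'],
    config=dict(
        llm=dict(
            provider="groq",
            config=dict(
                model="llama3-70b-8192",  # was "llama3-8b-8192"
                base_url="https://api.groq.com/openai/v1",
                api_key=GROQ_API_KEY,
                temperature=0.4,
            )
        ),
        embedder=dict(
            provider="huggingface",
            config=dict(
                model="izhx/udever-bloom-1b1",
            ),
        ),
    )
)
```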

So I tested the GithubSearchTool outside the scope of the agent run and it works fine with the same model, llama3-8b. I also tried gemma2-9b-it with the GithubSearchTool within the agent scope, and that did actually work, so part of the issue may be related to the Llama 3 model being an old one.

@Imed.k So, is the issue now solved? Can I close this post?