Function_calling_llm not working

It seems like CrewAI is ignoring the `function_calling_llm` parameter, and when you put it in the YAML file it throws an error. Specifying the same LLM in the YAML as `llm: gpt4o` works fine; it just doesn't work as `function_calling_llm: gpt4o`, and no other model works either. Anyone else seeing this?

For example, this in the YAML file:

```yaml
function_calling_llm: gpt4o
```

and this in the definition of that LLM:

```python
@llm
def gpt4o(self):
    return LLM(model="openai/gpt-4o", temperature=0.2, top_p=0.2)
```

results in:

```
  File "/Users/ai/anaconda3/envs/crewaitestbed/lib/python3.11/site-packages/crewai/project/crew_base.py", line 37, in __init__
    self.map_all_agent_variables()
  File "/Users/ai/anaconda3/envs/crewaitestbed/lib/python3.11/site-packages/crewai/project/crew_base.py", line 89, in map_all_agent_variables
    self._map_agent_variables(
  File "/Users/ai/anaconda3/envs/crewaitestbed/lib/python3.11/site-packages/crewai/project/crew_base.py", line 121, in _map_agent_variables
    self.agents_config[agent_name]["function_calling_llm"] = agents[
                                                             ^^^^^^^
KeyError: 'gpt4o'
```
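For what it's worth, the traceback boils down to a plain dict lookup failing: the YAML value `gpt4o` is looked up in a mapping that apparently doesn't contain the `@llm`-decorated method. A minimal sketch of that failure mode (the variable names are illustrative, not CrewAI's actual internals):

```python
# Hypothetical sketch of the failure: the loader resolves the string from
# the YAML against a dict of collected methods. If 'gpt4o' was never
# registered in that dict, the lookup raises KeyError, as in the traceback.
agents_config = {"researcher": {"function_calling_llm": "gpt4o"}}
collected = {}  # the @llm-decorated 'gpt4o' is missing from this mapping

name = agents_config["researcher"]["function_calling_llm"]
try:
    resolved = collected[name]
except KeyError as exc:
    print(f"KeyError: {exc}")  # prints: KeyError: 'gpt4o'
```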

Pretty sure you cannot set `function_calling_llm` via YAML, as there is no annotation for it.

I thought all standard agent attributes were allowed in the YAML file? If not, which ones are?
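As a stopgap, the only form reported to work above is the plain `llm:` key. A minimal `agents.yaml` fragment under that assumption (the agent name is illustrative):

```yaml
researcher:                        # agent name is hypothetical
  llm: gpt4o                       # resolves fine against the @llm method
  # function_calling_llm: gpt4o   # this line triggers the KeyError above
```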