Hi, I am trying to set up a crew with Gemini as the manager, but with no success. It seems like the manager is not picking up the Gemini key, even though it works for the other agents:
financial_trading_crew = Crew(
    agents=[data_analyst_agent,
            trading_strategy_agent,
            execution_agent,
            risk_management_agent],
    tasks=[data_analyst_task,
           strategy_developement_task,
           execution_planning_task,
           risk_assessment_task],
    manager_llm=ChatGoogleGenerativeAI(
        model="google/gemini-2.0-flash",
        verbose=True,
        temperature=0.1,
        google_api_key=os.environ["GEMINI_API_KEY"],
    ),
    process=Process.hierarchical,
)
Error during LLM call: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=model='models/google/gemini-2.0-flash' google_api_key=SecretStr('**********') temperature=0.1 client=<google.ai.generativelanguage_v1beta.services.generative_service.client.GenerativeServiceClient object at 0x7fd2c9acd840> default_metadata=() verbose=True
Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..)
Learn more: Providers | liteLLM
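From the error, it looks like LiteLLM wants a provider-prefixed model string (something like gemini/gemini-2.0-flash) rather than a LangChain ChatGoogleGenerativeAI object. Below is a minimal sketch of what I am guessing it expects, assuming my CrewAI version exposes the crewai.LLM wrapper and that the gemini/ provider reads GEMINI_API_KEY; the wrapper and parameter names here are my assumption, not something I have confirmed. Is this the intended way to pass a Gemini manager?

import os
from crewai import Crew, Process, LLM  # assumes a CrewAI version that ships the LLM wrapper

# Sketch: give LiteLLM a provider-prefixed model name instead of a
# LangChain ChatGoogleGenerativeAI instance.
gemini_manager = LLM(
    model="gemini/gemini-2.0-flash",       # "gemini/" prefix tells LiteLLM which provider to call
    temperature=0.1,
    api_key=os.environ["GEMINI_API_KEY"],  # same env var the other agents use
)

financial_trading_crew = Crew(
    agents=[data_analyst_agent,
            trading_strategy_agent,
            execution_agent,
            risk_management_agent],
    tasks=[data_analyst_task,
           strategy_developement_task,
           execution_planning_task,
           risk_assessment_task],
    manager_llm=gemini_manager,
    process=Process.hierarchical,
)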