You may have to set the planning_llm manually. Give this a try (replace the settings with your model/config):
Crew(
    ...
    planning=True,  # planning must be enabled for planning_llm to take effect
    planning_llm=LLM(
        model="gemini/gemini-1.5-pro-002",
        api_key=GEMINI_API_KEY,
        temperature=0,
    ),
)
The CrewAI docs set up planning_llm slightly differently in their example, using the ChatOpenAI
class from langchain_openai for the LLM.
Source: Planning - CrewAI
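For reference, the docs-style setup looks roughly like this (a sketch based on the CrewAI planning docs; the model name and the elided agents/tasks are placeholders, not your actual config):

```python
from langchain_openai import ChatOpenAI

Crew(
    ...,  # your agents/tasks as before
    planning=True,
    # docs use an OpenAI model here; swap in whatever your account supports
    planning_llm=ChatOpenAI(model="gpt-4o"),
)
```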
If the first option doesn’t work, another idea would be to try a modified version of the doc example, using the ChatGoogleGenerativeAI
class from langchain_google_genai instead. Note that you may need to pass API keys/required values into this class as necessary.
Source: ChatGoogleGenerativeAI — 🦜🔗 LangChain documentation
Example:
from langchain_google_genai import ChatGoogleGenerativeAI
...
...
Crew(
    ...
    planning=True,  # required for planning_llm to be used
    planning_llm=ChatGoogleGenerativeAI(model="gemini-1.5-pro"),
)