Hi CrewAI team,
I’m configuring agents using the `agents.yaml` file. Here’s a sample agent that uses Ollama:
```yaml
user_journey_agent:
  role: >
    User Journey Designer
  goal: >
    An AI agent that will design learning journeys and goals for the user based on specific topic inputs.
  backstory: >
    This agent anticipates the user's educational goals, proficiency needs, and preferred learning style to craft a structured learning journey.
  llm: ollama/llama3.1
```
In Python, I usually initialise the LLM like this:

```python
from crewai import LLM

llm = LLM(model="ollama/llama3.1", base_url="http://localhost:11434", temperature=0.01, seed=42)
```
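My current workaround is to build the LLM object in Python and attach it to the agent when the crew is assembled. A minimal sketch of that wiring, assuming the standard `@CrewBase` project layout (the class name `LearningCrew` and the config path are placeholders from my project):

```python
from crewai import Agent, LLM
from crewai.project import CrewBase, agent

@CrewBase
class LearningCrew:
    # Points at the agents.yaml shown above (path is my project's layout).
    agents_config = "config/agents.yaml"

    @agent
    def user_journey_agent(self) -> Agent:
        # Passing an LLM object here overrides the plain `llm:` string
        # from agents.yaml, which is how I get temperature/seed in today.
        llm = LLM(
            model="ollama/llama3.1",
            base_url="http://localhost:11434",
            temperature=0.01,
            seed=42,
        )
        return Agent(config=self.agents_config["user_journey_agent"], llm=llm)
```

This works, but it means the model settings live in Python rather than alongside the rest of the agent definition in YAML, which is what I'd like to avoid.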
How can I pass parameters like `seed` and `temperature` when using the YAML config approach?
Is there an extended syntax or a separate YAML section where these should be specified?
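For example, I wondered whether a nested mapping like this would be recognised. To be clear, this nesting is purely my guess, not syntax I've seen documented:

```yaml
user_journey_agent:
  # ... role/goal/backstory as above ...
  llm:
    model: ollama/llama3.1
    base_url: http://localhost:11434
    temperature: 0.01
    seed: 42
```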
I can see that, by default, CrewAI picks up environment variables such as:

```
MODEL=ollama/llama3.1
API_BASE=http://localhost:11434
```
However, adding something like `SEED=42` to `.env` doesn’t seem to affect the output or behaviour.
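Concretely, this is the kind of `.env` I experimented with; the last line appears to be silently ignored:

```
MODEL=ollama/llama3.1
API_BASE=http://localhost:11434
SEED=42
```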
Appreciate your guidance.