How to Set Custom LLM Parameters (like `temperature`, `seed`) in YAML Config for Ollama

Hi CrewAI team,

I’m configuring agents using the agents.yaml file. Here’s a sample agent using Ollama:

user_journey_agent:
  role: >
    User Journey Designer
  goal: >
    An AI agent that will design learning journeys and goals for the user based on specific topic inputs.
  backstory: >
    This agent anticipates the user's educational goals, proficiency needs, and preferred learning style to craft a structured learning journey.
  llm: ollama/llama3.1

In Python, I usually initialise the LLM like this:

LLM(model="ollama/llama3.1", base_url="http://localhost:11434", temperature=0.01, seed=42)

How can I pass parameters like seed and temperature when using the YAML config approach?
Is there an extended syntax or a separate YAML section where these should be specified?
By default, I can see CrewAI uses environment variables such as:

MODEL=ollama/llama3.1
API_BASE=http://localhost:11434

However, adding something like SEED=42 to .env doesn’t seem to affect the output or behaviour.
Appreciate your guidance.

The issue has been resolved by passing the llm object in crew.py as follows:

from crewai import Agent, Crew, Process, Task, LLM
from crewai.project import CrewBase, agent, crew, task

# Shared LLM instance: a near-zero temperature and a fixed seed make the output reproducible
llm = LLM(
    model="ollama/llama3.1",
    base_url="http://localhost:11434",
    temperature=0.01,
    seed=42
)

@CrewBase
class LearningAssistant():
    """LearningAssistant crew"""

    agents_config = 'config/agents.yaml'
    tasks_config = 'config/tasks.yaml'

    @agent
    def user_journey_agent(self) -> Agent:
        # Attach the shared LLM so the YAML-configured agent picks up the custom temperature and seed
        return Agent(
            config=self.agents_config['user_journey_agent'],
            llm=llm,
            verbose=True
        )
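
For completeness, the rest of the class follows the standard CrewBase pattern. The sketch below is only illustrative: the task name design_journey_task (and its matching entry in config/tasks.yaml) is a hypothetical placeholder, not something from my project.

    @task
    def design_journey_task(self) -> Task:
        # Hypothetical task; the name must match an entry in config/tasks.yaml
        return Task(
            config=self.tasks_config['design_journey_task']
        )

    @crew
    def crew(self) -> Crew:
        # Assembles the decorated agents and tasks into a sequential crew
        return Crew(
            agents=self.agents,
            tasks=self.tasks,
            process=Process.sequential,
            verbose=True
        )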

I have also commented out (hashed) everything in the .env file and retried with this method, so we can be sure the response stays the same across repeated attempts.
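
To double-check the determinism, one option is to kick the crew off twice with the same inputs and compare the raw outputs. A minimal sketch, assuming a "topic" input key (that key is an assumption, not from my tasks.yaml):

# Hypothetical reproducibility check
inputs = {"topic": "Introduction to machine learning"}

first = LearningAssistant().crew().kickoff(inputs=inputs)
second = LearningAssistant().crew().kickoff(inputs=inputs)

# With temperature=0.01 and seed=42 on the Ollama LLM, both runs should match
print(first.raw == second.raw)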