I am testing the custom_system_template and custom_prompt_template parameters with my agents.
I was surprised when I inspected the messages sent to the LLM: CrewAI still builds the prompts from the default templates.
This is my implementation, which I believe is correct:
```python
from typing import List

from crewai import Agent, Crew, Process, Task, LLM
from crewai.project import CrewBase, agent, crew, task
from crewai.agents.agent_builder.base_agent import BaseAgent

# LLM configuration
llm = LLM(
    model="openai/gpt-4.1-mini",
    temperature=0.7,
)


@CrewBase
class TranslatorCrew:
    agents: List[BaseAgent]
    tasks: List[Task]

    # YAML configuration files
    agents_config = "config/agents_translator.yaml"
    tasks_config = "config/tasks_translator.yaml"

    # Custom templates
    custom_system_template = """You are {role}. {backstory}
Your goal is: {goal}
Respond naturally and conversationally. Focus on providing helpful and accurate information."""

    custom_prompt_template = """
Task: {input}
Please complete this task carefully and professionally.
Additional context:
- Use a professional but accessible tone
- Maintain consistency in style
- Ensure content is clear and understandable
"""

    @agent
    def translator(self) -> Agent:
        return Agent(
            config=self.agents_config["translator"],
            system_template=self.custom_system_template,
            prompt_template=self.custom_prompt_template,
            max_iter=3,
            llm=llm,
            verbose=True,
        )

    @agent
    def optimizer(self) -> Agent:
        return Agent(
            config=self.agents_config["optimizer"],
            # This agent does NOT have custom templates, for comparison
            max_iter=3,
            llm=llm,
            verbose=True,
        )

    @task
    def translation_task(self) -> Task:
        return Task(
            config=self.tasks_config["translation_task"],
            agent=self.translator(),
        )

    @task
    def optimization_task(self) -> Task:
        return Task(
            config=self.tasks_config["optimization_task"],
            agent=self.optimizer(),
            context=[self.translation_task()],
        )

    @crew
    def crew(self) -> Crew:
        """Create the crew for translation and optimization."""
        return Crew(
            agents=self.agents,
            tasks=self.tasks,
            process=Process.sequential,
            memory=False,
            planning=False,
            verbose=True,
        )
```
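As a sanity check on the placeholder names, I also rendered my two templates locally with plain `str.format` (no CrewAI involved; the role/goal/backstory/input values below are made up, not my real YAML config), so I know exactly what the two messages should look like in the trace if the templates were applied:

```python
# Render the custom templates locally to see the exact messages I would
# expect in the LLM trace if CrewAI applied them. The field values are
# made-up placeholders, not my real agent/task config.
system_template = """You are {role}. {backstory}
Your goal is: {goal}
Respond naturally and conversationally. Focus on providing helpful and accurate information."""

prompt_template = """
Task: {input}
Please complete this task carefully and professionally.
"""

expected_system = system_template.format(
    role="Senior Translator",
    backstory="You have translated technical documentation for ten years.",
    goal="Translate the source text into natural English.",
)
expected_user = prompt_template.format(
    input="Translate 'Hola, mundo' into English."
)

print(expected_system)
print(expected_user)
```

Neither of these rendered messages shows up in the trace; I only see the default wording.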
This is what the documentation says: Customizing Prompts - CrewAI
However, when I inspect the output in a tool such as AgentOps, I still see the default templates.
If I am not mistaken, CrewAI loads these default prompt slices from here: crewAI/src/crewai/translations/en.json at main · crewAIInc/crewAI · GitHub
"task": "\nCurrent Task: {input}\n\nBegin! This is VERY important to you, use the tools available and give your best Final Answer, your job depends on it!\n\nThought:", -> custom_promp_template
"lite_agent_system_prompt_without_tools": "You are {role}. {backstory}\nYour personal goal is: {goal}\n\nTo give my best complete final answer to the task respond using the exact following format:\n\nThought: I now can give a great answer\nFinal Answer: Your final answer must be the great and the most complete as possible, it must be outcome described.\n\nI MUST use these formats, my job depends on it!", -> custom_system_template
I have not modified this file in my project, because it applies to the entire crew; I want to apply my configuration to a specific agent, as the documentation indicates.
I have also created a script to inspect the complete messages passed to the LLM from my terminal as indicated in the documentation. These messages also show the default templates sent by CrewAI.
Is my implementation correct? Thanks for your help!