I have wrapped a basic CrewAI Flow inside a FastAPI endpoint, but I'm encountering unexpected behavior: exceptions raised in the `generate_fun_fact` method (decorated with `@listen`) are not propagated to the caller. As a result, my `/process_email` endpoint incorrectly returns a `200` status even in cases where it should fail.

Here's a minimal reproducible example:
```python
import logging

from crewai.flow.flow import Flow, listen, start
from fastapi import APIRouter, Request
from litellm import completion  # unused in this trimmed-down repro

router = APIRouter()
log = logging.getLogger(__name__)


class EmailProcessingException(Exception):
    """Application-specific exception (simplified for this repro)."""

    def __init__(self, message: str):
        super().__init__(message)


class ExampleFlow(Flow):
    model = "gpt-4o-mini"

    @start()
    async def start(self):
        pass

    @listen(start)
    async def generate_fun_fact(self):
        try:
            assert False  # This should raise an AssertionError
        except Exception as e:
            print(f"Error in generate_fun_fact: {e}")
            raise


@router.post("/process_email/{uuid}")
async def process_email(uuid: str, request: Request):
    try:
        _result = await ExampleFlow().kickoff_async()
        return "Workflow Completed Successfully"
    except Exception as e:
        log.error(f"Unexpected error in test: {str(e)}")
        raise EmailProcessingException(message=str(e))
```
The Issue:

- When an exception is raised inside a method decorated with `@listen`, it does not propagate out of `kickoff_async()`, so the FastAPI endpoint returns `200 OK` instead of failing as expected.
- However, when the method raising the exception is decorated with `@start`, the exception propagates and the endpoint fails as expected (see the contrast sketch after this list).
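For contrast, here is a minimal sketch of the `@start` variant that does fail correctly for me. The flow name `FailingStartFlow` and method name `begin` are my own, and it assumes the same `router`/`log`/endpoint setup as above:

```python
from crewai.flow.flow import Flow, start


class FailingStartFlow(Flow):
    @start()
    async def begin(self):
        # Raising here *does* propagate out of kickoff_async(),
        # so the endpoint's except-branch runs and an error is returned.
        assert False
```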
Questions:

- Is this a bug in CrewAI?
- Does anyone know of a workaround to ensure exceptions inside `@listen` handlers propagate correctly?
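One workaround I'm considering (a sketch only, not verified against CrewAI internals): record the failure in the flow's unstructured state inside the listener, then inspect the state after `kickoff_async()` returns and raise manually. The `"error"` key and the `SafeExampleFlow` name are my own invention, not CrewAI API:

```python
from crewai.flow.flow import Flow, listen, start


class SafeExampleFlow(Flow):
    @start()
    async def begin(self):
        pass

    @listen(begin)
    async def generate_fun_fact(self):
        try:
            assert False
        except Exception as e:
            # Stash the exception in the flow's unstructured state
            # so the caller can detect the failure after kickoff.
            self.state["error"] = e
            raise


@router.post("/process_email_safe/{uuid}")
async def process_email_safe(uuid: str, request: Request):
    flow = SafeExampleFlow()
    await flow.kickoff_async()
    # Even if the raise above is swallowed by the flow runner,
    # the recorded error is still visible here.
    error = flow.state.get("error")
    if error is not None:
        raise EmailProcessingException(message=str(error))
    return "Workflow Completed Successfully"
```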