As far as I know, there's no `Flow.kickoff_for_each()` or `Flow.kickoff_for_each_async()` method, so you can simply implement your own batch processing (either sequential or parallel). Personally, I'd prefer to implement it using a `ThreadPool`.
Take a look at how CrewAI itself defines its `Crew.kickoff_async()` (in `crewai/crew.py`):
```python
async def kickoff_async(self, inputs: Optional[Dict[str, Any]] = {}) -> CrewOutput:
    """Asynchronous kickoff method to start the crew execution."""
    return await asyncio.to_thread(self.kickoff, inputs)
```
And `asyncio.to_thread` is nothing more than a call submitted to the default `ThreadPoolExecutor`, over which you have no control. So I think it's much better to manage your batch processing yourself, which lets you limit how many threads are active at the same time and avoid flooding your LLM with requests. Also, call me old-fashioned, but I find the mental model of `ThreadPool`s much more intelligible than all the `asyncio` machinery, and our home processors have more than enough power to run plenty of threads.
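For illustration, here's a minimal sketch of what such a batch kickoff could look like, built on `concurrent.futures.ThreadPoolExecutor`. The names `kickoff_for_each` and `flow_factory` are my own, not part of CrewAI; `flow_factory` stands for any zero-argument callable (e.g. your `Flow` subclass) that returns a fresh flow per input:

```python
from concurrent.futures import ThreadPoolExecutor

def kickoff_for_each(flow_factory, batch_inputs, max_workers=4):
    """Run one Flow per input dict, at most `max_workers` at a time.

    `flow_factory` is any zero-argument callable returning a fresh Flow
    (e.g. `MyFlow`), so each input gets its own isolated flow state.
    """
    def run_one(inputs):
        return flow_factory().kickoff(inputs=inputs)

    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # map() preserves the input order in the returned results
        return list(pool.map(run_one, batch_inputs))
```

You'd call it like `results = kickoff_for_each(MyFlow, [{"text": t} for t in texts], max_workers=4)`, tuning `max_workers` to whatever request rate your LLM provider tolerates.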
As for processing one input per `Flow`, well, that's an optimization question and will fundamentally depend on the size (and complexity) of each input for your chosen LLM. It's known that LLMs suffer from what has been dubbed Context Rot: their processing quality degrades rapidly as the context (prompt) grows. You'll have to experiment and find a trade-off between maximizing the quantity processed per call and the quality of the processing.
In that case, my advice is to gradually increase the number of inputs per call. Pass a list of inputs, give each one a unique ID (so the LLM can use that identifier as a key), ask the LLM to perform the processing, and have it return a list like `[{"input_id": "A1B2C3", "result": "I hate humans..."}, {...}, {...}]`. This way you can keep track of each input-output pair while processing multiple inputs in the same call. Combined with the thread-pool approach above, this could get you some very interesting results.
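As a sketch of that ID-keyed protocol (the helper names `build_batch_prompt` and `match_results` are hypothetical, and the exact prompt wording is just one possibility):

```python
import json
import uuid

def build_batch_prompt(texts):
    """Attach a unique ID to each input and ask for ID-keyed results."""
    items = [{"input_id": uuid.uuid4().hex[:8], "text": t} for t in texts]
    prompt = (
        "Process each item below and return ONLY a JSON list of objects "
        'shaped like {"input_id": "...", "result": "..."}, one per item.\n\n'
        + json.dumps(items, ensure_ascii=False)
    )
    return items, prompt

def match_results(items, llm_response):
    """Map each returned result back to its original input via input_id."""
    by_id = {r["input_id"]: r["result"] for r in json.loads(llm_response)}
    return [(item["text"], by_id.get(item["input_id"])) for item in items]
```

Using `by_id.get()` rather than direct indexing means a missing or malformed entry shows up as `None` instead of crashing the whole batch, which makes it easy to spot which inputs the LLM dropped.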
I hope these thoughts help. Good luck.