Slow response from CrewAI to OpenAI

Hi,

I was wondering why, when I run my CrewAI agents, it takes so long to get a response from OpenAI.
When I send the same data and the same context directly to OpenAI the response is almost immediate, but when I call it via CrewAI the response tends to take a couple of minutes.

Is there something I am doing wrong, or a faster way to get results back from the API?

Thanks.

When using CrewAI there is an initialization and instantiation cost at the framework level. The time the framework spends "booting up" is added to the response time. When you call the API directly, that framework delay isn't there.
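If you want to see the difference yourself, you can time both paths. This is only a rough sketch, assuming a single-agent crew and the official `openai` client; the model name and prompt are placeholders:

```python
import time
from openai import OpenAI
from crewai import Agent, Task, Crew

prompt = "Summarize the benefits of unit testing in two sentences."

# Direct API call: no framework setup, just the HTTP round trip.
client = OpenAI()
start = time.perf_counter()
client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
)
print(f"Direct OpenAI call: {time.perf_counter() - start:.1f}s")

# Same prompt through CrewAI: agent/task/crew construction and the
# framework's own prompt scaffolding run before the model is called.
start = time.perf_counter()
agent = Agent(role="Writer", goal="Answer concisely", backstory="A concise assistant")
task = Task(description=prompt, expected_output="A two-sentence summary", agent=agent)
crew = Crew(agents=[agent], tasks=[task])
crew.kickoff()
print(f"CrewAI kickoff: {time.perf_counter() - start:.1f}s")
```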

If your crews/agents are not using any tools, try using LiteLLM directly; it's faster.
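A minimal LiteLLM sketch, assuming your `OPENAI_API_KEY` is set in the environment (the model name and prompt are placeholders):

```python
from litellm import completion

# Single completion call, with no agent framework in between.
response = completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize the benefits of unit testing in two sentences."}],
)
print(response.choices[0].message.content)
```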