Using AWS Bedrock in CrewAI Enterprise

I would like to use Bedrock (Llama models) as the default LLM for my crews, so I defined an LLM Connection (see attached screenshot):

I also defined AWS_SECRET_KEY and AWS_KEY in the Environment Variables section.
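One thing worth double-checking here — this is an assumption on my part, based on CrewAI routing Bedrock calls through LiteLLM — is the variable names themselves: LiteLLM looks for AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_REGION_NAME, so names like AWS_KEY and AWS_SECRET_KEY may not be picked up at all. Something like:

```shell
# Assumption: Bedrock credentials are resolved via LiteLLM, which reads
# these standard variable names. The values below are placeholders only.
export AWS_ACCESS_KEY_ID="placeholder-access-key-id"
export AWS_SECRET_ACCESS_KEY="placeholder-secret-access-key"
export AWS_REGION_NAME="us-east-1"
```

The region variable matters too, since Bedrock model availability differs per region.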

I also set Bedrock as my default model:

Now I have deployed the "Personalized Outreach Crew" with a dummy OPENAI_API_KEY (it would not allow me to deploy without specifying a value for this key). I am waiting for the deployment to complete.

Meanwhile, I would like to understand how I can tell my crew/agents to use the LLM connection I defined.

Hi Chandra, can you try this:

Yes, I have also set these defaults to the LLM connection for AWS Bedrock.

However, I am stuck in deployment (see the other thread you responded to), so I am unable to test it :frowning:
