Code:
from textwrap import dedent

from crewai import Crew, Agent, Task, LLM

bedrock_llm = LLM(model="bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0")

bedrock_agent = Agent(
    role="Query Execution Agent",
    goal=dedent(""""""),
    backstory="",
    verbose=True,
    tools=[execute_query_tool],
    llm=bedrock_llm,
    allow_delegation=False,
)
I’ve also tried this:
import boto3
from botocore.config import Config
from langchain_aws import ChatBedrock

boto_config = Config(
    retries=dict(
        max_attempts=10,
        total_max_attempts=25,
    )
)

bedrock_agent_runtime_client = boto3.client(
    "bedrock-agent-runtime", config=boto_config, region_name="us-east-1"
)

bedrock_llm: ChatBedrock = ChatBedrock(
    model_id="bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0",
    model_kwargs=dict(temperature=0.5, max_tokens=4096),
    client=bedrock_agent_runtime_client,
    region_name="us-east-1",
)
bedrock_agent = Agent(
    role="Query Execution Agent",
    goal=dedent(""""""),
    backstory="",
    verbose=True,
    tools=[execute_query_tool],
    llm=bedrock_llm,
    allow_delegation=False,
)
My app is running in ECS Fargate, so it should be using the short-term credentials from the IAM role assigned to the task.
I am getting an error because the OPENAI_API_KEY environment variable is not set.
Do I need to set this environment variable if I’m not using OpenAI?
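One workaround I'm considering is giving the variable a dummy value before crewai is imported. This assumes the library only checks that the variable exists, not that the key is valid, when another provider is configured (I haven't confirmed that's the case):

```python
import os

# Hypothetical workaround: satisfy a presence check for OPENAI_API_KEY
# with a placeholder before crewai is imported. Assumption: the key is
# never actually sent to OpenAI when the agent's llm is a Bedrock model.
os.environ.setdefault("OPENAI_API_KEY", "NA")
```

But that feels like a hack, so I'd rather know the intended configuration.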
What is the correct way to use AWS Bedrock as the model provider for my agents? The documentation isn’t super helpful here.
crewai 0.67.1
crewai-tools 0.12.1