Help with AWS Bedrock LLM connectivity

I used the following script to initialize the LLM using AWS Bedrock:

from llama_index.core.settings import Settings
from llama_index.llms.bedrock import Bedrock
from llama_index.embeddings.bedrock import BedrockEmbedding
from crewai import Agent, Crew, Process, Task, LLM

region_name = "us-east-1"
model = "bedrock/mistral.mixtral-8x7b-instruct-v0:1"

llm = LLM(Bedrock(model=model, region_name=region_name, temperature=0, max_tokens=4000, context_size=32000, timeout=240))

But I get the following error:
raise ValueError(f"Provider {provider_name} for model {model} is not supported")
ValueError: Provider bedrock/mistral for model bedrock/mistral.mixtral-8x7b-instruct-v0:1 is not supported

Can someone please help me debug this?

I also tried the following:

#from .crews.poem_crew.poem_crew import PoemCrew
from llama_index.core.settings import Settings
from llama_index.llms.bedrock import Bedrock
from llama_index.embeddings.bedrock import BedrockEmbedding
from crewai import Agent, Crew, Process, Task, LLM

region_name = "us-east-1"
model = "bedrock/mistral.mixtral-8x7b-instruct-v0:1"
#embed_model_name = "amazon.titan-embed-text-v2"

llm = LLM(model=model, region_name=region_name, temperature=0, max_tokens=4000, context_size=32000, timeout=240, verbose=True)

But I get the following error:
ERROR: LiteLLM call failed: litellm.APIConnectionError: BedrockException - {"Message":"User: arn:aws:sts::xxxxxxxxxxxxxxxx is not authorized to perform: bedrock:InvokeModel on resource: arn:aws:bedrock:us-west-2::foundation-model/mistral.mixtral-8x7b-instruct-v0:1 with an explicit deny in a service control policy"}

Check your IAM permissions. As stated in the Bedrock docs:

NotAuthorized

HTTP Status Code: 400

Cause: You do not have permission to perform this action

Solution:

  • Review your IAM permissions and ensure you have the necessary rights to perform the requested action on Amazon Bedrock resources
  • If you are using an IAM role, verify that the role has the appropriate permissions and trust relationships
  • Check for any organizational policies or service control policies that might be restricting your access
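To isolate the problem, here is a minimal sketch (the model ID and prompt format assume the Mistral model from the original post) that calls Bedrock directly with boto3, bypassing CrewAI/LiteLLM entirely. If this also fails, the problem is IAM/SCP, not the framework:

import json
import boto3

# Call Bedrock directly to separate IAM/SCP problems from framework problems.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.invoke_model(
    modelId="mistral.mixtral-8x7b-instruct-v0:1",  # bare model ID, no "bedrock/" prefix
    body=json.dumps({"prompt": "<s>[INST] Say hello [/INST]", "max_tokens": 20}),
)
print(json.loads(response["body"].read()))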

Hi @rokbenko, I have access to the LLM in region_name="us-east-1", and I have set the parameters accordingly. But the error I am getting says the model resolves to a different region (us-west-2), where I do not have access. Can you please let me know what I am missing in my script?

@debjani, please set your environment variables like so in a .env file:

# AWS Credentials for Bedrock
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_REGION_NAME=us-east-1
MODEL=bedrock/mistral.mixtral-8x7b-instruct-v0:1

Then instantiate the LLM like so:

import os
from crewai import LLM
from dotenv import load_dotenv

load_dotenv()  # make the .env values visible to os.getenv

llm = LLM(
    model=os.getenv('MODEL'),
    aws_access_key_id=os.getenv('AWS_ACCESS_KEY_ID'),
    aws_secret_access_key=os.getenv('AWS_SECRET_ACCESS_KEY'),
    aws_region_name=os.getenv('AWS_REGION_NAME')
)

@tonykipkemboi I remember that just adding entries to .env was sufficient in 0.86.0. Has that changed?

The code that gets autogenerated by crewai create crew "name" when you provide AWS details gives no indication that we need to create an LLM instance and pass it as llm to the agent.
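For anyone hitting this, a minimal sketch of wiring an explicit LLM instance into an agent (the role/goal/backstory strings are placeholders, not from the generated template):

from crewai import Agent, LLM

llm = LLM(
    model="bedrock/mistral.mixtral-8x7b-instruct-v0:1",
    aws_region_name="us-east-1",
)

planner = Agent(
    role="Content Planner",                 # placeholder
    goal="Plan engaging, factual content",  # placeholder
    backstory="An experienced planner.",    # placeholder
    llm=llm,  # pass the explicit LLM instance instead of relying on .env alone
)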

I tried this approach using the following code:

import boto3
import os

from dotenv import load_dotenv

from langchain.embeddings import BedrockEmbeddings
from langchain.llms import Bedrock
from crewai import LLM

load_dotenv()

ak_access_id = os.getenv('AWS_ACCESS_KEY_ID')
ak_secret = os.getenv('AWS_SECRET_ACCESS_KEY')

print("Access Key:", ak_access_id)
print("Secret Key:", ak_secret)

# Create a client to interact with the Bedrock service using AWS credentials
bedrock = boto3.client(
    service_name="bedrock-runtime",
    aws_access_key_id=ak_access_id,
    aws_secret_access_key=ak_secret,
    region_name=os.getenv('AWS_REGION_NAME'),
)  # ,config = Config(read_timeout=100 * 60)

# Initialize BedrockEmbeddings with a specific model and the Bedrock client
bedrock_embeddings = BedrockEmbeddings(model_id="amazon.titan-embed-text-v1", client=bedrock)

# Initialize the Bedrock language model with the same client
llm = Bedrock(model_id="anthropic.claude-v2:1", client=bedrock, model_kwargs={"temperature": 0.1})

# This reassignment replaces the langchain Bedrock LLM above with a CrewAI LLM
llm = LLM(
    model=os.getenv('MODEL'),
    aws_access_key_id=os.getenv('AWS_ACCESS_KEY_ID'),
    aws_secret_access_key=os.getenv('AWS_SECRET_ACCESS_KEY'),
    aws_region_name=os.getenv('AWS_REGION_NAME'),
)

But when I run result = crew.kickoff(inputs={"topic": "Artificial Intelligence"}) in VS Code, I get the following output and error:

[DEBUG]: == Working Agent: Content Planner
[INFO]: == Starting Task: 1. Prioritize the latest trends, key players, and noteworthy news on Artificial Intelligence.
2. Identify the target audience, considering their interests and pain points.
3. Develop a detailed content outline including an introduction, key points, and a call to action.
4. Include SEO keywords and relevant data or sources.

Entering new CrewAgentExecutor chain…

--------------------------------------------------------------------------- ConnectError Traceback (most recent call last) …

APIConnectionError: Connection error.

How do I connect to the LLM using Bedrock now?

I am stuck on this issue after upgrading crewai from 0.55 to 0.95. I am using Bedrock and have permission in us-east-1. Whenever I run the application with the correct credentials and region, the error always points to us-west-2. It looks like some additional handling is required to make it work through LiteLLM.
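One thing worth trying (an assumption on my part, since I haven't traced LiteLLM's source here): LiteLLM seems to fall back to us-west-2 for Bedrock when it cannot resolve a region, so set the region environment variables explicitly before the LLM is created, and pass the region as a parameter as well:

import os

# Assumption: LiteLLM falls back to us-west-2 for Bedrock when no region
# is resolved; set these before the CrewAI LLM is created.
os.environ["AWS_REGION_NAME"] = "us-east-1"
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"

from crewai import LLM

llm = LLM(
    model="bedrock/mistral.mixtral-8x7b-instruct-v0:1",
    aws_region_name="us-east-1",  # pass the region explicitly too
)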