Help with AWS Bedrock LLM connectivity

I used the following script to initialize the LLM via AWS Bedrock:

from llama_index.core.settings import Settings
from llama_index.llms.bedrock import Bedrock
from llama_index.embeddings.bedrock import BedrockEmbedding
from crewai import Agent, Crew, Process, Task, LLM

region_name = "us-east-1"
model = "bedrock/mistral.mixtral-8x7b-instruct-v0:1"

llm = LLM(Bedrock(model=model, region_name=region_name, temperature=0, max_tokens=4000, context_size=32000, timeout=240))

But I get the following error:
raise ValueError(f"Provider {provider_name} for model {model} is not supported")
ValueError: Provider bedrock/mistral for model bedrock/mistral.mixtral-8x7b-instruct-v0:1 is not supported

Can someone please help me debug this?
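For reference, the provider name in the traceback looks like it is derived by splitting the model id at the first dot, which would explain why the `bedrock/` prefix ends up inside it. A minimal sketch of that (assumed) parsing:

```python
# Hypothetical reconstruction of how the provider appears to be derived
# from the model id, based purely on the wording of the error message.
model = "bedrock/mistral.mixtral-8x7b-instruct-v0:1"
provider = model.split(".")[0]
print(provider)  # prints "bedrock/mistral", matching the error
```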

I also tried the following:

#from .crews.poem_crew.poem_crew import PoemCrew
from llama_index.core.settings import Settings
from llama_index.llms.bedrock import Bedrock
from llama_index.embeddings.bedrock import BedrockEmbedding
from crewai import Agent, Crew, Process, Task, LLM

region_name = "us-east-1"
model = "bedrock/mistral.mixtral-8x7b-instruct-v0:1"
#embed_model_name = "amazon.titan-embed-text-v2"

llm = LLM(model=model, region_name=region_name, temperature=0, max_tokens=4000, context_size=32000, timeout=240, verbose=True)

But I get the following error:
ERROR: LiteLLM call failed: litellm.APIConnectionError: BedrockException - {"Message":"User: arn:aws:sts::xxxxxxxxxxxxxxxx is not authorized to perform: bedrock:InvokeModel on resource: arn:aws:bedrock:us-west-2::foundation-model/mistral.mixtral-8x7b-instruct-v0:1 with an explicit deny in a service control policy"}

Check your IAM permissions. As stated in the Bedrock docs:

NotAuthorized

HTTP Status Code: 400

Cause: You do not have permission to perform this action

Solution:

  • Review your IAM permissions and ensure you have the necessary rights to perform the requested action on Amazon Bedrock resources
  • If you are using an IAM role, verify that the role has the appropriate permissions and trust relationships
  • Check for any organizational policies or service control policies that might be restricting your access
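For reference, an identity policy granting invocation on a specific model might look like the sketch below (the account-less resource ARN is the usual shape for foundation models; adjust the region and model id to yours). Note that your error mentions an explicit deny in a service control policy, and an Allow in your own IAM policy cannot override an SCP deny — that has to be lifted at the organization level.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:us-east-1::foundation-model/mistral.mixtral-8x7b-instruct-v0:1"
    }
  ]
}
```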

Hi @rokbenko, I do have access to the LLM in region_name="us-east-1", and I have set the parameters accordingly. But the error I am getting shows the model being resolved in a different region (us-west-2), where I do not have access. Can you please let me know what I am missing in my script?
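For reference, the region the request actually went to can be read straight out of the ARN in the error message — it is the fourth colon-separated field:

```python
# ARN copied from the error message; ARN fields are colon-separated,
# with the region in position 3 (0-indexed).
arn = "arn:aws:bedrock:us-west-2::foundation-model/mistral.mixtral-8x7b-instruct-v0:1"
region = arn.split(":")[3]
print(region)  # prints "us-west-2"
```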

@debjani, please set your environment variables like so in a .env file:

# AWS Credentials for Bedrock
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_REGION_NAME=us-east-1
MODEL=bedrock/mistral.mixtral-8x7b-instruct-v0:1

Then instantiate the llm variable like so (loading the .env file first, e.g. with python-dotenv):

from dotenv import load_dotenv  # pip install python-dotenv
import os

load_dotenv()  # reads the .env file into the environment

llm = LLM(
    model=os.getenv('MODEL'),
    aws_access_key_id=os.getenv('AWS_ACCESS_KEY_ID'),
    aws_secret_access_key=os.getenv('AWS_SECRET_ACCESS_KEY'),
    aws_region_name=os.getenv('AWS_REGION_NAME')
)