LiteLLM throwing 308 error with OpenAI

```python
# LLM initialization, base_url is our hosted URL
llm = LLM(
    model="gpt-4",
    api_key=f"sk-{alfa_claim}",
    base_url=alfa_base_url,
)

image_text_extractor = Agent(
    config=agents_config['image_text_extractor'],
    goal=f"Extract and analyze text from images efficiently using AI-powered tools. You should get the text from {image_url}",
    tools=[vision_tool],
    verbose=True,
    llm=llm,
)
```

It's throwing the error below:
```
❌ LLM Call Failed
Error: litellm.APIError: APIError: OpenAIException - 308 Permanent Redirect

308 Permanent Redirect

nginx
```
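For anyone hitting the same thing, a quick way to confirm the 308 is nginx redirecting plain HTTP to HTTPS is to call the base URL over `http://` with redirects disabled. This is just a sketch; `alfa_base_url` is the hosted URL from the snippet above:

```python
# Redirect check (sketch), assuming alfa_base_url is the hosted endpoint shown above.
# If the http:// form returns 308 with a Location header pointing at the https://
# form, the server is redirecting plain-HTTP callers.
import requests

http_url = alfa_base_url.replace("https://", "http://")
resp = requests.get(http_url, allow_redirects=False)
print(resp.status_code)               # expect 308 from nginx
print(resp.headers.get("Location"))   # expect the https:// equivalent
```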

Issue resolved after looking into the vision tool's code: the vision tool was not picking up the LLM config from the agent and was making the call over http instead of https (which nginx answers with a 308 redirect). Explicitly passing the LLM config to the vision tool resolved it.
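A minimal sketch of the fix, assuming `crewai_tools`' `VisionTool` accepts an `llm` argument (the exact parameter name may differ between versions): pass the same LLM config to the tool explicitly instead of relying on it inheriting the config from the agent.

```python
# Sketch of the workaround: give the VisionTool the same LLM config as the agent
# so its requests go to the hosted https:// endpoint. The llm= parameter on
# VisionTool is an assumption; check your crewai_tools version for the exact name.
from crewai import Agent, LLM
from crewai_tools import VisionTool

llm = LLM(
    model="gpt-4",
    api_key=f"sk-{alfa_claim}",
    base_url=alfa_base_url,
)

vision_tool = VisionTool(llm=llm)  # explicit config, not inherited from the agent

image_text_extractor = Agent(
    config=agents_config['image_text_extractor'],
    goal=f"Extract and analyze text from images efficiently using AI-powered tools. You should get the text from {image_url}",
    tools=[vision_tool],
    verbose=True,
    llm=llm,
)
```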