Hello everyone, I'd like some help building agents that use a custom LLM service.
I want to use a microservice with a custom API that proxies requests to proprietary LLMs (OpenAI, Claude, Gemini, etc.).
The microservice has its own authentication, so I need to add a custom header and build the request myself instead of using the crewai.LLM method.
I tried using CustomLLM from litellm; here is what I have so far:
import os

import requests

import litellm
from litellm import CustomLLM


class MyCustomLLM(CustomLLM):
    def completion(self, *args, **kwargs) -> litellm.ModelResponse:
        server_url = "https://localhost:8080/orchestration-api"
        headers = {
            "Authorization": f"Bearer {os.environ.get('ORCHESTRATION_API')}"
        }
        payload = {
            "model": kwargs["model"],
            "messages": kwargs["messages"],
        }
        response = requests.post(server_url, headers=headers, json=payload)
        response.raise_for_status()  # fail loudly on a non-2xx reply
        response_data = response.json()
        print(response_data)
        content = response_data["choices"][0]["message"]["content"].strip()

        # Wrap the text in a litellm.ModelResponse; a freshly constructed
        # ModelResponse comes with one empty assistant choice we can fill in.
        model_response = litellm.ModelResponse()
        model_response.choices[0].message.content = content
        return model_response


my_custom_llm = MyCustomLLM()
litellm.custom_provider_map = [
    {"provider": "my-custom-llm", "custom_handler": my_custom_llm}
]
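For context, my understanding of how litellm dispatches to the handler (this is an assumption sketched from the custom_provider_map docs, not from litellm's source) is that the "provider/model" string the agent passes is split on the first slash and the prefix is matched against the registered providers, roughly like this:

```python
# Hedged sketch of litellm's custom-provider routing; the handler here is a
# placeholder string, and resolve_handler is a hypothetical helper that
# mirrors (my reading of) litellm's lookup, not a litellm function.
custom_provider_map = [
    {"provider": "my-custom-llm", "custom_handler": "my_custom_llm_instance"}
]


def resolve_handler(model: str):
    # Split "provider/model" on the first slash, then look the provider
    # prefix up in the registered custom_provider_map entries.
    provider, _, model_name = model.partition("/")
    for entry in custom_provider_map:
        if entry["provider"] == provider:
            return entry["custom_handler"], model_name
    raise ValueError(f"no custom handler registered for {provider!r}")


handler, model_name = resolve_handler("my-custom-llm/gpt-4o")
```

If that reading is right, the agent's model string would need the "my-custom-llm/" prefix for the handler above to be picked up at all.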
However, when the agent makes a request through this LLM, I receive the following error:
Has anyone implemented something similar?
