Portkey LLM integration fails -- CrewAI keeps asking for an OpenAI key

I am following the instructions provided on:

But even after setting
PORTKEY_API_KEY=***********************************
PORTKEY_VIRTUAL_KEY=anthropic-**********
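For completeness, I confirmed both variables are actually visible to the Python process (a quick stdlib-only sanity check, nothing Portkey-specific -- a `.env` file that is never loaded would produce a similar symptom):

```python
import os

# Quick sanity check: confirm both Portkey variables are visible to the
# running Python process, not just set in the shell or a .env file.
required = ("PORTKEY_API_KEY", "PORTKEY_VIRTUAL_KEY")
missing = [name for name in required if not os.getenv(name)]
print("missing:", ", ".join(missing) if missing else "none")
```

Both come back as set, so the keys are being picked up.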

the run still fails with:
LiteLLM.Info: If you need to debug this error, use `litellm._turn_on_debug()'.

ERROR:root:LiteLLM call failed: litellm.AuthenticationError: AuthenticationError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
Error during LLM call: litellm.AuthenticationError: AuthenticationError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
An unknown error occurred. Please check the details below.
Error details: litellm.AuthenticationError: AuthenticationError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

No matter what I try, CrewAI just keeps asking for the OpenAI key. What's going on?

Code:

from portkey_ai import createHeaders, PORTKEY_GATEWAY_URL
from typing import Any, Dict, Optional
from crewai import LLM
import os


def init_portkey_llm(
    model: str,
    portkey_config: Optional[Dict[str, Any]] = None,
) -> LLM:
    """Initialize LLM with Portkey integration"""

    # Get API keys from environment variables
    portkey_api_key = os.getenv("PORTKEY_API_KEY")
    virtual_key = os.getenv("PORTKEY_VIRTUAL_KEY")

    if not portkey_api_key or not virtual_key:
        raise ValueError(
            "PORTKEY_API_KEY and PORTKEY_VIRTUAL_KEY must be set in environment variables"
        )

    # Use provided config or empty dict if None
    config = portkey_config or {}

    # Configure LLM with Portkey integration
    llm = LLM(
        model=model,
        base_url=PORTKEY_GATEWAY_URL,
        api_key="dummy",  # placeholder -- auth is handled by the Portkey virtual key in the headers
        extra_headers=createHeaders(
            api_key=portkey_api_key,
            virtual_key=virtual_key,
            config=config,
        ),
    )

    return llm
