How do LLM Connections inject API keys into the crew process?


I’ve configured an OpenAI LLM Connection in the dashboard, but my crew still fails with AuthenticationError: Incorrect API key: sk-placeholder.... I can confirm:

  • os.environ.get("OPENAI_API_KEY") returns my placeholder at both import time and before_kickoff

  • Removing explicit llm=LLM(...) from agents doesn’t help — the default LLM also uses the placeholder

  • load_dotenv() at the top of crew.py doesn’t resolve it

  • Tool validations in the logs show “OpenAI: Successfully validated tool X” — so something is working

Question: Does an LLM Connection inject the real API key into os.environ? If not, how should I reference it in crew.py?

@Fugsy Try this:

The key issue: if your .env file still contains a placeholder like sk-placeholder, it can shadow the platform-injected value. Note that python-dotenv’s load_dotenv() only replaces an already-set variable when called with override=True; with the default it fills in unset variables only, so the placeholder also wins whenever it lands in the environment before the platform injects the real key. Either way, your code ends up reading the placeholder from .env instead of the platform-provided value.
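Here’s a minimal stdlib-only sketch of that precedence (it mimics python-dotenv’s override semantics rather than calling the real library, and the key values are made up):

```python
import os

def load_env_file(text, override=False):
    """Mimic python-dotenv precedence: by default an existing os.environ
    value wins; with override=True the .env file clobbers it."""
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        key, value = key.strip(), value.strip()
        if override or key not in os.environ:
            os.environ[key] = value

# The platform injects the real key into the process environment first...
os.environ["OPENAI_API_KEY"] = "sk-real-key-from-platform"

# ...then the app loads a .env that still contains a placeholder.
load_env_file("OPENAI_API_KEY=sk-placeholder")
print(os.environ["OPENAI_API_KEY"])  # sk-real-key-from-platform (default: env wins)

load_env_file("OPENAI_API_KEY=sk-placeholder", override=True)
print(os.environ["OPENAI_API_KEY"])  # sk-placeholder (override clobbers the real key)
```

So deleting the placeholder from .env is the robust fix: it removes the bad value from every ordering.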

To fix this:

  1. Remove the placeholder OPENAI_API_KEY from your .env file entirely, or don’t call load_dotenv() in deployed code

  2. Don’t hardcode api_key in LLM(...) — let CrewAI pick it up from the environment variable that the platform injects

  3. Verify your environment variables are correctly configured in the AMP dashboard under LLM Connections or the deployment’s environment variables section
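To confirm which value the deployed process actually sees, you can log a masked fingerprint of the key at startup and compare it against the dashboard, without ever printing the key itself (the helper name below is hypothetical; call it from wherever your crew initializes, e.g. a before_kickoff hook):

```python
import hashlib
import os

def key_fingerprint(name="OPENAI_API_KEY"):
    """Return a safe, non-reversible summary of an env var so the deployed
    value can be compared with the dashboard without logging the secret."""
    value = os.environ.get(name)
    if not value:
        return f"{name} is unset"
    digest = hashlib.sha256(value.encode()).hexdigest()[:8]
    return f"{name}: prefix={value[:6]}..., len={len(value)}, sha256={digest}"

# Compare the prefix/length against the key configured in the LLM Connection.
print(key_fingerprint())
```

If this prints the placeholder’s prefix in the deployed logs, the platform value never reached the process and the .env/step-1 fix above is the one to apply.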