CrewAI and LM Studio authentication issue

Does anyone have a working Python script to interact with a locally installed LM Studio 0.3.5 from CrewAI? I am getting tons of errors based on the inability to connect to LM Studio. It works when testing with import openai, but not through the CrewAI stack. Python 3.12, with pip and CrewAI at the latest versions.

I've been able to use Ollama / Gemma installed locally on LM Studio. Please share your error here and I'll try to see if I can help.

If you are creating an instance of LLM configured to hit localhost, then providing api_key="some random string" should work without issues.

Something like:

my_llm = LLM(model="lm_studio/llama-3.2-3b-instruct", base_url="http://127.0.0.1:1234/v1", api_key="asdf")
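To make the moving parts explicit, here is a minimal sketch of how those three arguments fit together. The lm_studio_config helper is hypothetical (not part of CrewAI); it just builds the LiteLLM-style model string ("lm_studio/" prefix plus the model name shown in LM Studio) and the OpenAI-compatible base URL, which you would then unpack into CrewAI's LLM:

```python
# Hypothetical helper: assemble the kwargs CrewAI's LLM wrapper expects
# for a local LM Studio server. The "lm_studio/" prefix tells LiteLLM
# which provider route to use; the /v1 suffix is LM Studio's
# OpenAI-compatible endpoint path.
def lm_studio_config(model_name: str, host: str = "127.0.0.1", port: int = 1234) -> dict:
    return {
        "model": f"lm_studio/{model_name}",
        "base_url": f"http://{host}:{port}/v1",
        # LM Studio's local server does not check the key, but the client
        # stack requires a non-empty string, so any placeholder works.
        "api_key": "asdf",
    }

cfg = lm_studio_config("llama-3.2-3b-instruct")
# Then, assuming crewai is installed and LM Studio's server is running:
#   from crewai import LLM
#   my_llm = LLM(**cfg)
```

The point of the placeholder key is that the OpenAI-style client refuses to start without one, even though the local server never validates it.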


Since CrewAI uses LiteLLM as the provider layer, you should be able to set the environment variables in .env like this:

MODEL=lm_studio/hermes-3-llama-3.1-8b
OPENAI_API_KEY='openai-api-key'
OPENAI_BASE_URL='http://localhost:1234/v1'
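If you'd rather not keep a .env file, the same variables can be set from Python before importing CrewAI; a minimal sketch (the key value is a placeholder, since LM Studio's local server ignores it):

```python
# Set the same configuration in the process environment instead of .env.
# LiteLLM picks up OPENAI_API_KEY and OPENAI_BASE_URL from the environment,
# so this must run before the CrewAI/LiteLLM code executes.
import os

os.environ["MODEL"] = "lm_studio/hermes-3-llama-3.1-8b"
os.environ["OPENAI_API_KEY"] = "lm-studio"  # placeholder; not validated locally
os.environ["OPENAI_BASE_URL"] = "http://localhost:1234/v1"
```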

Got it working. Thanks a lot.