How to set LM Studio as the LLM provider in CrewAI?

Looking for a Python code snippet I can use to connect a CrewAI agent to my local LLM served by LM Studio. It seems that since langchain-openai is no longer supported (or is deprecated), the code I have stopped working, and I'm getting a ton of errors. I've tried many posts/suggestions without any success. One major issue seems to be with DuckDuckGoSearchRun. Running LM Studio 0.3.5 with a Llama 3 model loaded, Python 3.12.

```python
my_llm = LLM(model="lm_studio/llama-3.2-3b-instruct", base_url="http://127.0.0.1:1234/v1", api_key="asdf")
```

This is what works for me.
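For anyone who wants to see that `LLM` object in context, here is a minimal, untested sketch of a full Agent/Task/Crew run against it. It assumes `crewai` is installed and LM Studio is serving `llama-3.2-3b-instruct` on port 1234; the role/goal/task text is just placeholder example content:

```python
from crewai import Agent, Task, Crew, LLM

# The "lm_studio/" prefix tells CrewAI's LiteLLM backend to use the
# LM Studio (OpenAI-compatible) provider at base_url.
my_llm = LLM(
    model="lm_studio/llama-3.2-3b-instruct",
    base_url="http://127.0.0.1:1234/v1",
    api_key="asdf",  # LM Studio ignores the key, but it must not be empty
)

# Placeholder agent and task, only to exercise the connection
writer = Agent(
    role="Writer",
    goal="Answer questions briefly",
    backstory="A concise assistant.",
    llm=my_llm,
)

task = Task(
    description="Explain what an LLM is in one sentence.",
    expected_output="A single sentence.",
    agent=writer,
)

crew = Crew(agents=[writer], tasks=[task])
print(crew.kickoff())
```

If this hangs or errors, check the LM Studio server tab first: the model identifier shown there must match what you pass after the `lm_studio/` prefix.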


Thank you so much, Vkrishna,

It works for me — finally, after a week of trying. Really appreciate the support.

Here's my new code:

```python
import os

import openai
from crewai import Agent, Task, Crew, Process
from crewai import LLM
from crewai_tools import SerperDevTool
from langchain_community.tools import DuckDuckGoSearchRun
from dotenv import load_dotenv, find_dotenv

# Load environment variables
load_dotenv(find_dotenv())

openai.api_key = ""
openai.api_base = "http://127.0.0.1:1234/v1/models/"

my_llm = LLM(model="lm_studio/llama-3.2-3b-instruct", base_url="http://127.0.0.1:1234/v1", api_key="asdf")

# Initialize search tools
search_tool = DuckDuckGoSearchRun()

# Define agents
researcher_agent = Agent(
    role='Senior Research Analyst',
    goal='Uncover cutting-edge developments in AI and data science',
    backstory="""You are a Senior Research Analyst at a leading tech think tank.
    Your expertise lies in identifying emerging trends and technologies in AI and
    data science. You have a knack for dissecting complex data and presenting
    actionable insights.""",
    llm=my_llm,
    ...
```
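Since DuckDuckGoSearchRun was the sticking point for me too: newer CrewAI releases validate tools against their own tool interface, so passing a raw LangChain tool to an agent can be rejected. One workaround is to wrap it in a CrewAI tool. This is a sketch, not guaranteed against every version — it assumes a CrewAI release that exposes the `tool` decorator from `crewai.tools` (some older releases export it from `crewai_tools` instead), plus `langchain-community` and `duckduckgo-search` installed:

```python
from crewai.tools import tool  # on older releases: from crewai_tools import tool
from langchain_community.tools import DuckDuckGoSearchRun

@tool("DuckDuckGo Search")
def duckduckgo_search(query: str) -> str:
    """Search the web with DuckDuckGo and return the raw result text."""
    # Delegate to the LangChain tool under the hood
    return DuckDuckGoSearchRun().run(query)
```

Then pass `tools=[duckduckgo_search]` to the `Agent(...)` call instead of the raw `search_tool`.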

Ron
