but R1 is the only one running for both agents, and if I swap the two models inside the agents, Llama 64k takes control instead.
Any suggestions? Thanks.
PS: I'm using LM Studio.
When using LM Studio, for now I would recommend using the "openai/{model}" base path.
I tested it the same way you did, using two agents with one individual LLM each, and it worked as expected: agent "researcher" uses llm_llama and agent "reporting_analyst" uses llm_r1 (verified by monitoring LM Studio's model processing).
Hello igi, thanks for your reply!
I think the issue comes from CrewAI or LiteLLM because, as I said, switching between LM Studio and Ollama does not resolve the problem.
Moreover, I tried your suggestion, even using your models, but nothing changed.
Can you share your crewai version, please?
I tried with
crewai==0.100.1
crewai-tools==0.33.0
litellm==1.59.8
and also main branch:
commit 1b488b6da77dc0dc1d96d45e9ef6213b3f8eceeb
…
Did you create your crew using the command:
crewai create crew <project_name>
This automatically generates a virtual environment, which needs to be activated with:
source .venv/bin/activate
Then you execute:
crewai run
Correct?
hint:
Any additional pip installs need to be performed while this virtual environment is activated. Otherwise, these packages cannot be located when running "crewai run", because "crewai run" switches to the virtual environment.
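Putting those steps together, the standard workflow looks roughly like this (project and package names are placeholders):

```shell
# Scaffold a new crew project; this generates a .venv alongside the code
crewai create crew my_crew          # placeholder project name
cd my_crew

# Activate the generated virtual environment
source .venv/bin/activate

# Any extra dependencies must be installed while the venv is active,
# otherwise `crewai run` (which uses this venv) cannot find them
pip install some-extra-package      # placeholder package

# Run the crew from inside the activated environment
crewai run
```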
I think I managed to reproduce your issue…
[edit]
On my machine, this issue seems to occur randomly: after shutting down the LM Studio server and restarting the whole LM Studio app, everything works fine again, and I cannot reproduce the behavior anymore.
My project setup is a little different: I created the project with the create flow command, so I'm using a flow, and I'm on conda, where I already had a venv created for another CrewAI project.
I followed the instructions in the docs: crewai install in my flow repo, then either uv run kickoff or crewai flow kickoff. Both make the first LLM used run for both agents.
Tell me if I did it correctly; otherwise, I'll restart from the beginning.