Hi!
By default, OpenRouter's API runs Grok 4 Fast in non-reasoning mode, and we can pass an enabled parameter in the request if we wish to turn reasoning on. We can also provide an effort level.
But I can't figure out how to pass those additional OpenRouter parameters through to the final API call. My crew uses the standard CrewAI LLM class.
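For context, this is roughly what I mean when calling OpenRouter directly (a minimal sketch; the reasoning object with enabled/effort is how I read OpenRouter's docs, so treat the exact field names as my assumption):

import os
import requests

# Minimal sketch of a direct OpenRouter call with the reasoning options.
# The "reasoning" object with "enabled"/"effort" reflects my reading of
# OpenRouter's docs; adjust if their schema differs.
response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "x-ai/grok-4-fast",
        "messages": [{"role": "user", "content": "Hello!"}],
        "reasoning": {"enabled": True, "effort": "high"},
    },
    timeout=60,
)
print(response.json()["choices"][0]["message"]["content"])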
And a second question: how do I change the GPT-5 reasoning effort?
Thanks in advance!
In theory, you should do the following to enable (and adjust the level of) reasoning for Grok 4 Fast on OpenRouter:
from crewai import LLM
import os

os.environ["OPENROUTER_API_KEY"] = ""

llm = LLM(
    model="openrouter/x-ai/grok-4-fast",
    temperature=0.7,
    reasoning_effort="high",  # or "low" or "medium"
)

print(
    llm.call(
        "Is the number 0.999... (with the 9s repeating forever) equal to 1, "
        "or is it less than 1? Justify your answer."
    )
)
The catch is that, to my surprise, Grok 4 Fast seems perfectly willing to reason even without the reasoning_effort parameter. I asked it this question without passing that parameter and, upon checking the OpenRouter logs, here's what they say:
Tokens: 191 prompt → 666 completion, incl. 324 reasoning
See that? Apparently, it produced reasoning tokens even though reasoning was never requested.
Thanks a lot! Looking at my logs, I can see that you’re right. Perhaps OpenRouter has made some sort of mistake on their side.
You can configure reasoning for both models directly in the CrewAI LLM class. For Grok 4 Fast, use the reasoning_effort parameter (“low”, “medium”, or “high”) to enable and control it. For GPT-5, the same reasoning_effort parameter should be used to adjust the reasoning level.
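If it helps, here's a minimal sketch of the GPT-5 side (the exact model string is my assumption; use whatever identifier your provider or LiteLLM setup expects):

from crewai import LLM

# Minimal sketch: GPT-5 with a chosen reasoning effort.
# "openai/gpt-5" is an assumed model identifier; adjust it to the one
# your provider actually exposes.
gpt5_llm = LLM(
    model="openai/gpt-5",
    reasoning_effort="medium",  # or "low" or "high"
)

print(
    gpt5_llm.call(
        "Summarize the main idea of the 0.999... = 1 argument in one sentence."
    )
)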