| Topic | Replies | Views | Activity |
| --- | ---: | ---: | --- |
| About the LLMs category | 1 | 154 | September 9, 2024 |
| How to disable SSL validation for those who are behind a company VPN | 0 | 13 | November 5, 2025 |
| Anyone managed to use mcp server with ollama model provider? | 3 | 320 | July 25, 2025 |
| Use local enterprise LLMs | 5 | 104 | July 14, 2025 |
| "Fallback" LLM Configuration | 5 | 225 | June 10, 2025 |
| Help ::: How to use a custom (local) LLM with vLLM | 2 | 749 | June 10, 2025 |
| Gemini 2.5 Flash Preview and Gemma3:1b/27b, Big Difference in Output for same task definition | 2 | 129 | May 29, 2025 |
| Bedrock: Claude 3.7 sonnet | 3 | 157 | May 16, 2025 |
| Handling LLM Errors in Hierarchical CrewAI Process with Callbacks | 9 | 445 | April 14, 2025 |
| Portkey Integration llm Fails -- CrewAI keeps asking for OpenAI Key | 1 | 124 | April 20, 2025 |
| Issue with "gemini-2.0-flash-exp" model returning an error | 4 | 254 | March 27, 2025 |
| LLM Response Error: ValueError: Invalid response from LLM call - None or empty | 5 | 936 | April 5, 2025 |
| How to use the qwen2.5-vl-3b-instruct model with CrewAI? | 3 | 624 | April 6, 2025 |
| Structured Output (response_format) - LiteLLM vs. CrewAI | 1 | 510 | April 9, 2025 |
| Problem with using locally deployed custom llm | 2 | 254 | March 23, 2025 |
| Error using DeepSeek | 1 | 271 | March 22, 2025 |
| Error: llm.py-llm:426 - ERROR: Failed to get supported params: argument of type 'NoneType' is not iterable | 2 | 129 | March 19, 2025 |
| Nineteenai doesn't work with the usual llm setup code | 1 | 24 | March 18, 2025 |
| Feature Request: Real-Time Multimodal Support for Agent | 4 | 118 | March 14, 2025 |
| Crewai Chat litellm error | 3 | 987 | March 8, 2025 |
| The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently | 1 | 142 | February 27, 2025 |
| Using Custom LLM | 1 | 253 | February 24, 2025 |
| Using ollama (llama3.2-vision) to extract text from image | 2 | 782 | February 11, 2025 |
| Gemini has stopped working again! | 2 | 364 | January 22, 2025 |
| Claude won't finish responses | 3 | 122 | January 29, 2025 |
| Why am I getting the "Invalid response from LLM call - None or empty" error with my custom tool if using Anthropic LLM but not with OpenAI LLM? | 14 | 1665 | December 23, 2024 |
| When native integrations with other LLMs? | 2 | 77 | January 9, 2025 |
| Tried to both perform Action and give a Final Answer at the same time, I must do one or the other | 8 | 871 | November 21, 2024 |
| litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call | 2 | 1455 | November 19, 2024 |
| BadRequestError: litellm.BadRequestError: LLM Provider NOT provided | 5 | 1946 | November 20, 2024 |