I tried that hack and unfortunately just got back an OpenAI 500 error instead of a LiteLLM 400 error. You can find more detail in my reply to this topic.