I'm working on a data analyst agent and want to use a custom LLM. I have a model hosted on a server, and I need to call it via its URL. I tried creating a custom class, but it didn't work.
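In case it helps frame the question: one common pattern is to expose the hosted model behind an OpenAI-compatible `/v1/chat/completions` endpoint and wrap it in a small client class. Below is a minimal sketch of such a wrapper using only the standard library. The `CustomLLM` class, the URL, and the model name are all placeholders I made up for illustration, not details from my setup.

```python
import json
import urllib.request

# Hypothetical endpoint for the hosted model -- replace with your server's URL.
API_URL = "http://localhost:8000/v1/chat/completions"

class CustomLLM:
    """Minimal client for an OpenAI-compatible chat-completions endpoint."""

    def __init__(self, base_url: str, model: str, api_key: str = "dummy"):
        self.base_url = base_url
        self.model = model
        self.api_key = api_key

    def _build_payload(self, prompt: str) -> dict:
        # Standard OpenAI-style chat payload: model name plus a message list.
        return {
            "model": self.model,
            "messages": [{"role": "user", "content": prompt}],
        }

    def call(self, prompt: str) -> str:
        # POST the JSON payload to the server and parse the response.
        req = urllib.request.Request(
            self.base_url,
            data=json.dumps(self._build_payload(prompt)).encode(),
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {self.api_key}",
            },
        )
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        # OpenAI-compatible servers return choices[0].message.content.
        return body["choices"][0]["message"]["content"]

llm = CustomLLM(API_URL, model="my-model")
```

Whether this plugs into the agent directly depends on the framework's expected LLM interface; some frameworks accept a base URL and model name directly instead of a custom class, which is worth checking before rolling your own.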