I am currently building an application using a hybrid approach: after the GenAI stage produces its output, I want to pass that text on to CrewAI. Within CrewAI I want to use my own fine-tuned model, trained on my business terms, to summarize the text. I downloaded t5-small from transformers, fine-tuned it, and stored it locally under c:\crewaiproject, but when I refer to it as the LLM with llm_model = 't5-finetuned-billsum', CrewAI says it is unable to find the model. The same setup works with the OllamaLLM option, so I am not sure why it does not work with my local model.
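For context, the hybrid flow I am aiming for looks roughly like this (a minimal sketch; the function name and the injected summarizer are hypothetical, not CrewAI APIs):

```python
from typing import Callable

def hybrid_summarize(genai_output: str,
                     summarizer: Callable[[str], str],
                     max_chars: int = 4000) -> str:
    """Stage 2 of the hybrid flow: hand the GenAI stage's text to the
    locally fine-tuned summarizer. The summarizer is injected as a
    callable so the same flow works with a transformers pipeline, an
    Ollama call, or a plain stub."""
    text = genai_output.strip()
    # t5-small has a short input window, so truncate defensively.
    if len(text) > max_chars:
        text = text[:max_chars]
    return summarizer(text)

# Example with a stub summarizer standing in for the fine-tuned model:
# hybrid_summarize("long GenAI output ...", lambda t: t[:40])
```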
MODEL_CHECKPOINT = "t5-small"
OUTPUT_DIR = "models/t5-finetuned-billsum"
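One thing I suspect (my assumption, not confirmed): a bare string like 't5-finetuned-billsum' gets treated as a hub/provider model ID rather than as the checkpoint folder under models/, so an absolute path may be needed. A stdlib-only sketch of the resolution I mean (function name hypothetical):

```python
from pathlib import Path

def resolve_model_ref(ref: str, project_root: str = ".") -> str:
    """Return the absolute checkpoint path if `ref` exists under the
    project root; otherwise hand the string back unchanged so it can
    be treated as a hub / provider model ID."""
    candidate = Path(project_root) / ref
    if candidate.is_dir():
        return str(candidate.resolve())
    return ref

# e.g. resolve_model_ref("models/t5-finetuned-billsum", r"c:\crewaiproject")
```

With transformers itself, passing the resolved absolute path to pipeline("summarization", model=...) does load a local checkpoint; my question is whether CrewAI's llm setting can accept such a path at all.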
CrewAI team, please confirm whether CrewAI can use local LLMs (placed in a local folder of the project), or whether, as I understand it, the only option is to use Ollama.
My folder structure is as below.
c:\crewai\TextSummarization.py
textsummarization/
├── .env
├── knowledge/
├── models/
│   └── t5-finetuned-billsum/
├── src/
│   └── textsummarization/
│       ├── __init__.py
│       ├── main.py
│       ├── crew.py
│       └── config/
│           ├── agents.yaml
│           └── tasks.yaml
├── outputs/
└── pyproject.toml