Hi,
I have created a custom knowledge source in my application. When I create the crew like this:
from crewai import Crew, Process

crew = Crew(
    agents=[json_analyst, summarizer_agent],
    tasks=[analysis_task, summary_task],
    verbose=True,
    knowledge_sources=[knowledge_source],
    embedder={
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text"
        }
    },
    process=Process.sequential
)
the script fails while creating the embeddings with the following error:
"Failed to upsert documents: Expected Embedings to be non-empty list or numpy array, got in upsert."
The LLM for my agents is configured like this:
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="ollama/phi3:3.8b",
    base_url="http://localhost:11434/v1"
)
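In case it helps narrow things down, this is how I would sanity-check the embedding model directly (assuming Ollama's default port, its /api/embeddings endpoint, and that nomic-embed-text has been pulled):

# Ask Ollama directly for an embedding. A non-empty "embedding" list here
# would suggest the model itself is fine and the problem is on the
# CrewAI / vector store side.
import requests

resp = requests.post(
    "http://localhost:11434/api/embeddings",
    json={"model": "nomic-embed-text", "prompt": "hello world"},
)
resp.raise_for_status()
print(len(resp.json().get("embedding", [])))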
Any pointers on this? How do I get embeddings working with Ollama models?