Custom knowledge source embedding failing for local Ollama model

Hi,
I've created a custom knowledge source in my application. When I create the crew like this:

from crewai import Crew, Process

crew = Crew(
    agents=[json_analyst, summarizer_agent],
    tasks=[analysis_task, summary_task],
    verbose=True,
    knowledge_sources=[knowledge_source],
    embedder={
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text"
        }
    },
    process=Process.sequential
)

the script fails while creating the embedding with the following error:

"Failed to upsert documents: Expected Embedings to be non-empty list or numpy array, got  in upsert."

My agent LLM is configured like this:

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="ollama/phi3:3.8b",
    base_url="http://localhost:11434/v1"
)

Any pointers on this? How do I get embeddings working with Ollama models?

Can you verify that the Ollama server is running on your system and accessible? If you open http://localhost:11434/ in your browser, you should see the "Ollama is running" banner.
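If you'd rather check from a script, here's a minimal sketch (plain requests against the default port; adjust if you've changed OLLAMA_HOST):

import requests

# Ollama's root endpoint returns the plain-text banner "Ollama is running"
resp = requests.get("http://localhost:11434/")
print(resp.status_code, resp.text)  # expect: 200 Ollama is running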

Also check that you're using the current version of CrewAI (v0.95.0).
As a side note, try using the LLM class instead of ChatOpenAI, like so:

from crewai import LLM

llm = LLM(
    model="ollama/llama3:70b",
    base_url="http://localhost:11434"
)
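It's also worth ruling out that nomic-embed-text simply isn't pulled; an unavailable embedding model is a common way to end up with an empty embeddings list. Make sure you've run ollama pull nomic-embed-text, and as a sanity check you can hit the embeddings endpoint directly (a sketch using Ollama's /api/embeddings endpoint):

import requests

# Ask Ollama for an embedding directly; a healthy setup returns a JSON
# body with a non-empty "embedding" list (768 floats for nomic-embed-text).
resp = requests.post(
    "http://localhost:11434/api/embeddings",
    json={"model": "nomic-embed-text", "prompt": "hello world"},
)
resp.raise_for_status()
embedding = resp.json().get("embedding", [])
print(f"embedding length: {len(embedding)}")  # should be > 0

If that comes back empty or errors out, the problem is on the Ollama side rather than in CrewAI.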

Yes, I've confirmed my Ollama server is running, and I'm on 0.95.0. Do I also need to set embedder_config on the knowledge source itself, in addition to passing the embedder to the crew?

The issue seems to be resolved when downgrading to the previous version (0.86.0) instead.