🤔 Flows - CrewDoclingSource Markdown Knowledge Source Error, Ollama Local LLM Environment

Hi!
I would greatly appreciate any insights on this issue. :blush:
I’m trying to use a markdown (.md) file as a knowledge source in a crew that runs inside a flow.
The whole environment is local, and I’m working with Ollama.
The knowledge file lives in a folder named knowledge, which sits in the same directory as main.py.
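
For context, the flow wiring is essentially the default flow template; the sketch below shows how I understand the crew gets kicked off (names and import path taken from that template, inputs omitted), not my exact file:

# main.py (sketch, based on the default "crewai create flow" template)
from crewai.flow.flow import Flow, start

from crews.poem_crew.poemcrew import PoemCrew  # import path depends on the project layout


class PoemFlow(Flow):
    @start()
    def generate_poem(self):
        # The crew that carries the knowledge source is kicked off here
        return PoemCrew().crew().kickoff()


def kickoff():
    PoemFlow().kickoff()


if __name__ == "__main__":
    kickoff()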

:fire: Knowledge integration works seamlessly in a standalone single-file crew, where the same .md file is picked up without issue.

Here is the Error:

[ERROR]: Failed to upsert documents: APIStatusError.__init__() missing 2 required keyword-only arguments: 'response' and 'body'
[WARNING]: Failed to init knowledge: APIStatusError.__init__() missing 2 required keyword-only arguments: 'response' and 'body'

Versions:
Crewai: 0.100.1
crewai-tools: 0.33.0
Python 3.12.8
Using Ollama only; local LLMs I have tried include deepseek-r1:7b, llama3.2:3b, and others, at different temperatures.

Here is my code:

# /crews/poem_crew/poemcrew.py
from crewai.knowledge.source.crew_docling_source import CrewDoclingSource
knowledge_source = CrewDoclingSource(
    file_paths=["knowledge.md"]
)
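
Side note: as far as I understand, relative file_paths are resolved against the knowledge folder next to main.py, so a quick check that the file is where CrewAI will look (run from that same directory) doesn’t hurt:

# sanity check: confirm knowledge/knowledge.md exists relative to the working directory
from pathlib import Path
print(Path("knowledge/knowledge.md").resolve(), Path("knowledge/knowledge.md").exists())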

And

# /crews/poem_crew/poemcrew.py
from crewai import LLM, Agent, Crew, Process, Task
from crewai.project import CrewBase, agent, crew, task


@CrewBase
class PoemCrew:
    """Poem Crew"""

    agents_config = "config/agents.yaml"
    tasks_config = "config/tasks.yaml"
    llm = LLM(model="ollama/deepseek-r1:7b", temperature=0.7)

    @agent
    def poem_writer(self) -> Agent:
        return Agent(
            config=self.agents_config["poem_writer"],
            # memory=True,
            llm=self.llm,
            # LLM=LLM(model="ollama/llama3-70b-8192", temperature=0.3),
        )

    @task
    def write_poem(self) -> Task:
        return Task(
            config=self.tasks_config["write_poem"],
        )

    @crew
    def crew(self) -> Crew:

        return Crew(
            agents=self.agents,
            tasks=self.tasks,
            process=Process.sequential,
            # memory=True,
            verbose=True,
            knowledge_sources=[knowledge_source],
            embedder={
                "provider": "ollama",
                "config": {
                    "model": "mxbai-embed-large"
                }
            }
        )
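
To rule out the embedding model itself, Ollama’s embeddings endpoint can be hit directly; something along these lines should return a 1024-dimensional vector for mxbai-embed-large:

# quick check that the Ollama embedding model responds outside of CrewAI
import requests

resp = requests.post(
    "http://localhost:11434/api/embeddings",
    json={"model": "mxbai-embed-large", "prompt": "hello world"},
)
resp.raise_for_status()
print(len(resp.json()["embedding"]))  # expect 1024 for mxbai-embed-large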

I have also tried the following:

# /crews/poem_crew/poemcrew.py
from crewai.knowledge.source.crew_docling_source import CrewDoclingSource
knowledge_source = CrewDoclingSource(
    file_paths=["knowledge.md"],
    chunk_size=4000,     # Characters per chunk (default)
    chunk_overlap=200,  # Overlap between chunks (default)
)

and

# /crews/poem_crew/poemcrew.py
from crewai.knowledge.source.crew_docling_source import CrewDoclingSource
from crewai.knowledge.storage.knowledge_storage import KnowledgeStorage
knowledge_source = CrewDoclingSource(
    file_paths=["knowledge.md"],
    storage=KnowledgeStorage(
        embedder_config={
            "provider": "ollama",
            "model": "nomic-embed-text",
            "base_url": "http://localhost:11434"
        }
    )
)
  • The above gives a slightly different error:
    Failed to upsert documents: APIStatusError.__init__() missing 2 required keyword-only arguments: 'response' and 'body'
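
One thing I’m not sure about is whether embedder_config expects the flat shape above or the same nested provider/config shape used for the Crew-level embedder. The nested variant would look roughly like this (untested guess, key names assumed):

# /crews/poem_crew/poemcrew.py  (guess: nested embedder_config, mirroring the Crew embedder)
from crewai.knowledge.source.crew_docling_source import CrewDoclingSource
from crewai.knowledge.storage.knowledge_storage import KnowledgeStorage

knowledge_source = CrewDoclingSource(
    file_paths=["knowledge.md"],
    storage=KnowledgeStorage(
        embedder_config={
            "provider": "ollama",
            "config": {
                "model": "nomic-embed-text",
                "url": "http://localhost:11434/api/embeddings",  # assumed key name
            },
        }
    ),
)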

I also tried updating:

pip install --upgrade crewai crewai-tools transformers tokenizers docling docling-core
from crewai.knowledge.source.crew_docling_source import CrewDoclingSource
from crewai.knowledge.source.text_file_knowledge_source import TextFileKnowledgeSource
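
To confirm what the upgrade actually installed, a quick version dump (standard library only) is enough:

# print installed versions of the packages involved
from importlib.metadata import PackageNotFoundError, version

for pkg in ("crewai", "crewai-tools", "docling", "docling-core", "chromadb"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")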

:point_right:t2: Does anyone have an example of a flow’s crew.py using Knowledge, CrewDoclingSource, and a markdown .md file with just Ollama?

Is this a flow problem?

Possible solution: none so far; I think the underlying library just needs to be fixed.