Problems using Ollama embeddings

I have created a RAG tool over a text file with the following configuration:

from crewai_tools import (
    TXTSearchTool,      # This tool has problems
    ScrapeWebsiteTool,
    SeleniumScrapingTool
)

# ...

txt_tool = TXTSearchTool(
    txt='/home/pepo/working/MyFile.txt',
    config=dict(
        llm=dict(
            provider="ollama",
            config=dict(
                model="llama3.2",
                temperature=0.5,
                stream=False,
            ),
        ),
        embedder=dict(
            provider="ollama",
            config=dict(
                model="nomic-embed-text",
                # task_type="retrieval_document",
                # title="Embeddings",
            ),
        ),
    )
)

# ...

    @agent
    def my_agent(self) -> Agent:
        return Agent(
            config=self.agents_config['my_agent'],
            verbose=True,
            tools=[txt_tool],
        )

I run the application:

crewai run

But I get an error telling me that a dependency is missing:

Running the Crew
warning: `VIRTUAL_ENV=/home/pepo/working/crewaidev/venv` does not match the project environment path `.venv` and will be ignored
Traceback (most recent call last):
  File "/home/pepo/working/crewaidev/develaid/.venv/bin/run_crew", line 5, in <module>
    from develaid.main import run
  File "/home/pepo/working/crewaidev/develaid/src/develaid/main.py", line 20, in <module>
    from develaid.crew import Develaid
  File "/home/pepo/working/crewaidev/develaid/src/develaid/crew.py", line 24, in <module>
    txt_tool = TXTSearchTool(
               ^^^^^^^^^^^^^^
  File "/home/pepo/working/crewaidev/develaid/.venv/lib/python3.12/site-packages/crewai_tools/tools/txt_search_tool/txt_search_tool.py", line 32, in __init__
    super().__init__(**kwargs)
  File "/home/pepo/working/crewaidev/develaid/.venv/lib/python3.12/site-packages/pydantic/main.py", line 214, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pepo/working/crewaidev/develaid/.venv/lib/python3.12/site-packages/crewai_tools/tools/rag/rag_tool.py", line 46, in _set_default_adapter
    app = App.from_config(config=self.config) if self.config else App()
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pepo/working/crewaidev/develaid/.venv/lib/python3.12/site-packages/embedchain/app.py", line 393, in from_config
    llm = LlmFactory.create(llm_provider, llm_config_data.get("config", {}))
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pepo/working/crewaidev/develaid/.venv/lib/python3.12/site-packages/embedchain/factory.py", line 44, in create
    llm_class = load_class(class_type)
                ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pepo/working/crewaidev/develaid/.venv/lib/python3.12/site-packages/embedchain/factory.py", line 6, in load_class
    module = importlib.import_module(module_path)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pepo/working/crewaidev/develaid/.venv/lib/python3.12/site-packages/embedchain/llm/ollama.py", line 13, in <module>
    raise ImportError("Ollama requires extra dependencies. Install with `pip install ollama`") from None
ImportError: Ollama requires extra dependencies. Install with `pip install ollama`
An error occurred while running the crew: Command '['uv', 'run', 'run_crew']' returned non-zero exit status 1.
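The key line is the final `ImportError`: embedchain's Ollama LLM module raises it at import time when the `ollama` Python client cannot be imported from the *active* environment. A simplified sketch of that kind of guard (not the actual embedchain source; the function name is illustrative):

```python
import importlib.util

def require_package(name: str) -> None:
    """Raise ImportError if `name` cannot be imported from the active environment."""
    if importlib.util.find_spec(name) is None:
        raise ImportError(
            f"{name} requires extra dependencies. Install with `pip install {name}`"
        )

# The check depends on which environment is active -- which is why the
# VIRTUAL_ENV mismatch warning above matters: `crewai run` executes inside
# the project's `.venv`, not the venv that was activated in the shell.
require_package("json")  # stdlib module: passes silently
```

So installing `ollama` in a different venv than the one `crewai run` uses will still fail this check.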

How can I fix this issue? Or how do I install the library inside the project environment, given that Ollama is already installed on my system?

I’m using a venv environment. I activated it and ran the following command:

crewai update

Then I installed the package:

uv add ollama

And voilà, the embeddings now work.