Embedder error using vertexai embeddings

Hi, I’m attempting to add memory to my agents using Vertex AI embedding models, but I keep running into a validation error for the embedder parameter. I’ve also tried passing a dictionary config, and going through Google AI instead, but then I run into a ValueError. Any tips to solve this?
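For reference, this is roughly the shape of the dictionary I passed when going through Google AI (the key and model name below are placeholders, not my real values, so treat it as a sketch of the structure rather than exactly what I ran):

    # Rough shape of the Google AI embedder dict I tried (placeholder values).
    google_embedder = {
        "provider": "google",
        "config": {
            "api_key": "<my Google AI Studio key>",   # placeholder
            "model": "models/text-embedding-004",     # placeholder model name
        },
    }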



I have the same error. I don’t know why, but it may be a Pydantic version issue, since Pydantic v2 expects a dictionary for the input data. I also tried the dictionary format, but the error still occurs.

    embedding_config = {
        "provider": "vertexai",
        "config": {
            "api_key": "my api key",
            "model_name": "text-multilingual-embedding-002",
            "region": "us-central1",
            "project_id": "myprojectid",
        },
    }

    # for the crew
    return Crew(
        agents=self.agents,
        tasks=self.tasks,
        process=Process.sequential,
        verbose=True,
        knowledge_sources=[self.date_source],
        memory=True,
        embedder=embedding_config,
    )

[2025-01-31 08:49:25][ERROR]: Failed to upsert documents: Expected Embedings to be non-empty list or numpy array, got in upsert.

[2025-01-31 08:49:25][WARNING]: Failed to init knowledge: Expected Embedings to be non-empty list or numpy array, got in upsert.
ERROR:root:Error during short_term search: Expected Embedings to be non-empty list or numpy array, got in query.
ERROR:root:Error during entities search: Expected Embedings to be non-empty list or numpy array, got in query

I’m not sure, but the documentation for Vertex AI may be outdated.
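One way to narrow it down (a sketch, assuming the google-cloud-aiplatform SDK and the same project/region values as in the config above): call the Vertex AI embedding model directly and check that it returns vectors at all. The "non-empty list or numpy array" errors come from the vector store receiving nothing at upsert/query time, so if the call below works, the problem is more likely in how the provider config is mapped than in the model or your credentials.

    # Standalone check that Vertex AI embeddings work outside CrewAI.
    # Assumes `pip install google-cloud-aiplatform` and that you are authenticated,
    # e.g. via `gcloud auth application-default login` or GOOGLE_APPLICATION_CREDENTIALS.
    import vertexai
    from vertexai.language_models import TextEmbeddingModel

    vertexai.init(project="myprojectid", location="us-central1")

    model = TextEmbeddingModel.from_pretrained("text-multilingual-embedding-002")
    embeddings = model.get_embeddings(["hello world"])
    print(len(embeddings[0].values))  # should print the embedding dimension, e.g. 768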