Can't use Gemini

I'm trying to use a Gemini API key like this:

from crewai import LLM

llm_gemini = LLM(
    model="gemini-1.5-pro",
    temperature=0.7,
    api_key=gemini_api_key
)

but I always get this error while executing the crew:

ERROR:root:LiteLLM call failed: litellm.APIConnectionError: Your default credentials were not found. To set up Application Default Credentials, see Set up Application Default Credentials  |  Authentication  |  Google Cloud for more information.
Traceback (most recent call last):
  File "C:\Users\tomit\crewai_env\Lib\site-packages\litellm\main.py", line 2372, in completion
    model_response = vertex_chat_completion.completion( # type: ignore
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\tomit\crewai_env\Lib\site-packages\litellm\llms\vertex_ai\gemini\vertex_and_google_ai_studio_gemini.py", line 1204, in completion
    _auth_header, vertex_project = self._ensure_access_token(
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\tomit\crewai_env\Lib\site-packages\litellm\llms\vertex_ai\vertex_llm_base.py", line 130, in _ensure_access_token
    self._credentials, cred_project_id = self.load_auth(
                                         ^^^^^^^^^^^^^^^
  File "C:\Users\tomit\crewai_env\Lib\site-packages\litellm\llms\vertex_ai\vertex_llm_base.py", line 84, in load_auth
    creds, creds_project_id = google_auth.default(
                              ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\tomit\crewai_env\Lib\site-packages\google\auth\_default.py", line 719, in default
    raise exceptions.DefaultCredentialsError(_CLOUD_SDK_MISSING_CREDENTIALS)
google.auth.exceptions.DefaultCredentialsError: Your default credentials were not found. To set up Application Default Credentials, see Set up Application Default Credentials  |  Authentication  |  Google Cloud for more information.

I tried to bypass LiteLLM with LangChain, but I didn't succeed.
Any help or suggestion would be appreciated.

Following the documentation, you should:

  1. Get credentials from your Google Cloud Console and save them to a JSON file, for example path/to/vertex_ai_service_account.json.
  2. With the credentials saved, run the following code to verify that everything works correctly:
from crewai import LLM
import os
import json

os.environ["GEMINI_API_KEY"] = "your-gemini-api-key"
credentials_path = "path/to/vertex_ai_service_account.json"

with open(credentials_path, "r") as file:
    vertex_credentials = json.load(file)

vertex_credentials_json = json.dumps(vertex_credentials)

gemini_llm = LLM(
    model="gemini/gemini-2.0-flash",
    temperature=0.7,
    vertex_credentials=vertex_credentials_json
)

gemini_response = gemini_llm.call(
    "Hey, who are you?"
)

print(f"\n🤖 Gemini Response:\n\n{gemini_response}\n")
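Also note the gemini/ prefix in the model string. In your traceback, the bare model name was routed through LiteLLM's Vertex AI path, which is why it went looking for Application Default Credentials. If you only have a Gemini API key from Google AI Studio (no service account), passing the key with a prefixed model name should be enough. A minimal sketch, assuming the key is exported as GEMINI_API_KEY (the placeholder values are just examples):

from crewai import LLM
import os

# Assumption: you only have a Google AI Studio key, no Vertex service account.
# The "gemini/" prefix tells LiteLLM to call the Google AI Studio API with this
# key instead of routing to Vertex AI (which requires ADC credentials).
os.environ["GEMINI_API_KEY"] = "your-gemini-api-key"

gemini_llm = LLM(
    model="gemini/gemini-1.5-pro-latest",
    temperature=0.7,
)

print(gemini_llm.call("Hey, who are you?"))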

Thanks for the response, but the point is that I don't have a Google Cloud account, only the API key, so that approach wouldn't work.
I think the way I fixed the issue was setting the model like this: model="gemini/gemini-1.5-pro-latest".
It's what you suggested too, so it seems it was just a syntax error.
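For anyone hitting the same error, this is my original snippet with only the model string changed (gemini_api_key is the Google AI Studio key):

llm_gemini = LLM(
    model="gemini/gemini-1.5-pro-latest",  # provider prefix added
    temperature=0.7,
    api_key=gemini_api_key
)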
Thanks again.