I have a Docker image built from the following Dockerfile:
FROM python:3.12-slim
WORKDIR /usr/src/app
COPY ./requirements.txt ./
RUN pip install --no-cache-dir --upgrade -r ./requirements.txt
COPY ./app ./
ENV ENVIRONMENT=prod
ENV PORT=3000
EXPOSE 3000
CMD ["fastapi", "run", "main.py", "--port", "3000"]
And my requirements.txt:
fastapi==0.111.1
langchain==0.3.7
langchain-community==0.3.7
langchain-core==0.3.19
crewai==0.83.0
crewai-tools==0.14.0
And my Python code, which creates a CrewAI Agent and passes the SeleniumScrapingTool to it as a tool:
from crewai import Agent, Task, Crew, Process
from crewai_tools import SeleniumScrapingTool
chrome_options = {
    'args': [
        '--no-sandbox',
        '--headless',
        '--disable-dev-shm-usage',
        '--disable-gpu',
        '--disable-setuid-sandbox',
        '--disable-software-rasterizer',
        '--disable-dbus',
        '--disable-notifications',
        '--disable-extensions',
        '--disable-infobars'
    ],
    'service_args': ['--verbose'],  # For debugging
    'experimental_options': {
        'excludeSwitches': ['enable-automation'],
        'prefs': {
            'profile.default_content_setting_values': {
                'cookies': 1,
                'images': 2,  # Don't load images for better performance
                'plugins': 2,
                'popups': 2,
                'geolocation': 2,
                'notifications': 2
            }
        }
    }
}
tool = SeleniumScrapingTool(website_url='URL', chrome_options=chrome_options)
Agent(
    role=ROLE,
    goal=GOAL,
    backstory=BACKSTORY,
    tools=[tool]  # `tools` expects a list of tools
)
But when I start my Docker container, the SeleniumScrapingTool fails with the following error:
2024-11-23 17:23:10 I encountered an error while trying to use the tool. This was the error: Message: Service /root/.cache/selenium/chromedriver/linux64/131.0.6778.85/chromedriver unexpectedly exited. Status code was: 127
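For reference, exit status 127 is the code a POSIX shell (or the dynamic loader) returns when a binary cannot be found or started, which in a slim image usually points to missing shared libraries or a missing browser rather than a bug in the Python code itself. A quick pure-Python illustration of that exit code (no Selenium involved; the path is deliberately nonexistent):

```python
import subprocess

# Exit status 127: the shell could not find or start the requested binary.
# chromedriver reports the same code when the loader refuses to start it,
# e.g. because required shared libraries are absent in a slim image.
result = subprocess.run("/no/such/binary", shell=True, capture_output=True)
print(result.returncode)  # -> 127 on POSIX shells
```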
When I run it in my local environment (Windows 11), however, it works perfectly; the error only occurs inside the Docker container. Does anyone have any thoughts?