I am using Python and Selenium on AWS Lambda for crawling. I updated Python to 3.11 and Selenium to 4.18.0, and then my crawlers stopped working. This is the code for Selenium:
import os
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.chrome.service import Service

def get_headless_driver():
    options = Options()
    service = Service(executable_path=r'/opt/chromedriver')
    options.binary_location = '/opt/headless-chromium'
    options.add_argument('--headless')
    options.add_argument('--no-sandbox')
    options.add_argument('--single-process')
    options.add_argument('--disable-dev-shm-usage')
    options.add_argument('--window-size=1920x1080')
    options.add_argument('--start-maximized')
    return webdriver.Chrome(service=service, options=options)

def get_selenium_driver():
    return get_local_driver() if os.environ.get('STAGE') == 'local' else get_headless_driver()
This is the code for installing chrome driver:
I am getting this error:
Message: Service /opt/chromedriver unexpectedly exited. Status code was: 127
How should I fix this error? Should I also update chromedriver and headless-chromium? Which versions should I choose?
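For context on what status code 127 usually means here: when a service binary exits with 127, the binary itself could not be executed, typically because it is missing, not executable, built for a different architecture, or lacking shared libraries in the Lambda runtime. A minimal diagnostic sketch, assuming only the Python standard library (`probe_binary` is a hypothetical helper name, and `/opt/chromedriver` is the path from the question):

```python
import subprocess

def probe_binary(path):
    """Try to run a binary with --version and report (exit_code, message).

    An exit code of 127 conventionally means the command could not be
    found or loaded, which matches a chromedriver binary that is absent
    or incompatible with the Lambda environment.
    """
    try:
        result = subprocess.run(
            [path, "--version"],
            capture_output=True, text=True, timeout=10,
        )
        return result.returncode, (result.stdout or result.stderr).strip()
    except FileNotFoundError:
        # The file does not exist at that path at all.
        return 127, f"{path} not found"
    except OSError as exc:
        # e.g. "Exec format error" when the binary targets another
        # architecture (x86_64 binary on an arm64 Lambda, or vice versa).
        return 126, str(exc)

# Usage: in the Lambda handler, probe the layer-provided binary before
# starting Selenium, e.g. probe_binary('/opt/chromedriver').
```

Running this inside the Lambda (rather than locally) shows whether the chromedriver binary from the layer can execute at all, which separates a packaging/architecture problem from a Selenium configuration problem.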