Description
Hi guys,
I recently ran into a case where several retries of requests to fetch.crawlera.com were needed for the spider to work correctly. As I understood from the discussion here https://zytegroup.slack.com/archives/C014HA686ES/p1612975265044000, uncork does 3 retries, but not for all failures.
I've implemented this as a temporary fix by retrying the requests right in the spider. We could of course do this in a custom retry middleware, but then we would need to add it to every spider/project.
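For context, here is a minimal sketch of the spider-level workaround I mean. It assumes the failure reason can be read from the JSON payload of the response body and uses a plain `retry_times` meta counter; both of those are just illustrative assumptions, and the exact place where `crawlera_status`/`crawlera_error` surface may differ depending on the middleware configuration:

```python
import json

import scrapy

RETRYABLE_REASONS = {"fail", "ban", "timeout"}


class ExampleSpider(scrapy.Spider):
    name = "example"
    max_retries = 3  # illustrative limit

    def parse(self, response):
        # Assumption: the Crawlera Fetch failure info is available as JSON
        # in the response body
        try:
            payload = json.loads(response.text)
        except ValueError:
            payload = {}
        reason = payload.get("crawlera_status") or payload.get("crawlera_error")
        if reason in RETRYABLE_REASONS:
            retries = response.meta.get("retry_times", 0)
            if retries < self.max_retries:
                # Re-issue the same request with an incremented retry counter
                yield response.request.replace(
                    meta={**response.meta, "retry_times": retries + 1},
                    dont_filter=True,
                )
            return
        # ... normal parsing of a successful response goes here ...
```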
To make things simpler: would it be possible to add this directly to the CrawleraFetchMiddleware, with meta parameters for the retry reasons/retry times alongside the existing "skip" parameter?
The reasons for failed responses that I've faced:
- "crawlera_status": "fail"
- "crawlera_status": "ban"
- "crawlera_error": "timeout"
Thanks.