
Commit

NUTCH-3011 HttpRobotRulesParser: handle HTTP 429 Too Many Requests same as server errors (HTTP 5xx)
sebastian-nagel committed Oct 21, 2023
1 parent ecdd19d commit b081c75
Showing 2 changed files with 8 additions and 6 deletions.
11 changes: 6 additions & 5 deletions conf/nutch-default.xml
@@ -141,25 +141,26 @@
<name>http.robots.503.defer.visits</name>
<value>true</value>
<description>Temporarily suspend fetching from a host if the
-robots.txt response is HTTP 503 or any other 5xx server error. See
-also http.robots.503.defer.visits.delay and
+robots.txt response is HTTP 503 or any other 5xx server error
+and HTTP 429 Too Many Requests. See also
+http.robots.503.defer.visits.delay and
http.robots.503.defer.visits.retries</description>
</property>

<property>
<name>http.robots.503.defer.visits.delay</name>
<value>300000</value>
<description>Time in milliseconds to suspend crawling a host if the
-robots.txt response is HTTP 5xx - see
+robots.txt response is HTTP 5xx or 429 Too Many Requests - see
http.robots.503.defer.visits.</description>
</property>

<property>
<name>http.robots.503.defer.visits.retries</name>
<value>3</value>
<description>Number of retries crawling a host if the robots.txt
-response is HTTP 5xx - see http.robots.503.defer.visits. After n
-retries the host queue is dropped for this segment/cycle.
+response is HTTP 5xx or 429 - see http.robots.503.defer.visits.
+After n retries the host queue is dropped for this segment/cycle.
</description>
</property>
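For reference, these settings can be overridden per deployment in conf/nutch-site.xml. Below is a minimal sketch with illustrative values only (a 10-minute back-off and 5 retries instead of the defaults of 5 minutes and 3 retries shown above); they are examples, not recommendations:

<!-- conf/nutch-site.xml: example overrides, values are illustrative only -->
<property>
  <name>http.robots.503.defer.visits.delay</name>
  <value>600000</value> <!-- back off for 10 minutes (value is in milliseconds) -->
</property>
<property>
  <name>http.robots.503.defer.visits.retries</name>
  <value>5</value> <!-- drop the host queue for this segment/cycle after 5 retries -->
</property>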

3 changes: 2 additions & 1 deletion src/java/org/apache/nutch/protocol/http/api/HttpRobotRulesParser.java
@@ -229,7 +229,8 @@ public BaseRobotRules getRobotRulesSet(Protocol http, URL url,
else if ((code == 403) && (!allowForbidden))
robotRules = FORBID_ALL_RULES; // use forbid all

-else if (code >= 500) {
+else if (code >= 500 || code == 429) {
+// 5xx server errors or 429 Too Many Requests
cacheRule = false; // try again later to fetch robots.txt
if (deferVisits503) {
// signal fetcher to suspend crawling for this host
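To make the changed condition easier to read outside the diff, here is a small self-contained Java sketch of the status-code handling. It is an illustration only: the class, the Decision enum, and the decide method are made-up names rather than the actual HttpRobotRulesParser API, and the 200/404 branches are simplified.

// RobotsStatusSketch.java -- illustrative only, not the Nutch source.
public class RobotsStatusSketch {

  // Possible outcomes of a robots.txt fetch, simplified.
  enum Decision { PARSE_CONTENT, FORBID_ALL, DEFER_VISITS, EMPTY_RULES_NO_CACHE, ALLOW_ALL }

  static Decision decide(int code, boolean allowForbidden, boolean deferVisits503) {
    if (code == 200) {
      return Decision.PARSE_CONTENT;          // parse the fetched robots.txt
    } else if (code == 403 && !allowForbidden) {
      return Decision.FORBID_ALL;             // 403: treat everything as disallowed
    } else if (code >= 500 || code == 429) {
      // 5xx server errors and 429 Too Many Requests (the change in this commit):
      // do not cache the result, retry later, and optionally signal the fetcher
      // to suspend crawling this host (http.robots.503.defer.visits).
      return deferVisits503 ? Decision.DEFER_VISITS : Decision.EMPTY_RULES_NO_CACHE;
    }
    return Decision.ALLOW_ALL;                // e.g. 404: no robots.txt, allow all
  }

  public static void main(String[] args) {
    System.out.println(decide(429, false, true));   // DEFER_VISITS
    System.out.println(decide(503, false, false));  // EMPTY_RULES_NO_CACHE
    System.out.println(decide(403, false, true));   // FORBID_ALL
  }
}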
