Also, while looking at the logs that bitmap provided, I see a lot of requests coming from:

      1. Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp
      2. bingbot/2.0; +http://www.bing.com/bingbot.htm

      Give them 1 request/sec while we sort our shit out.
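One way to apply that 1 request/sec limit at the web-server level, assuming nginx fronts the site (the zone name, size, and user-agent patterns below are illustrative assumptions, not the actual deployed config):

```nginx
# Hypothetical sketch: throttle Slurp and bingbot to 1 req/s each,
# leave all other user agents unlimited.
map $http_user_agent $crawler_limit_key {
    default      "";                    # empty key => no rate limit applied
    "~*Slurp"    $binary_remote_addr;   # Yahoo! Slurp
    "~*bingbot"  $binary_remote_addr;   # bingbot/2.0
}

limit_req_zone $crawler_limit_key zone=crawlers:10m rate=1r/s;

server {
    location / {
        # Allow a small burst so crawlers get 503s only when sustained
        # request rates exceed 1/s.
        limit_req zone=crawlers burst=5;
        # ... existing proxy/static configuration ...
    }
}
```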

          [MBH-461] Add rate limits for crawlers

          Ulrich Klauer added a comment -

          Uh, lower values, of course – it’s a delay, not a number of allowed requests per second. So probably easiest to just leave it out for other agents.


          Ulrich Klauer added a comment -

          Both crawlers apparently support the Crawl-delay directive in robots.txt, so it might make sense to add

          Crawl-delay: 1
          

          for the default agent (and higher values in specific sections for the IA and Googlebot).
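Put together, the suggested robots.txt could look like this (the specific user-agent tokens and the higher delay values are illustrative assumptions; Crawl-delay support also varies by crawler):

```text
# Default: ask compliant crawlers to wait 1 second between requests.
User-agent: *
Crawl-delay: 1

# Illustrative higher delays for specific crawlers.
User-agent: ia_archiver
Crawl-delay: 5

User-agent: Googlebot
Crawl-delay: 5
```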


            Zas
            Robert Kaye