Monday, October 25, 2010

How to Reduce the Number of Requests the Search Crawler Makes on Your Site

If search crawlers occasionally generate high traffic on your site, you can set a Crawl-delay parameter in your robots.txt file, which controls how often, in seconds, search crawlers may request pages from your site. To do this, add the following lines to your robots.txt file:
User-agent: *
Crawl-delay: 10
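If you want to confirm what a compliant crawler would actually read from those lines, Python's standard urllib.robotparser module can parse them; here is a minimal sketch using the snippet above:
from urllib import robotparser

# The two robots.txt lines from the example above.
rules = [
    "User-agent: *",
    "Crawl-delay: 10",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# crawl_delay() returns the Crawl-delay value (in seconds) that applies
# to the given user agent, or None if no delay is set.
print(rp.crawl_delay("*"))  # prints 10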
A section for an individual crawler overrides the settings in the * section rather than adding to them. So if you've specified Disallow settings for all crawlers, you must repeat those settings in any crawler-specific section you create in the robots.txt file. For example, your robots.txt file might already contain the following:
User-agent: *
Disallow: /private/
If you then add a section for a specific search crawler to set the crawl delay, you must copy the Disallow settings into that section as well. For example, using bingbot (Bing's crawler) as the crawler name, the complete file would look like this:
User-agent: *
Disallow: /private/

User-agent: bingbot
Crawl-delay: 10
Disallow: /private/
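As a quick check that the crawler-specific section really carries both directives, the same urllib.robotparser approach can be run against the combined file (bingbot is only the example crawler name, and example.com is a placeholder URL):
from urllib import robotparser

# The combined robots.txt example from above.
rules = """
User-agent: *
Disallow: /private/

User-agent: bingbot
Crawl-delay: 10
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The bingbot section replaces the * section for bingbot, which is why
# the Disallow line has to be repeated there.
print(rp.crawl_delay("bingbot"))  # prints 10
print(rp.can_fetch("bingbot", "http://www.example.com/private/page.html"))  # prints False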
