Wednesday, August 12, 2009

Bing Crawler MSNBot Crawl Delay Parameters

--------
Crawl delay is one of the options webmasters have at their disposal to control the search engine bots indexing their websites. To prevent web server load issues, website owners can reduce crawling frequency, a necessary step when indexing a large website has a palpable impact on available server resources.

The Bing crawler, MSNBot, which the search engine inherited from its precursor, Live Search, can be served crawl delay parameters via the robots.txt file, explained Rick DeJarnette of the Bing Webmaster Center. “Bing supports the directives of the Robots Exclusion Protocol (REP) as listed in a site’s robots.txt file, which is stored at the root folder of a website. The robots.txt file is the only valid place to set a crawl-delay directive for MSNBot,” DeJarnette added. “The robots.txt file can be configured to employ directives set for specific bots and/or a generic directive for all REP-compliant bots. Bing recommends that any crawl-delay directive be made in the generic directive section for all bots to minimize the chance of code mistakes that can affect how a site is indexed by a particular search engine.” Here is an example of how a webmaster can set the crawl delay parameter. Simply enter ...
--------
http://news.softpedia.com/news/Bing-Crawler-MSNBot-Crawl-Delay-Parameters-119023.shtml
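
The excerpt above cuts off before the sample itself, but a minimal robots.txt sketch along the lines DeJarnette describes might look like the following (the delay value of 10 is an illustrative placeholder, not a figure from the article):

    # robots.txt, stored at the root folder of the website
    # Generic section for all REP-compliant bots, as Bing recommends:
    User-agent: *
    Crawl-delay: 10

    # Alternatively, a section targeting MSNBot specifically:
    # User-agent: msnbot
    # Crawl-delay: 10

Higher Crawl-delay values tell the crawler to fetch pages less frequently, trading indexing speed for reduced server load.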

