When you search using Google, you're actually searching through an index of web pages. To gather the raw material for the index, Google's web-crawling robot, called Googlebot, sends a request to a web server for a web page and then downloads the page. Googlebot runs on many computers simultaneously, constantly requesting and receiving web pages at a rate of thousands of requests per second. In fact, Googlebot deliberately makes requests more slowly than it could, because if it operated at full throttle, it would overwhelm many web servers, and those servers would no longer be able to deliver pages quickly enough to their users.
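To make the request-and-download loop concrete, here is a minimal sketch of a polite, single-threaded crawler in Python. The URL list and delay value are illustrative assumptions, not Googlebot's actual behavior; a real crawler runs many such loops in parallel across many machines and discovers new URLs by following links in the pages it downloads.

```python
import time
import urllib.request

# Hypothetical seed list; a real crawler would grow this queue by
# extracting links from each page it fetches.
urls_to_fetch = [
    "https://example.com/",
    "https://example.com/about",
]

# Assumed throttle: pause between requests so one server isn't overwhelmed.
REQUEST_DELAY_SECONDS = 1.0

def fetch(url: str) -> str:
    """Request a single web page and return its raw HTML."""
    with urllib.request.urlopen(url, timeout=10) as response:
        return response.read().decode("utf-8", errors="replace")

for url in urls_to_fetch:
    html = fetch(url)
    print(f"Downloaded {len(html)} characters from {url}")
    # Deliberately slow down so the target server can keep serving its
    # normal visitors -- the same reason Googlebot stays below full throttle.
    time.sleep(REQUEST_DELAY_SECONDS)
```

The downloaded HTML is what would then be fed into the indexing pipeline described above; the sleep call is the simplest possible form of the rate limiting the paragraph describes.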