Search engines use automated software programs that crawl the web. These programs, called "crawlers" or "spiders," follow links from page to page and store the text and keywords from each page in a database. Google's spider software is named "Googlebot."
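The link-to-link process described above can be sketched as a tiny breadth-first crawler. This is a minimal illustration only, not Googlebot's actual implementation: the `site` dictionary stands in for the web, and a real crawler would fetch each URL over HTTP instead of reading from a dict.

```python
from html.parser import HTMLParser

class PageParser(HTMLParser):
    """Extracts the links and visible text from one HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

def crawl(site, start):
    """Follow links breadth-first from `start`; return a {url: text} index."""
    index, queue, seen = {}, [start], {start}
    while queue:
        url = queue.pop(0)
        parser = PageParser()
        parser.feed(site[url])          # a real spider would fetch this URL
        index[url] = " ".join(parser.text)
        for link in parser.links:       # queue newly discovered pages
            if link in site and link not in seen:
                seen.add(link)
                queue.append(link)
    return index

# Hypothetical two-page "web site" used only for this demonstration.
site = {
    "/": '<a href="/about">About</a> Home page',
    "/about": "About us",
}
print(crawl(site, "/"))  # → {'/': 'About Home page', '/about': 'About us'}
```

The resulting index maps each discovered URL to its stored text, which a search engine would then tokenize into keywords for lookup.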
Many webmasters have noticed that two different Google spiders now index their web pages, and at least one of them performs a complete site scan: