A spider (also called a crawler or a bot [robot]) is a program that searches the Internet and locates new public resources. These resources can include web pages, documents in PDF and other formats, pictures, videos, and other types of files.
Spiders report their findings to the search engine's database; this process is called indexing.
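The sketch below illustrates the basic idea, assuming a hypothetical starting URL and an in-memory dictionary standing in for a search engine's database. It is a minimal illustration, not how any real search engine's crawler is built; production crawlers also respect robots.txt, throttle their requests, and store what they find in distributed systems.

```python
# A minimal sketch of a web spider: fetch a page, record ("index") it,
# extract its links, and queue any links not yet seen.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=10):
    """Fetch pages breadth-first, indexing each one and queueing new links."""
    index = {}                     # url -> page text (stand-in for the real database)
    queue = [start_url]
    seen = {start_url}

    while queue and len(index) < max_pages:
        url = queue.pop(0)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="replace")
        except OSError:
            continue               # skip pages that fail to load

        index[url] = html          # report the find to the "database" (indexing)

        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

    return index


# Hypothetical starting point; any public site would do.
pages = crawl("https://example.com")
print(f"Indexed {len(pages)} pages")
```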
Each search engine has its own crawler program that scans the Internet for new material to list on the search engine. Each also has its own ranking algorithm, a formula that helps determine how ‘relevant’ a web page is, that is, how closely it matches what the person is searching for.
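To make the idea of ‘relevance’ concrete, here is a toy scoring function based on simple term frequency. This is only an illustration; real ranking algorithms are proprietary and weigh hundreds of signals such as links, freshness, and page quality.

```python
# A toy relevance score: how often do the query's words appear on the page,
# relative to the page's length?
import re


def relevance(query, page_text):
    """Score a page by how often the query's words appear in it."""
    words = re.findall(r"\w+", page_text.lower())
    if not words:
        return 0.0
    query_terms = query.lower().split()
    hits = sum(words.count(term) for term in query_terms)
    return hits / len(words)       # normalise so long pages aren't favoured


# Hypothetical indexed pages, used only to demonstrate the scoring.
pages = {
    "https://example.com/gardening": "tips on growing tomatoes and other vegetables",
    "https://example.com/cooking": "tomato soup recipes, tomato sauce, tomato salad",
}
query = "tomato recipes"
ranked = sorted(pages, key=lambda url: relevance(query, pages[url]), reverse=True)
print(ranked)  # the cooking page scores higher for this query
```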
Spider technology is necessary because the amount of information being added to the Internet on a daily basis is more than any human team can index.
Search engine results pages are generated in response to a query based on which pages have been crawled and how relevant they are deemed to the topic at hand.
With SEO, you are trying to get your page crawled and deemed relevant. There are a number of different strategies to do both.
If Google crawls your site frequently, the chances are much greater that your pages will be found.