What is a Spider?


A spider (also called a crawler or a bot [robot]) is a program that searches the Internet and locates new public resources. These resources can include web pages, documents in PDF and other formats, pictures, videos, and other types of files.

Spiders report what they find to a database of Internet content; recording a resource in that database is called indexing.
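To make the idea concrete, here is a minimal sketch of that fetch-index-follow-links loop, written in Python with the common requests and BeautifulSoup libraries. It is only an illustration of the basic cycle, not how any real search engine's spider is actually built.

```python
# Illustrative crawler loop: fetch a page, "index" it, queue its links.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def crawl(start_url, max_pages=10):
    to_visit = [start_url]   # pages the spider still needs to look at
    index = {}               # stand-in for the search engine's database
    while to_visit and len(index) < max_pages:
        url = to_visit.pop(0)
        if url in index:
            continue          # skip pages already indexed
        try:
            page = requests.get(url, timeout=5)
        except requests.RequestException:
            continue          # unreachable pages are simply skipped
        soup = BeautifulSoup(page.text, "html.parser")
        # "Indexing": store the page text under its URL
        index[url] = soup.get_text()
        # Queue every link found on the page for a later visit
        for link in soup.find_all("a", href=True):
            to_visit.append(urljoin(url, link["href"]))
    return index
```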

Each search engine has its own crawler program that crawls the Internet for new material to list on the search engine. Each engine also has its own algorithm, a formula that helps determine how 'relevant' (how close to what the searcher is looking for) a web page is.
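The real formulas are closely guarded and weigh hundreds of signals, but a toy illustration of 'relevance' might simply count how often the query's words appear on a page. This sketch is purely hypothetical; no search engine ranks pages this simply.

```python
# Toy relevance score: count occurrences of the query's words in the page text.
# Real ranking algorithms combine hundreds of signals; this only shows the idea.
def relevance(query, page_text):
    words = page_text.lower().split()
    return sum(words.count(term) for term in query.lower().split())

# Pages with higher scores would appear nearer the top of the results.
print(relevance("dog training", "Free dog training tips for training your dog"))  # prints 4
```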

Spider technology is necessary because the amount of information being added to the Internet on a daily basis is more than any human team can index.

Search engine results pages are delivered in response to a query based on which pages have been crawled and how relevant they are deemed to the topic at hand.

With SEO, you are trying to get your page crawled and deemed relevant. There are a number of different strategies to do both.

If your site is frequently crawled by Google, your pages are much more likely to be found.


Author: jm

Joan Mullally has been doing business online for more than 20 years and is a pioneer in the fields of online publishing, marketing, and ecommerce. She is the author of more than 200 guides and courses designed to help beginner and intermediate marketers make the most of the opportunities the Internet offers for running a successful business. A student and later teacher trainee of Frank McCourt’s, she has always appreciated the power of the word, and has used her knowledge for successful SEO and PPC campaigns, and powerful marketing copy. One computer science class at NYU was enough to spark her fascination with all things digital. In her spare time, she works with adult literacy, animal fostering and rescue, and teaching computer skills to women.