Search Engine Spider (Web Crawler)
A search engine spider (also known as a web crawler or web spider) is a program that browses the World Wide Web in a methodical, automated manner. Web crawlers are mainly used to create a copy of all the visited pages for later processing by a search engine, which indexes the downloaded pages to provide fast searches.
A web crawler is one type of bot, or software agent. In general, it starts with a list of URLs to visit. As it visits these URLs, it identifies all the hyperlinks on the page and adds them to the list of URLs to visit, recursively browsing the Web according to a set of policies. A minimal sketch of this loop appears below.
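To make the loop concrete, here is a minimal sketch in Python using only the standard library. The `crawl` function, the page limit, and the one-second politeness delay are illustrative assumptions, not part of the original description; a production spider would add robots.txt handling, deduplication by content, and persistent storage.

```python
# Minimal sketch of the crawl loop described above: start from seed URLs,
# fetch each page, extract its hyperlinks, and queue unseen ones to visit.
# The function names and limits here are illustrative, not a real spider.
import time
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href targets of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_urls, max_pages=10):
    frontier = list(seed_urls)   # URLs still to visit
    visited = set()              # URLs already fetched
    while frontier and len(visited) < max_pages:
        url = frontier.pop(0)
        if url in visited:
            continue
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue             # skip unreachable pages
        visited.add(url)
        # A real spider would hand the page text to the indexer here.
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)  # resolve relative links
            if absolute.startswith("http") and absolute not in visited:
                frontier.append(absolute)
        time.sleep(1)            # simple politeness policy
    return visited

if __name__ == "__main__":
    print(crawl(["https://example.com/"]))
```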
Retrieved from http://en.wikipedia.org/wiki/Web_crawler
Reprinted from Wikipedia, The Free-Content Encyclopedia under the GNU Free Documentation License.