A spider is a program used by search engines to index websites; it is also called a crawler or search engine robot. A spider visits websites, reads their pages, and creates entries for the search engine's index. In effect, spiders are data-gathering tools that visit sites to discover new or updated pages and links.
Spiders do not monitor sites for unethical activity. They are called spiders because they traverse many sites at once, spanning a large area of the web much as a spider moves across its own web. All search engines use spiders to build and update their indexes.
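To make the visit-read-index-follow cycle concrete, here is a minimal sketch of a crawler in Python using only the standard library. It is not how any real search engine works: the seed URL, the page budget, and the "index entry" (just the page size in bytes) are placeholders for illustration.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: fetch a page, record an entry for it,
    queue its links, and repeat until the page budget is spent."""
    index = {}                  # stand-in for a search engine's index
    queue = deque([seed_url])
    seen = {seed_url}

    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue            # skip pages that fail to load

        index[url] = len(html)  # placeholder "entry": page size in bytes

        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            # follow only http(s) links we have not already visited
            if urlparse(absolute).scheme in ("http", "https") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

    return index

if __name__ == "__main__":
    for url, size in crawl("https://example.com").items():
        print(f"{size:>8} bytes  {url}")
```

A production crawler would add politeness rules (respecting robots.txt, rate limits), parse page content into searchable terms rather than recording a byte count, and distribute the work across many machines, but the core loop of fetching, indexing, and following links is the same.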