Web crawlers, also known as spiders, are automated programs that search engines send out to discover and read content across websites, a process called 'crawling' the web. A crawler fetches a page, gathers listings, and follows the links it finds to reach new pages. By making copies of the pages it visits, it expands the search engine's index.
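To make this concrete, here is a minimal sketch in Python (standard library only) of what a crawler does at its core: fetch a page, store a copy, extract its links, and queue them to visit next. The seed URL, page limit, and in-memory "index" are illustrative simplifications; a real crawler adds politeness rules (robots.txt, rate limits), deduplication, and storage at a far larger scale.

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: fetch a page, keep a copy, queue its links."""
    queue = [seed_url]
    seen = set()
    index = {}  # url -> raw HTML; a stand-in for the search engine's index

    while queue and len(index) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            with urllib.request.urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to load

        index[url] = html  # "make a copy" of the page for indexing

        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            queue.append(urljoin(url, link))  # resolve relative links

    return index

if __name__ == "__main__":
    pages = crawl("https://example.com")
    print(f"Indexed {len(pages)} page(s)")
```

Run as-is, it crawls outward from the seed URL until it has copied a handful of pages, which mirrors the gather-and-follow loop described above.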