Glossary: Crawl

This is what a search engine does when it reads the pages of a website to build its index, usually with a small program called a spider or a bot. Spiders (or bots) are the software the search engine uses to crawl your site.

This “spider” “crawls” over your “web” site (are you seeing the theme here?) from one page link to the next and indexes (scans and catalogs) the pages and content on your site. The search engine’s algorithm then looks at those pages and ranks them for specific keywords. If your website is optimized for these “crawls”, it is tuned up to score better when the search engines rank your pages. The higher your pages rank, the more traffic your website will get.
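The link-following loop described above can be sketched in a few lines of Python. This is a toy illustration, not a real spider: the `SITE` dictionary is a made-up stand-in for pages that a real crawler would fetch over HTTP, and the “catalog” is just a dictionary of which links appear on each page.

```python
from html.parser import HTMLParser

# Hypothetical in-memory "web site": page name -> HTML content.
# A real spider would fetch these pages over HTTP instead.
SITE = {
    "index.html": '<a href="about.html">About</a> <a href="blog.html">Blog</a>',
    "about.html": '<a href="index.html">Home</a>',
    "blog.html": '<a href="index.html">Home</a> <a href="about.html">About</a>',
}

class LinkParser(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def crawl(start_page):
    """Follow links from page to page, cataloging each page once."""
    index = {}                 # page -> links found on it (the "catalog")
    to_visit = [start_page]
    while to_visit:
        page = to_visit.pop()
        if page in index or page not in SITE:
            continue           # skip pages already indexed or missing
        parser = LinkParser()
        parser.feed(SITE[page])
        index[page] = parser.links
        to_visit.extend(parser.links)  # queue newly discovered links
    return index

index = crawl("index.html")
print(sorted(index))  # every reachable page ends up in the index
```

Starting from one page, the loop discovers and catalogs every page it can reach by links, which is exactly why pages with no inbound links often never get indexed.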