WebSite Auditor is equipped with a powerful SEO spider that crawls your site just like search engine bots do.
Crawler settings can be tailored to your needs and preferences: for instance, to collect or exclude certain sections of a site, crawl a site on behalf of any search engine bot, find orphan pages that no other page on the site links to, etc. To configure the settings, simply tick the Enable expert options box when creating a project.
In Step 2 you’ll be able to specify crawler settings for the current project.
The Robots.txt Instructions section features the following settings:
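For context, these settings control whether the crawler obeys the site's robots.txt rules, and which bot's rules it follows. A hypothetical robots.txt might look like this (the paths are examples only):

```
# Hypothetical robots.txt on the audited site
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Stricter rules that apply only when crawling as Googlebot
User-agent: Googlebot
Disallow: /private/
```

Depending on which user agent you crawl as, different Disallow rules apply, which is why the same site can yield different page lists under different bot identities.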
In the Filtering section, you can specify the filtering conditions WebSite Auditor will apply when collecting pages and resources into your project.
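Conceptually, such filters decide for each discovered URL whether it should be kept or skipped. A minimal sketch, assuming simple substring-based include/exclude rules (the actual conditions WebSite Auditor supports may be richer):

```python
from urllib.parse import urlparse

def should_collect(url, include_patterns=(), exclude_patterns=()):
    """Return True if the URL passes the include/exclude filters.

    Illustrative only: real crawler filters may also match on
    file type, domain, or regular expressions.
    """
    path = urlparse(url).path
    # Any matching exclude pattern rejects the URL outright.
    if any(p in path for p in exclude_patterns):
        return False
    # If include patterns are given, at least one must match.
    if include_patterns and not any(p in path for p in include_patterns):
        return False
    return True

print(should_collect("https://example.com/blog/post-1", exclude_patterns=("/tag/",)))  # True
print(should_collect("https://example.com/tag/news", exclude_patterns=("/tag/",)))     # False
```

Exclude rules take precedence here; a URL caught by both an include and an exclude pattern is skipped.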
The Speed section allows you to limit the number of requests sent to the website, reducing the load on the server. This prevents slower sites (or sites with strict security restrictions) from blocking the crawler.
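Under the hood, a speed limit of this kind amounts to enforcing a minimum delay between consecutive requests. A minimal sketch of that idea (not WebSite Auditor's actual implementation):

```python
import time

class RateLimiter:
    """Caps outgoing requests at `max_per_second` by sleeping
    between calls to wait(). Illustrative sketch only."""

    def __init__(self, max_per_second):
        self.min_interval = 1.0 / max_per_second
        self.last_request = 0.0

    def wait(self):
        # Sleep just long enough to keep the minimum interval
        # between this request and the previous one.
        elapsed = time.monotonic() - self.last_request
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self.last_request = time.monotonic()

# Usage: call limiter.wait() before each fetch.
limiter = RateLimiter(max_per_second=5)
for url in ("https://example.com/a", "https://example.com/b"):
    limiter.wait()
    # fetch(url) would go here
```

With `max_per_second=5`, the crawler never issues more than one request per 0.2 seconds, which keeps the load on the target server predictable.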
In the URL Parameters section, you can specify whether the program should collect dynamic pages (URLs that contain query parameters).
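Handling URL parameters typically means deciding which query parameters are significant and which (session IDs, tracking tags) should be ignored so that dynamic URLs differing only in those parameters collapse to a single page. A sketch of that canonicalization, with hypothetical parameter names:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonicalize(url, ignore_params=()):
    """Strip the given query parameters from a URL.

    Illustrative sketch; `sessionid` below is a made-up example
    of a parameter a crawler might be told to ignore.
    """
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in ignore_params]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonicalize("https://example.com/item?id=42&sessionid=abc",
                   ignore_params=("sessionid",)))
# https://example.com/item?id=42
```

Without such canonicalization, a crawler can record the same page many times under different session or tracking parameters, inflating the page count.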
The Advanced Options section contains additional crawler settings, such as:
After configuring the crawler settings, hit Finish and the program will start crawling your site. You can access and change the settings at any time later under Preferences > Crawler Settings in each project.