Use the Site Auditor settings section to adjust how our crawler scans your website.
You can set the following parameters manually:
Crawl delay in seconds - the pause our crawler makes between requests to your website. You can set the delay in seconds. By default, the delay is 6-8 seconds.
Depth limit - sets the maximum scanning depth. Level 1 covers all links from the main page; level 2 also scans every page linked from the first level, and so on. By default, the depth is unlimited.
Custom Robots.txt - this parameter makes it easier to control our crawler. You can allow or restrict scanning of your website right in the Site Auditor settings section, without setting up separate rules in your robots.txt file. Simply uncomment the necessary line in the area on the right.
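For example, the Custom Robots.txt area might contain lines like these (the /private/ path is purely illustrative); removing the leading # activates a rule:

```
User-agent: *
# Uncomment the next line to keep the crawler out of a hypothetical /private/ section:
# Disallow: /private/
# Uncomment the next line to restrict scanning of the whole website:
# Disallow: /
```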
Robots Merge Mode - defines how our crawler treats your robots.txt file. In Override mode, our crawler ignores your robots.txt file.
In Merge mode, our crawler follows both the rules in your robots.txt file and the rules set up in the Custom Robots.txt area. If any rules conflict, the rules from the Custom Robots.txt area take priority.
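The conflict resolution in Merge mode can be sketched as follows. This is a simplified illustration, not the crawler's actual code: it assumes plain prefix matching and a single user-agent, and the /blog/ path is hypothetical.

```python
# Minimal sketch of Merge-mode conflict resolution: the rules from the
# Custom Robots.txt area are consulted before the site's own robots.txt,
# so they win whenever both match the same path.

def is_allowed(path, site_rules, custom_rules):
    """Return True if the crawler may scan `path`.

    Each rules list holds ("allow" | "disallow", prefix) tuples.
    Custom rules are checked first, giving them the higher priority.
    """
    for rules in (custom_rules, site_rules):
        for action, prefix in rules:
            if path.startswith(prefix):
                return action == "allow"
    return True  # no matching rule: scanning is allowed

site = [("disallow", "/blog/")]   # rule from the site's robots.txt
custom = [("allow", "/blog/")]    # conflicting rule from Custom Robots.txt

print(is_allowed("/blog/post-1", site, custom))  # True: the custom rule wins
print(is_allowed("/admin/", site, custom))       # True: no rule matches
```

Checking the custom rules first is the whole trick: the first matching rule decides the outcome, so a conflicting directive in the site's robots.txt is never reached.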
You can also manage the Spell Checking settings right here. To disable Spell Checking, simply remove the tick from the appropriate field; to enable it again, just tick the field once more. The Spelling Exceptions area displays the words you've set as exceptions (Site Auditor will no longer treat them as mistakes). You can also add new exception words in this area, one per line, or delete any of the existing ones. The changes will take effect on the next scan. Please note that the exceptions apply only to the current project, not to the whole account.
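For instance, the Spelling Exceptions area might look like this, with one word per line (these brand and product names are made up for illustration):

```
MyBrandName
WidgetPro
rebrand2024
```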
When you finish, don't forget to click the "Save" button at the bottom of the screen.