
Choose your crawl source in the settings

Posted: Mon Jan 06, 2025 4:43 am
by sharminakter
Start by setting the "audited pages limit" in the settings.
The more pages you crawl, the better the visibility you get into your site.

You also have the option to choose your crawl source.

Note that the "Website" option lets the bot crawl your site the way Google does, while the "Sitemaps on site" option crawls the sitemap URLs listed in your robots.txt file.
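
A robots.txt file typically points to the sitemap with a "Sitemap" directive, so a minimal sketch (using example.com as a placeholder domain) looks like this:

    User-agent: *
    Disallow:

    Sitemap: https://example.com/sitemap.xml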

You can also click "URLs from file" to upload the specific pages you want the bot to crawl, as in the example below.
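
The file is simply a list of full URLs, one per line. A hypothetical example (example.com is a placeholder):

    https://example.com/
    https://example.com/pricing
    https://example.com/blog/site-audit-guide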

Adjust crawler settings
Crawler settings allow you to choose the type of robot that will crawl your website.

[Screenshot: Site Audit crawler settings page]
You can choose between Googlebot and SemrushBot, and between the mobile and desktop versions of each.
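
For reference, that choice roughly corresponds to user-agent strings like the following in your server logs (version numbers are illustrative and change over time):

    Googlebot:  Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
    SemrushBot: Mozilla/5.0 (compatible; SemrushBot/7~bl; +http://www.semrush.com/bot.html)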

Then select your crawl-delay settings. With "minimum delay", the bot crawls at its normal speed; the "1 URL per 2 seconds" option slows the crawl down to prioritize the experience of your human visitors.

Finally, select "Respect robots.txt" if your robots.txt file specifies a crawl delay you want the bot to honor.
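
A crawl delay is declared in robots.txt with the "Crawl-delay" directive; the 10 seconds below is just an example value:

    User-agent: *
    Crawl-delay: 10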

Allow/disallow URLs
Customize your site audit with URL allow/disallow settings by entering URLs in the corresponding boxes.

[Screenshot: allow/disallow URL settings]
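
As a rough sketch (the exact mask format depends on the tool, and these paths are hypothetical), you might audit only the blog while skipping its tag archives:

    Allow:    /blog/
    Disallow: /blog/tag/
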
Remove URL parameters
Removing URL parameters helps bots avoid wasting crawl budget by crawling the same page twice.

"List of URL parameters to ignore when crawling" configuration page.
Bypass website restrictions
If your website is still under development, use this setting to perform an audit.
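
Development sites are usually closed off with a blanket robots.txt rule or a robots meta tag like the ones below; this setting lets the audit bot crawl the site despite them:

    User-agent: *
    Disallow: /

    <meta name="robots" content="noindex, nofollow">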