Vivek is back with yet another top tip on how to manage your site to ensure it’s crawled correctly, effectively and efficiently. Read on to find out what he has to say…

“A large number of search engine crawl errors come from mismatches between a site’s sitemap and its robots.txt file.

To avoid these issues and errors, you need to make sure your sitemap and robots.txt work hand-in-hand and don’t contradict each other.

An important point to remember is that if your robots.txt blocks crawlers from a certain part of the site, you must not submit URLs from that section in your sitemap. If you do, you’re likely to create a conflict that shows up as errors in your webmaster tools.
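For example (the domain and paths here are purely illustrative), this is exactly the kind of conflict Vivek describes: robots.txt blocks a section, yet the sitemap still submits a URL inside it.

```
# robots.txt — tells all crawlers to stay out of /private/
User-agent: *
Disallow: /private/
```

```xml
<!-- sitemap.xml (excerpt) — submits a URL inside the blocked section -->
<url>
  <loc>https://www.example.com/private/report.html</loc>
</url>
```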

With this in mind, the same logic helps when you’re troubleshooting webmaster tool error messages. If you see errors relating to a crawl that’s taken place, check the reported URL against your sitemap to see whether it has been submitted. If it has, and you didn’t want it crawled, remove it from the sitemap, and that should eliminate any further crawl errors for that part of the site.”
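If you’d rather automate that cross-check, here’s a minimal sketch in Python using only the standard library. It assumes a standard XML sitemap at /sitemap.xml with the usual sitemap namespace; the domain is a placeholder, so swap in your own site.

```python
# Minimal sketch: flag any sitemap URL that robots.txt disallows.
# The site root and sitemap location below are hypothetical placeholders.
import urllib.robotparser
import urllib.request
import xml.etree.ElementTree as ET

SITE = "https://www.example.com"      # placeholder site root
SITEMAP_URL = SITE + "/sitemap.xml"   # assumed sitemap location

# Load and parse robots.txt
rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()

# Parse the sitemap; standard sitemaps use this namespace
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

# Report every submitted URL that robots.txt blocks for a generic crawler
for loc in tree.findall(".//sm:url/sm:loc", ns):
    url = loc.text.strip()
    if not rp.can_fetch("*", url):
        print("Conflict: in sitemap but blocked by robots.txt:", url)
```

Running this against your own site should list exactly the URLs worth removing from the sitemap (or unblocking in robots.txt, if you do want them crawled).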

If you’d like further information and tips on how to set up and manage your site, how to crawl and test it, and how to get the best SEO possible, feel free to get in touch and we’ll be more than happy to help you out!

Email: info@quru-analytics.com or Phone: +358 503 478 635

[tips guest=”vivek”]