If you’re about to launch a new site, or an updated version of an existing one, you’ll want to crawl it and check it for errors beforehand. Vivek has a top tip to add to your process/checklist.
“When you’re setting up a new site you don’t want the various search engines crawling it and reporting issues and errors while you’re still in development.
To avoid this you can amend robots.txt to instruct the search engines not to crawl parts of the site, or even all of it. The catch is that the very popular spider and crawling tool, Screaming Frog (SF), also follows these instructions by default, so it won’t fully crawl your site or report the potential issues you’re looking for.
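As a minimal sketch of the kind of robots.txt amendment described above, the standard directives below block all compliant crawlers from the whole site (the file sits at the site root, e.g. example.com/robots.txt):

```
# Block all compliant crawlers from the entire site while in development
User-agent: *
Disallow: /
```

Just remember to remove or relax this rule at launch, or the search engines will continue to stay away from the live site.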
To avoid this problem, and still get your SF diagnostics and feedback, you can change one small setting. Within SF go to Configuration, then Spider, and tick the ‘Ignore robots.txt’ option. This means robots.txt stays in place and your instructions to the search engines are still followed, while SF fully crawls your site and reports back on the errors and issues you can investigate and rectify before launch.”
If you’re about to launch a new site, or push great new content and functionality on an existing one, and you’d like some further top tips on what to do (and what to avoid) to make your launch as smooth as possible, please get in touch and we’ll be more than happy to help you out!