Even the best tools can run into problems when crawling large sites. Vivek has a great tip on how to overcome memory issues when using Screaming Frog to crawl a large site.

“By default Screaming Frog allocates 512MB of memory (RAM) for its processes. This memory can quickly be used up when crawling large or complicated sites, resulting in error or warning messages and an inability to continue with the crawl.

If you encounter a significant slow-down, or a ‘high memory usage’ warning, save your crawl via the File menu and the Save option. You can then increase the memory allocation and resume the crawl.

– Navigate to the Screaming Frog installation folder – by default this will be under C:\Program Files\Screaming Frog SEO Spider
– Locate the file called ScreamingFrogSEOSpider.l4j.ini
– Open this file with Notepad
– Locate the line -Xmx512M
– The ‘512’ part denotes the memory allocation in megabytes. Amend this to the amount of RAM you wish to allocate, for example -Xmx1024M for 1GB of memory. If you set an allocation higher than the RAM available on your device, the SEO Spider won’t start.
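Putting the steps above together, the edit amounts to changing a single line in the launcher file. A minimal sketch of what the relevant line looks like before and after (the rest of the file is left untouched; comment syntax shown here is illustrative):

```ini
# ScreamingFrogSEOSpider.l4j.ini — JVM launch options for the SEO Spider
# Default maximum heap size:
#   -Xmx512M
# Raised to 1GB (must not exceed the RAM actually available on the machine):
-Xmx1024M
```

Save the file and relaunch the SEO Spider for the new allocation to take effect.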

If you’re using OS X on a Mac, you can do the same thing by opening Terminal (via Applications > Utilities) and entering: defaults write uk.co.screamingfrog.seo.spider Memory 1g

The ‘1g’ refers to 1GB of allocated memory. You can also specify megabytes by replacing the ‘1g’ with ‘1024m’.
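The Terminal commands described above can be sketched as follows. This is macOS only, and assumes the defaults domain given in the tip:

```shell
# Set the SEO Spider's memory allocation to 1GB via macOS user defaults
defaults write uk.co.screamingfrog.seo.spider Memory 1g

# Or specify the same allocation in megabytes
defaults write uk.co.screamingfrog.seo.spider Memory 1024m

# Read the value back to confirm it was stored
defaults read uk.co.screamingfrog.seo.spider Memory
```

Restart the SEO Spider after changing the value so the new allocation takes effect.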

If you’re running a 32-bit Windows machine or an older version of Mac OS, you can find further instructions from Screaming Frog.”

[tips guest=”vivek”]