Despite Google previously advising webmasters not to block JavaScript (JS) or CSS from being crawled, some people still do so. Vivek is going to tell us why that’s a bad idea.

“Earlier in the year, Google sent out a notice via Google Search Console advising that some sites were blocking JS and CSS from being crawled. As a result, their pages couldn’t be read properly and therefore weren’t being indexed correctly for search engine results.

While JS and CSS don’t contain information that is indexed directly, they enable the search engines to render your page effectively and understand what it looks like and what it’s offering: key factors in how your page is scored and ranked. Google explains that blocking these assets can lead to “suboptimal rankings”.

To rectify this, you’ll need to amend your robots.txt so that JS and CSS assets are included in any crawls. If you received the aforementioned notification from Google, it provides instructions and links on what to do:

– Use the “Fetch as Google” feature to see how your site is presented to Google.

– Update your robots.txt file:

User-Agent: Googlebot
Allow: .js
Allow: .css

– Repeat the first step to test whether your changes have taken effect.”
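For context, here’s a rough sketch of how those directives might sit alongside rules you already have. The Disallow line and paths below are purely hypothetical, and the * and $ patterns are wildcard extensions that Googlebot understands; adapt this to your own file rather than copying it verbatim:

User-Agent: Googlebot
# Hypothetical rule you may already have in place
Disallow: /private/
# Explicitly allow Googlebot to fetch scripts and stylesheets
Allow: /*.js$
Allow: /*.css$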

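If you’d like a quick local sanity check before re-running “Fetch as Google”, Python’s standard urllib.robotparser module can evaluate a robots.txt against sample URLs. Note that it only understands plain prefix rules, not Google’s * and $ wildcards, so this sketch uses a directory-style Allow; the domain and paths are hypothetical, and “Fetch as Google” remains the authoritative test:

from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt using prefix rules the stdlib parser understands
robots_txt = """\
User-Agent: Googlebot
Allow: /assets/
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Script and stylesheet assets should be fetchable by Googlebot
print(rp.can_fetch("Googlebot", "https://example.com/assets/app.js"))    # True
print(rp.can_fetch("Googlebot", "https://example.com/assets/site.css"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/private/admin"))    # False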

[tips guest=”vivek”]