Webmaster level: All
We recently announced that our indexing system has been rendering web pages more like a typical modern browser, with CSS and JavaScript turned on. Today, we’re updating one of our technical Webmaster Guidelines in light of this announcement.
For optimal rendering and indexing, our new guideline specifies that you should allow Googlebot access to the JavaScript, CSS, and image files that your pages use. Disallowing crawling of JavaScript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.
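To make that concrete, here is a minimal robots.txt sketch; the /css/, /js/, and /images/ directory names are placeholders, so adjust them to wherever your site actually serves those files.

```
# robots.txt - illustrative sketch only; directory names are placeholders
User-agent: *
# Avoid rules like these, which block the resources Googlebot needs to render pages:
#   Disallow: /css/
#   Disallow: /js/
#   Disallow: /images/

# Allowing the resource paths explicitly makes the intent clear:
Allow: /css/
Allow: /js/
Allow: /images/
```

If you use Google Webmaster Tools, the Fetch as Google feature lets you check which of these resources Googlebot can actually retrieve.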
Updated advice for optimal indexing
Historically, Google indexing systems resembled old text-only browsers, such as Lynx, and that’s what our Webmaster Guidelines said. Now, with indexing based on page rendering, it’s no longer accurate to see our indexing systems as text-only browsers. Instead, a more accurate approximation is a modern web browser. With that new perspective, keep the following in mind:
- Just like modern browsers, our rendering engine might not support all of the technologies a page uses. Make sure your web design adheres to the principles of progressive enhancement as this helps our systems (and a wider range of browsers) see usable content and basic functionality when certain web design features are not yet supported.
- Pages that render quickly not only help users get to your content more easily, but also make indexing of those pages more efficient. We advise that you follow the best practices for page performance optimization, specifically:
  - Eliminate unnecessary downloads
  - Optimize the serving of your CSS and JavaScript files by concatenating (merging) your separate CSS and JavaScript files, minifying the concatenated files, and configuring your web server to serve them compressed, usually with gzip (see the sample server configuration after this list)
  - Make sure your server can handle the additional load of serving JavaScript and CSS files to Googlebot.
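As a hedged illustration of the compression step above, the snippet below shows one common way to enable gzip for CSS and JavaScript on Apache using mod_deflate; module availability and content types vary by server, so treat this as a starting point rather than a definitive configuration.

```
# Apache (mod_deflate): compress text-based assets before serving them
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

The nginx equivalent is similarly short:

```
# nginx: enable gzip for CSS and JavaScript responses
gzip on;
gzip_types text/css application/javascript;
```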
Blocking these files in robots.txt triggers an error message in Google Webmaster Tools. How do you prevent getting these messages and the loss of rankings that can follow?
The logical solution for any company or website owner is to have a maintenance agreement with a web design/SEO firm, so that when Google changes its ever-powerful algorithm, the website can be proactive instead of reactive. In most cases, by the time a company is aware of the problem, the damage has already been done and rankings, along with traffic, have dropped.
- A simple maintenance agreement covers these errors and keeps server software, themes, and plugins (if used) up to date. This not only reduces security vulnerabilities but also maintains page load speeds and navigational performance. These “insurance” packages can start at $30 per month.
- A more powerful management program incorporates SEO efforts in addition to website monitoring. By reviewing Google Analytics on a weekly or bi-weekly basis, website content and SEO can be “tweaked” to adjust to changing market conditions and advertising campaigns. These SEO packages start at $300 per month.
- The most common online management package usually incorporates web maintenance, ongoing website SEO, and social media management; it not only improves the website and its ranking factors but also builds quality content into the site through blog posts and social media. These packages start at $750 per month.
If you are unsure whether your site is connected to Google Webmaster Tools, or if you have questions about a website maintenance package, don’t hesitate to contact us.