
Googlebot Wants Your JavaScript, CSS and Image Files


Blocking Googlebot from parts of your page may have negative effects on your SEO. Moving on from its old text-only crawling, Google has announced an update to Googlebot that renders pages the way modern browsers display them, and it now instructs that "you should allow Googlebot access to the JavaScript, CSS, and image files that your pages use."


Google's Webmaster Guidelines now state that "disallowing crawling of Javascript or CSS files in your site's robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings". With this update to the search algorithm, Googlebot is able to understand JavaScript, CSS and image files, allowing them to have a positive influence on your SEO. If Googlebot is blocked from these resources, it may render the affected areas of your page as empty or wasted space, which can be interpreted as offering little value to your users and result in lower rankings.
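As a minimal sketch (the directory and path names below are illustrative, not taken from Google's announcement), a robots.txt that disallows script or style directories causes exactly the problem described above, while explicitly allowing those file types keeps them crawlable:

    User-agent: Googlebot
    # Rules like these hide page resources from Googlebot and should be removed:
    # Disallow: /js/
    # Disallow: /css/
    # Keep script, style and image files crawlable instead:
    Allow: /*.js$
    Allow: /*.css$
    Allow: /images/

You can check how rules like these behave using the robots.txt testing and Fetch and Render features in Webmaster Tools.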

This announcement follows one from May, in which Google explained that it now renders pages more like a modern browser rather than like the old browsers that could only read text. You can see how Google fetches your page with the Fetch and Render tool released in late May.

Google’s Tips:

  • Keep in mind that not all functionality used on websites today is supported when Googlebot crawls a site, so it is a good idea to follow the web design principle of "progressive enhancement" (see the sketch after this list) to make sure Googlebot can see all of your core content.
  • Once again, the importance of page speed is emphasized: the faster your page renders for users, the more efficiently it can be indexed.
  • Update your server if needed so it can "handle the additional load for serving of JavaScript and CSS files to Googlebot."
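
To illustrate the progressive enhancement tip, here is a minimal, hypothetical sketch: the content lives in plain HTML that any crawler or browser can read, and JavaScript only adds a small extra when it runs. Nothing important is lost if the script is blocked or unsupported.

    <!-- Core content is plain HTML, readable even without JavaScript or CSS. -->
    <article id="news-item">
      <h2>Googlebot Wants Your JavaScript, CSS and Image Files</h2>
      <p>Blocking Googlebot from your page may have negative effects on your SEO.</p>
    </article>

    <script>
      // Optional enhancement: add an estimated reading time when JavaScript runs.
      // If this script never executes, the article above is still fully indexable.
      var article = document.getElementById('news-item');
      var words = article.textContent.trim().split(/\s+/).length;
      var note = document.createElement('p');
      note.textContent = 'About ' + Math.max(1, Math.round(words / 200)) + ' min read';
      article.appendChild(note);
    </script>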

If you would like to confirm that your page adheres to Google's guidelines and is optimized for your keywords, take a look at the updated On-Page Keyword Optimization report in the Rank Ranger platform. Use this report to analyze landing pages: it provides a SERP preview and editor, suggestions for improvements, and a score based on how well your page currently meets best practice for ranking on a particular keyword.

Is your site currently allowing Googlebot to crawl all of your data?  

If you’re a web hosting provider, how might this change affect server load – is it something for customers to be concerned about?

About The Author

Rank Ranger

Rank Ranger is an SEO platform designed to standardize management and reporting for the digital marketing world. It fills the need for a comprehensive online marketing platform capable of tracking and monitoring campaign data, integrating with third-party software and services, and providing fully personalized and customized reporting, 100% white-label automated reports, and a branded web interface.



