
Google Reading JavaScript


For many years, webmasters have struggled with JavaScript because it could hurt their SEO efforts and their position in the SERPs. The many benefits of JavaScript and similar techniques had to be weighed against having content on your site that Google could not read and that therefore could not contribute to your rankings. Now Google has announced that this is no longer the case: after a lot of hard work, Googlebot can execute JavaScript and read a page more closely, the way a real browser does, and include that information in its algorithm.

One reason some webmasters use JavaScript (Ajax and similar techniques) is to speed up their websites. By leaving certain content out of the initial load, the page loads faster and then fills in the extra content as required. Facebook's news feed is a familiar example: not all posts are displayed immediately; more appear as you scroll down.
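As a rough sketch of how that pattern works in the browser (the /api/posts endpoint and the #feed element are hypothetical placeholders, not Facebook's actual code):

    // Infinite-scroll sketch: fetch the next batch of posts when the
    // user nears the bottom of the page, and append them to the feed.
    let page = 1;
    let loading = false;

    async function loadMorePosts() {
      if (loading) return; // avoid overlapping requests
      loading = true;
      const response = await fetch(`/api/posts?page=${page}`);
      const posts = await response.json();
      const feed = document.getElementById('feed');
      for (const post of posts) {
        const item = document.createElement('article');
        item.textContent = post.text;
        feed.appendChild(item);
      }
      page += 1;
      loading = false;
    }

    window.addEventListener('scroll', () => {
      // Within 200px of the bottom? Load the next page of posts.
      if (window.innerHeight + window.scrollY >= document.body.offsetHeight - 200) {
        loadMorePosts();
      }
    });

Content loaded this way exists only after the script runs, which is exactly why Google's new ability to execute JavaScript matters for indexing.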

This advancement makes Google's crawling more accurate, since Google can now see all of the content on a website. It also prevents sites from trying to trick Google by displaying one type of content in the HTML and a different kind in “hidden” code.

Google wants to make sure that everyone is able to benefit from these new advancements and has given some advice on dealing with potential issues:

“Sometimes things don’t go perfectly during rendering, which may negatively impact search results for your site. Here are a few potential issues, and – where possible – how you can help prevent them from occurring:

  • If resources like JavaScript or CSS in separate files are blocked (say, with robots.txt) so that Googlebot can’t retrieve them, our indexing systems won’t be able to see your site like an average user. We recommend allowing Googlebot to retrieve JavaScript and CSS so that your content can be indexed better. This is especially important for mobile websites, where external resources like CSS and JavaScript help our algorithms understand that the pages are optimized for mobile.
  • If your web server is unable to handle the volume of crawl requests for resources, it may have a negative impact on our capability to render your pages. If you’d like to ensure that your pages can be rendered by Google, make sure your servers are able to handle crawl requests for resources.
  • It’s always a good idea to have your site degrade gracefully. This will help users enjoy your content even if their browser doesn’t have compatible JavaScript implementations. It will also help visitors with JavaScript disabled or off, as well as search engines that can’t execute JavaScript yet.
  • Sometimes the JavaScript may be too complex or arcane for us to execute, in which case we can’t render the page fully and accurately.
  • Some JavaScript removes content from the page rather than adding, which prevents us from indexing the content.”
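To the first point in the quote above: blocked resources are usually a robots.txt problem. A minimal sketch of rules that keep scripts and stylesheets crawlable (the wildcard patterns are illustrative, not site-specific):

    # Let crawlers fetch the script and style files your pages depend on.
    User-agent: *
    Allow: /*.js
    Allow: /*.css

    # Avoid rules like these, which hide rendering resources:
    # Disallow: /js/
    # Disallow: /css/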
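The graceful-degradation advice works the same way in practice: put the real content in the HTML and use JavaScript only as an enhancement on top of it. A minimal sketch (the gallery markup and slideshow behavior are hypothetical):

    <!-- The links work on their own for users and crawlers without
         JavaScript; the script below only enhances them. -->
    <div id="gallery">
      <a href="/photos/1.jpg">Photo 1</a>
      <a href="/photos/2.jpg">Photo 2</a>
    </div>
    <script>
      // If JavaScript runs, clicking a link swaps it for an inline
      // image instead of navigating away; if not, the plain links
      // still take the visitor to the photos.
      document.querySelectorAll('#gallery a').forEach(function (link) {
        link.addEventListener('click', function (event) {
          event.preventDefault();
          var img = document.createElement('img');
          img.src = link.href;
          link.replaceWith(img);
        });
      });
    </script>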

Overall, this is a major advancement in Google's rendering process, and it will hopefully encourage more webmasters to create innovative and exciting websites using a variety of styles and content types.

About The Author

Shiri Berzack

Shiri brings her knowledge of online marketing and social media, among other things, to the Rank Ranger team. She's excited to be a Ranger and looks forward to wearing the ranger hat and getting her sheriff's badge in the SEO force.




