If so, you can use our Site Audit tool to uncover more issues than before.
Here’s how we were able to do that.
What’s New in Site Audit?
To show you the difference, take a look at a before and after:
Here’s one audit with JS rendering enabled and another audit (of the same site) with JS disabled.
See the difference in the number of errors and warnings between the two? With JS rendering enabled, Site Audit can properly identify the issues that need to be fixed.
How those files turn from lines of code into an interactive website in your browser can happen in a few ways (such as client-side vs. server-side rendering). Each approach has its pros and cons. However, sites that use JS with client-side rendering can run into crawling problems, especially with technical audit bots.
Why does that happen? In short, it comes down to resources.
Google has the resources to load both static HTML and injected HTML once a site’s JS is executed. But some site auditing bots don’t have the resources to handle that task (like ours before this update).
Here’s what Airbnb would look like to a bot that can’t render JS:
Why Should You Enable JS Crawling in a Site Audit?
You might not see the full picture of your site’s issues without enabling JS in your next site crawl.
Enabling JS could help you find relevant issues that were missed before, especially on sites built with the app shell model, where the core components load but the bulk of the content doesn’t (think of the earlier Airbnb example).
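To see why the app shell model trips up non-rendering bots, here’s a small illustration. The HTML and the text-extraction logic below are hypothetical, not Semrush’s actual parser: the server sends an empty “shell” whose content is injected later by JavaScript, so a bot that only reads the raw HTML finds almost nothing to audit.

```python
from html.parser import HTMLParser

# Hypothetical initial HTML for an app-shell page: an empty container
# plus a script that would fill it in once executed in a browser.
APP_SHELL_HTML = """
<html>
  <head><title>Listings</title></head>
  <body>
    <div id="root"></div>
    <script src="/bundle.js"></script>
  </body>
</html>
"""

class TextExtractor(HTMLParser):
    """Collects visible text, roughly the way a non-rendering bot would."""
    def __init__(self):
        super().__init__()
        self.text = []

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

parser = TextExtractor()
parser.feed(APP_SHELL_HTML)
print(parser.text)  # ['Listings'] — only the page title; the body is empty
```

Without executing `/bundle.js`, the bot sees no body content at all, which is why audits of app-shell sites look deceptively clean with JS rendering disabled.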
When JS rendering is enabled, here’s how Site Audit crawls your site:
1. We receive the initial HTML from the web server
2. We load the JS resources linked in that HTML
3. We execute and render the JS code
4. We wait five seconds for rendering to finish
5. We use the final HTML for the remaining analysis steps
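The five steps above can be sketched in code. This is a conceptual outline, not Semrush’s actual implementation: `fetch_html` and `run_js` are stand-ins for a real HTTP client and a headless-browser JS engine, and the canned HTML just demonstrates how the final HTML differs from the initial response.

```python
# Conceptual sketch of the crawl steps; helper names are hypothetical.

def fetch_html(url):
    # Step 1: initial HTML from the web server (canned app shell here).
    return '<div id="root"></div><script src="/app.js"></script>'

def run_js(html):
    # Steps 2-3: load and execute the linked JS, which injects content.
    return html.replace('<div id="root"></div>',
                        '<div id="root"><h1>Welcome</h1></div>')

def crawl(url, render_js=True):
    html = fetch_html(url)        # step 1
    if render_js:
        html = run_js(html)       # steps 2-3
        # step 4: a real crawler would wait ~5 seconds here for
        # asynchronous rendering to settle before taking a snapshot
    return html                   # step 5: final HTML for analysis

final = crawl("https://example.com")
print("Welcome" in final)  # True — this content exists only after JS runs
```

With `render_js=False`, the same function returns only the empty shell, which mirrors how disabling JS in Site Audit falls back to the original HTML.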
When you choose to disable JS in Site Audit, we use the original HTML as we did before.
Does Enabling JS Affect My Limits?
No, enabling JS rendering in Site Audit will not affect your limits. However, the feature is only available on Guru and Business subscriptions.
Will It Trigger Trackers, Ads, and Event Handlers?
No, Site Audit will not trigger trackers, ads, or event handlers (e.g., JS triggered by clicks or scrolls). Here’s what we block when our Site Audit bot crawls a site:
- Yandex Metrica
- Adobe Analytics
- Google Ads
Does Site Audit Use Chrome for Rendering JS?
Yes, we use the latest Chromium rendering engine, the same technology Google uses for crawling.
How to Crawl Your Site’s JS with Site Audit
Crawling your site’s JS with Site Audit is simple. First, go to the Site Audit tool and click the ‘create project’ button.
Next, you’ll be prompted to set up the audit as usual. The only difference is that you’ll choose ‘Enabled’ for ‘JS-rendering’ in the crawler settings. You can leave it disabled if you’d like, or change it later.
You can check if your audit is rendering your site’s JS by looking up here in the report:
How to Crawl an Existing Project’s JS with Site Audit
If JS rendering isn’t enabled on a current project, you can change that in the Site Audit settings.
From here, just click the “re-run campaign” button to crawl your site with JS enabled. You can disable JS rendering in the same way if you want to change it back later.
Find All Of Your Site’s Issues with JS Rendering
Many websites use the app shell model (along with client-side rendering) because it benefits both users and site owners. If your site uses JS this way, Site Audit’s new JS rendering feature can identify the issues Google may run into.
Source: Semrush.com