Is site dependency on JavaScript a problem for Googlebot?

In a recent Google Search Central SEO office-hours hangout, Google Search Advocate John Mueller was asked whether it was a bad idea for a website to rely on JavaScript for basic functionality.

Could this have a negative impact on Googlebot when it comes to crawling and indexing?

Mueller said it would probably be fine, but he also suggested steps to take to make sure there were no problems for Google or for users.

The site is not easy to use without JavaScript

The person asking the question noted that a great deal of the site’s functionality depends on JavaScript and was concerned about the impact on both usability and SEO.

This is the question:

Our site is not very user friendly if JavaScript is turned off.

Most of the images do not load. The popup cannot be opened.

However, with the Chrome Inspect feature, all the menu links are in the source code.

Is our reliance on JavaScript still a problem for Googlebot?”

What they mean by “Chrome Inspect feature” is probably Chrome’s built-in tool for inspecting a page’s source code.

So what they mean is that even though the links are not accessible when JavaScript is turned off in the browser, the links are still in the HTML code.
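This distinction matters for crawling: links that are present in the server’s raw HTML response can be discovered without executing any JavaScript. As a rough sketch (the markup below is hypothetical, not the questioner’s actual site), you can extract anchor hrefs from raw HTML using only Python’s standard library:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags in raw HTML; no JavaScript is executed."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical server response: the menu is hidden until JS runs,
# but the links themselves are in the markup.
raw_html = """
<nav id="js-menu" style="display:none">
  <a href="/products">Products</a>
  <a href="/about">About</a>
</nav>
<script>document.getElementById('js-menu').style.display = 'block';</script>
"""

parser = LinkExtractor()
parser.feed(raw_html)
print(parser.links)  # ['/products', '/about'] -- present even with JS off
```

Even though a browser with JavaScript disabled never shows the menu, the hrefs are recoverable from the HTML alone, which is the situation the questioner describes.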

Mueller recommends testing the site

Mueller’s answer acknowledged that Google could probably handle the site.

But what is left unspoken is the fact that many sites depend on JavaScript for their functionality, and the experience of the person asking the question is entirely normal.

Visit any site with JavaScript turned off in the browser and many images won’t load, the layout may break, and some menus won’t work.

Here is a screenshot of SearchEngineJournal as rendered with JavaScript disabled:

While Mueller alluded to this in his answer, it probably should be said up front: most sites are unfriendly to users without JavaScript enabled in the browser, and the experience of the person asking the question is not out of the ordinary but in fact very common.

Mueller said that everything would probably be fine.

He said:

“And from my point of view… I would test it.

So everything will probably be fine.

And I’d generally assume that if you’re using JavaScript in a reasonable way, and if you’re not doing anything special to block JavaScript on your pages, it should probably just work.”
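One concrete way to check that you’re not “doing anything special to block JavaScript” is to verify that robots.txt does not disallow Googlebot from fetching your script files, since Googlebot cannot render pages whose JavaScript it is forbidden to crawl. A minimal sketch using Python’s standard library (the rules and paths here are hypothetical examples):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that accidentally blocks the JS directory.
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /assets/js/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot falls under "User-agent: *" here, so its rendering of any
# page that depends on /assets/js/ would be broken.
print(rp.can_fetch("Googlebot", "https://example.com/assets/js/app.js"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))        # True
```

Removing the `Disallow: /assets/js/` rule (or adding an explicit `Allow`) would let the scripts be fetched and the pages rendered.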

Test to see how the site is performing

Mueller then encouraged the person to run tests to make sure the site was working properly, and said that “we have” tools, but did not name any specific ones.

Presumably he’s referring to the tools in Google Search Console that can provide feedback on whether Google is able to crawl pages and images.

Mueller continued his answer:

“But you’re better off not just believing me, but using a test tool to try it out.

Our testing tools are well documented.

There are a lot of… variations on the things we recommend in terms of improving things if you run into problems.

So I’d check out our guides on JavaScript and SEO and think about,… try things out, make sure they actually work the way you want, and then take that to improve your website overall.”

User-friendly website experiences

Mueller then discussed usability, because the person asking the question mentioned that the site is not user friendly with JavaScript turned off.

The vast majority of sites on the Internet use JavaScript; W3Techs publishes a statistic that 97.9% of sites use it.

HTTPArchive, whose annual JavaScript report is based on real-world data from opted-in Chrome users, reports that the average mobile page downloads 20 JavaScript files, and that sites at the 90th percentile load as many as 33 first-party and 34 third-party scripts.

HTTPArchive further points out that on the average website, 36.2% of the JavaScript pushed to a site visitor’s browser goes unused; it’s just wasted bandwidth.
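To put that percentage in concrete terms, here is a back-of-the-envelope calculation. The 36.2% unused figure is HTTPArchive’s; the 500 KB JavaScript payload is a hypothetical example, not a measured average:

```python
# Back-of-the-envelope: how much of a page's JavaScript is wasted bandwidth.
# 36.2% unused comes from HTTPArchive; 500 KB is a hypothetical payload size.
js_payload_kb = 500
unused_fraction = 0.362

wasted_kb = js_payload_kb * unused_fraction
print(f"~{wasted_kb:.0f} KB of {js_payload_kb} KB is downloaded but never runs")
# ~181 KB of 500 KB is downloaded but never runs
```

Every visitor pays that download cost in bandwidth and load time, which is exactly the user-experience problem described below.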

As you can see, the problem is not users who have turned JavaScript off, which is what the person asking the question was worried about. That concern was misplaced.

The real problem is users facing sites that force a lot of JavaScript on visitors, creating a bad user experience.

Mueller did not get into the nuances of why that fear is misplaced, but he did recommend useful ways to find out whether users are having a negative experience because of JavaScript issues.

Mueller continued his answer:

“And you mentioned the usability aspect with regard to JavaScript. So from our point of view, the guidance we have is essentially very technical, in the sense that we need to make sure that Googlebot can see the content from a technical point of view, and that it can see the links on your pages from a technical point of view.

It is not primarily concerned with ease of use.

But of course your users care about ease of use.

That is something where it might make sense to do a little more, to make sure your users are having a good experience on your pages.

Often this is something that goes beyond a simple testing tool.

But rather something where you might have to do a small user study, or interview some of your users, or at least run a survey on your website to understand where they’re stumbling and what kinds of problems they’re facing.

Is it because… you mentioned the popups. Or maybe it’s something completely different where they’re seeing issues, or maybe the text is too small, or they can’t click buttons properly, those kinds of things that don’t really line up with technical issues but are more of a user-side kind of thing that, if you can improve them and make your users happier, they’ll stick around, come back, and invite more people to visit your website as well.”

Testing for Users and Google

Mueller did not explicitly name any tools for performing the recommended tests, but Search Console is the obvious choice for diagnosing crawl issues with Google. For example, Search Console alerts publishers to the number of URLs that have been discovered.

As for user experience tools, one of the best is Microsoft Clarity, a free, GDPR-compliant user experience analytics tool. It provides insights into how users experience your site and can indicate when they encounter a poor user experience.

So it can be very useful for diagnosing the kinds of potential site issues that John Mueller discussed.

The quote

Watch John Mueller at 10:23:

Featured Image: Elle Aon / Shutterstock
