The results of a JavaScript rendering and indexing experiment highlight some of the challenges of serving JS-dependent content.
I recently read Ziemek Bucko’s fascinating article, Rendering Queue: Google Needs 9X More Time To Crawl JS Than HTML, on the Onely blog.
Bucko described a test showing that Googlebot was significantly slower to follow links on JavaScript-reliant pages than links in plain HTML.
While it isn't a good idea to rely on only one test like this, their experience matches my own. I have seen and supported many websites that rely too heavily on JavaScript (JS) to function properly. I expect I'm not alone in that respect.
In my experience, JavaScript-only content tends to take longer to get indexed than plain HTML.
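To make the distinction concrete, here is a minimal, hypothetical sketch (not taken from Bucko's test): a link present in the initial HTML response can be discovered on the first crawl pass, while a link injected by client-side JavaScript only becomes visible after the page is rendered, which Google handles as a separate, deferred step. The element ID and URL below are made up for illustration.

    <!-- Link in the initial HTML response: discoverable as soon as the page is crawled. -->
    <a href="/products/widgets">Widgets</a>

    <!-- Link injected only after JavaScript runs: discoverable only after rendering. -->
    <nav id="menu"></nav>
    <script>
      // Hypothetical client-side injection, for illustration only.
      document.getElementById('menu').innerHTML =
        '<a href="/products/widgets">Widgets</a>';
    </script>

Both links end up pointing to the same URL, but only the first one is available before the rendering queue gets to the page.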
I recall several instances of fielding phone calls and emails from frustrated clients asking why their stuff wasn’t showing up in search results.
In all but one case, the problem appeared to stem from the pages being built on a JS-only or mostly-JS platform.
Before we go further, I want to clarify that this is not a “hit piece” on JavaScript. JS is a valuable tool.
Like any tool, however, it’s best used for tasks other tools cannot do. I’m not against JS. I’m against using it where it doesn’t make sense.
But there are other reasons to use JS judiciously rather than relying on it for everything.
Here are some tales from my experience to illustrate some of them.
1. Text? What text?!
A site I supported was relaunched with...
Read Full Story: https://news.google.com/__i/rss/rd/articles/CBMibGh0dHBzOi8vc2VhcmNoZW5naW5lbGFuZC5jb20vamF2YXNjcmlwdC1yZW5kZXJpbmctYW5kLWluZGV4aW5nLWNhdXRpb25hcnktdGFsZXMtYW5kLWhvdy10by1hdm9pZC10aGVtLTM5MDAxMdIBAA?oc=5