Experiment: does Google crawl dynamic content?

Article by Axandra
SEO software

Earlier this year, Sander Nagetegaal from Centrical.com ran a series of experiments to find out whether Google crawls dynamic content. What did he find, and what does this mean for the Google rankings of your own website?


What did he test?

Many modern content management systems insert content on web pages dynamically through JavaScript code. If Google cannot crawl that content, these web pages will look as if they have no content at all. Sander tested several different scenarios (a sketch of some of these injection patterns follows the list):

  1. Content injection before DOM has loaded
  2. Content injection after DOM has loaded
  3. Content injection by async JavaScript
  4. Content injection after httpRequest
  5. JSON-LD content
  6. JSON-LD injection
  7. JSON-LD injection, asynchronously
  8. JSON-LD injection after httpRequest
  9. JSON-LD injection with Google Tag Manager
  10. Meta elements injection
  11. Meta elements injection, asynchronously
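
To make these scenarios more concrete, here is a minimal TypeScript sketch of a few of the injection patterns from the list above: injection after the DOM has loaded, injection after an HTTP request, and meta element injection. The element IDs and the /api/article endpoint are placeholders for illustration, not the setup used in the actual experiment.

```typescript
// Sketch of some of the JavaScript injection patterns listed above.
// The element IDs and the /api/article URL are placeholders, not the
// pages used in the Centrical.com experiment.

// Scenario 2: content injection after the DOM has loaded
document.addEventListener("DOMContentLoaded", () => {
  const target = document.getElementById("after-dom");
  if (target) {
    target.textContent = "Paragraph injected once the DOM is ready.";
  }
});

// Scenario 4: content injection after an HTTP request
// (fetch is used here for brevity; the idea is the same with XMLHttpRequest)
async function injectFetchedContent(): Promise<void> {
  const response = await fetch("/api/article"); // placeholder endpoint
  const text = await response.text();
  const target = document.getElementById("after-request");
  if (target) {
    target.textContent = text;
  }
}
void injectFetchedContent();

// Scenarios 10/11: meta element injection at runtime
const meta = document.createElement("meta");
meta.name = "description";
meta.content = "Description injected by JavaScript at runtime.";
document.head.appendChild(meta);
```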

What did he find out?

Sander Nagetegaal found that Google crawls and indexes all content injected by JavaScript, regardless of whether it is injected synchronously or asynchronously. However, JSON-LD content does not necessarily show up in the search results. Here are his results:

  • Google crawls and indexes all content that was injected by JavaScript.
  • Google even shows results in the SERPs that are based on asynchronously injected content.
  • Google can handle content from httpRequest().
  • However, JSON-LD as such does not necessarily lead to SERP results (unlike the officially supported SERP entities, which are not only indexed but also used to decorate the SERP).
  • Injected JSON-LD gets recognized by Google's Structured Data Testing Tool, including Tag Manager injection (a sketch of this kind of injection follows this list). This means that once Google decides to support the entities, indexing will not be a problem.
  • Dynamically updated meta elements get crawled and indexed, too.
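
For example, injected JSON-LD of the kind tested above can be added with a few lines of script. This is only a rough sketch; the Organization entity and its values are arbitrary example data, not the markup used in the experiment.

```typescript
// Rough sketch of JSON-LD injection: the structured data only exists in the
// page after this script runs. The entity and its values are example data.
const jsonLd = {
  "@context": "https://schema.org",
  "@type": "Organization",
  name: "Example Company",
  url: "https://www.example.com",
};

const script = document.createElement("script");
script.type = "application/ld+json";
script.textContent = JSON.stringify(jsonLd);
document.head.appendChild(script);
```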

What does this mean to the rankings of your own website?

Two months ago, Google discontinued support for its previous proposal to make AJAX web pages crawlable by Googlebot. As the experiments above show, Google can now index this content anyway.

However, it might still be a good idea to make sure that web crawlers that cannot process JavaScript can also access your content. In addition, Google's Gary Illyes said that content won't be indexed by Google if you put it in a JavaScript array that only expands into the page when a visitor clicks.
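
The following TypeScript sketch, with placeholder element IDs, illustrates the pattern Gary Illyes warned about: the text lives only in a JavaScript array and is written into the page on click, so a crawler that does not trigger click events never sees it in the rendered DOM.

```typescript
// Anti-pattern sketch: content stored in a script array and rendered only on
// click. Googlebot does not click, so this text never reaches the rendered
// DOM and is not indexed. Element IDs are placeholders.
const hiddenParagraphs: string[] = [
  "This text exists only inside the script...",
  "...and is rendered only after a visitor clicks.",
];

document.getElementById("read-more")?.addEventListener("click", () => {
  const container = document.getElementById("expanded-content");
  if (container) {
    container.innerHTML = hiddenParagraphs
      .map((paragraph) => `<p>${paragraph}</p>`)
      .join("");
  }
});
```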

Google can index content that is delivered by JavaScript. If you must use JavaScript to serve the content of a website, this shouldn't cause any problems with Google. However, other search engines might not be as good as Google when it comes to JavaScript indexing.

If possible, make sure that the content of your web pages is accessible from all devices, with or without JavaScript enabled. The tools in our web-based website promotion tool SEOprofiler help you to make sure that Google and other search engines can index your pages correctly.

In addition, the tools help you to make sure that potential customers find your website through the right keywords. If you haven't done it yet, try SEOprofiler now.
