Article by Axandra
SEO software
In a webmaster hangout on YouTube, Google's John Mueller and Martin Splitt answered questions about how Google handles JavaScript on websites. Read this article to find out how to make sure that Google can parse the content of your web pages.
1. Avoid client-side rendering
Client-side rendering means that the client receives a file with very little HTML, and the content is then created by JavaScript. This can cause problems for search engine robots that cannot execute JavaScript. A popular client-side JavaScript framework is AngularJS.
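As an illustration, a minimal client-side rendered page might look like the sketch below (the file content and names are hypothetical). A robot that does not execute JavaScript sees only the empty container:

```html
<!DOCTYPE html>
<html>
<head><title>Example product page</title></head>
<body>
  <!-- The only content in the HTML file is an empty container. -->
  <div id="app"></div>
  <script>
    // The visible content only exists after this script runs.
    // A crawler that does not execute JavaScript sees an empty page.
    document.getElementById('app').innerHTML =
      '<h1>Blue Widget</h1><p>Our best-selling widget.</p>';
  </script>
</body>
</html>
```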
Google recommends that large websites and websites with fast-changing content avoid client-side rendering because it can cause UX issues and delay indexing. Google recommends dynamic rendering instead, i.e. web crawlers should get an easy-to-parse static HTML page.
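A rough sketch of dynamic rendering with Node.js and Express follows. The bot pattern and the getPrerenderedHtml() helper are assumptions for illustration; in practice, you would typically use a prerendering service or a headless browser to generate the static HTML:

```javascript
const express = require('express');
const app = express();

// Hypothetical list of crawler user-agent substrings.
const BOT_PATTERN = /googlebot|bingbot|facebookexternalhit|twitterbot/i;

app.get('*', (req, res) => {
  const userAgent = req.headers['user-agent'] || '';
  if (BOT_PATTERN.test(userAgent)) {
    // Crawlers get an easy-to-parse static HTML page.
    // getPrerenderedHtml() is a hypothetical helper that returns
    // fully rendered markup for the requested path.
    res.send(getPrerenderedHtml(req.path));
  } else {
    // Regular visitors get the client-side rendered app shell.
    res.sendFile(__dirname + '/index.html');
  }
});

app.listen(3000);
```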
Google can parse some JavaScript, but most search engine robots can't. For example, client-side rendering is not supported by the bots of Facebook and Twitter.
To find out what search engines can see on your website when they do not process the JavaScript on your pages, check your pages with the website audit tool. It shows you what search engine robots can find on your pages.
2. Google can process JavaScript redirects
As long as you do not block the pages that contain the JavaScript redirect in robots.txt, Google can handle JavaScript redirects. These redirects are treated like regular redirects.
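For reference, a typical JavaScript redirect looks like the sketch below (the target URL is a placeholder). location.replace() does not add a history entry, which makes it behave more like a server-side redirect:

```javascript
// Send the visitor to the new URL as soon as the script runs.
window.location.replace('https://www.example.com/new-page/');
```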
3. Don't trigger lazy loading with scroll events
Lazy loading means that a web page element won't be loaded until the point at which it is needed. For example, an image at the bottom of a web page might only be loaded if the browser of the web page visitor displays the bottom of the page.
Google says that you shouldn't use scroll events to trigger lazy loading. Desktop users might resize their browser window to see more content; in that case, no scroll event would be triggered. More importantly, Googlebot does not scroll, so content that is lazy-loaded on scroll events would not be visible to Google.
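Instead of scroll events, you can trigger lazy loading with the IntersectionObserver API, which fires whenever an element enters the viewport, regardless of how it got there. A minimal sketch, assuming images store their real URL in a data-src attribute:

```javascript
// Lazy-load all images that carry a data-src attribute.
const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach((entry) => {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // swap in the real image URL
      obs.unobserve(img);        // load each image only once
    }
  });
});

document.querySelectorAll('img[data-src]').forEach((img) => {
  observer.observe(img);
});
```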
4. You don't need to specify what Google should render
It is Google's job to parse the content of your web pages. You don't have to implement anything on your website to tell Google what it should render. Preventing Google from rendering particular elements of your web pages can also cause problems.
5. If possible, serve content without JavaScript
Critical JavaScript files with a significant file size shouldn't be served in the head of a web page because they can delay rendering. Users will have to wait longer before seeing any content if large JavaScript files are required to show the page. If possible, serve priority content as quickly as possible without JavaScript.
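One common way to keep a large script out of the critical rendering path is the defer attribute: the browser downloads the script in parallel but executes it only after the document has been parsed. A sketch (the file name is a placeholder):

```html
<head>
  <!-- defer: download in parallel, execute after the HTML is parsed. -->
  <script src="/js/app.js" defer></script>
</head>
<body>
  <!-- Priority content is plain HTML and renders without waiting for app.js. -->
  <h1>Priority content</h1>
  <p>This text is visible before any JavaScript runs.</p>
</body>
```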
6. Google can ignore third-party scripts
If Google finds third-party scripts that aren't needed to render a page, it will avoid fetching them.
7. Full URLs in JavaScript can be used for crawling
JavaScript links aren't the same as regular HTML links. However, if Google finds a full URL in a JavaScript link, it will try to follow it. Note that you should not rely on this; in general, Google will not crawl JavaScript links.
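For example, Google may pick up the first URL below because it appears as a complete string in the script, while the second URL, assembled at runtime, is much harder to discover (both URLs are placeholders). A regular HTML link remains the safest option:

```javascript
// A full URL as a plain string: Google may discover and crawl it.
function goToProducts() {
  window.location.href = 'https://www.example.com/products/';
}

// A URL assembled at runtime: Google will generally not discover this.
function goToCategory(slug) {
  window.location.href = 'https://www.example.com/category/' + slug + '/';
}
```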
Make sure that search engines can index your pages
If possible, deliver your web page content in plain HTML to search engines. The easier it is to parse your web pages, the more likely it is that your content can be indexed correctly.
To make sure that your web pages can be parsed by all crawlers, check your web pages with the website audit tool in SEOprofiler. You can create your SEOprofiler account here.