JavaScript has since mostly been used to add animations and effects that make pages feel more dynamic. This is inherently different from HTML and CSS, which define the page's basic structure and how it should look.
With the established approach, server-side rendering, a browser or a search engine bot (crawler) receives HTML that describes the page exactly. The content is already in place, and the browser or bot only needs to download the attached assets (CSS, images, etc.) to present the page as it was designed. As this is the traditional approach, search engines generally have no problem with server-side rendered content. Websites that render this way are traditionally programmed in PHP, ASP or Ruby, and might use popular content management systems such as Kentico, WordPress or Magento.
As shown in Google’s diagram, Googlebot places pages in a queue for its crawling and rendering processes. From there, Googlebot fetches a URL from the crawl queue and reads the robots.txt file to check whether the URL is disallowed.
As Google runs two separate waves of indexing, it is possible for some details to be missed during the indexing process. For example, if you’re not server-side rendering crucial title tags and meta descriptions, Google may miss them in the second wave, which can negatively affect your organic visibility in the SERPs.
What is the difference between crawling and indexing?
Crawling and indexing are two different things that are often confused within the SEO industry. Crawling is a search engine bot, such as Googlebot, discovering and analysing the content and code on a web page. Indexing, on the other hand, means the page is eligible to appear in the Search Engine Results Pages (SERPs).
Evergreen Google Bot
It should be noted that Googlebot now runs on the latest Chromium rendering engine (version 74 at the time of this post’s publication) when rendering pages for search. Moving forward, Googlebot will regularly update its rendering engine to ensure support for the latest web platform features.
Search engines have been reported to use headless browsing: software that can load and render web pages without displaying anything to a user, passing the page content to another program running in the background. A headless browser renders the DOM to gain a better understanding of the user’s experience and the content on the page.
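A sketch of the same idea using Puppeteer, a popular Node library for driving headless Chrome, is shown below. This is an assumption for illustration; search engines run their own rendering infrastructure, but the principle is the same. It requires `npm install puppeteer` before use:

```javascript
// Sketch of headless rendering with Puppeteer. Search engines use their own
// infrastructure, but the idea is the same: execute the page's JS, then read
// the rendered DOM rather than the raw source.
async function fetchRenderedHTML(url) {
  const puppeteer = require('puppeteer'); // loaded lazily so the sketch stays optional
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' }); // wait for JS to settle
  const html = await page.content(); // the rendered DOM, not the raw source
  await browser.close();
  return html;
}

// Hypothetical helper: pull the title out of rendered HTML so it can be
// compared against the raw, unrendered source.
function extractTitle(html) {
  const match = html.match(/<title>([^<]*)<\/title>/i);
  return match ? match[1] : null;
}
```

Comparing `extractTitle` on the raw source versus the rendered HTML is a quick way to spot content that only exists after JavaScript runs.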
Above the fold refers to the part of a web page that is visible when the page initially loads. The subsequent portion of the page, which requires scrolling, is called “below the fold”. Where the fold sits varies across devices, including desktops, mobiles, tablets and many more. To diagnose and address issues here, refer to the section below that outlines key tools to use.
Single Page Application considerations
A single page application (SPA) is a web application or website that loads a single page and then dynamically rewrites it with the pieces you require, as opposed to loading an entire new page from the server.
The SPA approach provides fast loading times, uses less bandwidth and gives the user a pleasant experience by making the application behave more like a desktop application. It should be noted that there are many different SPA framework options available, depending on the use of the application. These include React.js, Angular.js, Backbone.js, Preact, Next.js and hundreds more.
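The core SPA idea can be sketched in a few lines: rather than fetching a new page from the server, the app swaps in the view for the requested route. The route names and view content below are hypothetical:

```javascript
// Minimal sketch of client-side routing in an SPA. Route names and view
// fragments are hypothetical examples.
const views = {
  '/': '<h1>Home</h1>',
  '/products': '<h1>Products</h1>',
};

function renderRoute(path) {
  // Only the relevant fragment is produced; the surrounding page never reloads.
  return views[path] || '<h1>Not found</h1>';
}

// In a browser, a framework would wire this to navigation events, e.g.:
// window.addEventListener('popstate', () => {
//   document.getElementById('app').innerHTML = renderRoute(location.pathname);
// });
```

This is exactly why crawlers can struggle with SPAs: the raw HTML the server sends contains almost none of this content until the JavaScript runs.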
When rendering SPAs, John Mueller stated the following.
It’s not always perfect, and certainly not easy, but for some sites it can work well, even if you rely on client-side-rendering (just JS, no server-side-rendering). YMMV 🙂
— 🍌 John 🍌 (@JohnMu) July 16, 2018
There can be many hurdles when Google attempts to crawl and index the content of an SPA. Therefore, if you’re using an SPA, it’s recommended to test it multiple times, using the Fetch as Google feature, to understand what Google is able to pick up.
Although Google has not officially stated a cut-off, it’s understood that Googlebot shouldn’t wait longer than around 5 seconds for a page to render, and John Mueller has indeed stated this will be challenging for many sites. Therefore, any content in place by the load event, at around 5 seconds, should be indexable.
Pages require indexable URLs that offer server-side support for each landing page. This includes each category, subcategory and product page.
With each individual page on the site having a specific focus and target, each should also include a descriptive title and meta description to help search engine bots and users precisely determine what the page is about. Not only this, but it helps users decide whether the page is the most suitable result for their search query.
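A simple way to achieve this in a server-rendered setup is to generate the head tags per page from the page's own data. The function and data shape below are a hypothetical sketch:

```javascript
// Sketch: build a unique, descriptive title and meta description per page
// on the server. The page data shape is a hypothetical example.
function buildHeadTags(page) {
  return [
    `<title>${page.title}</title>`,
    `<meta name="description" content="${page.description}">`,
  ].join('\n');
}

// Example usage with hypothetical page data:
const head = buildHeadTags({
  title: 'Red Running Shoes | Example Store',
  description: 'Browse our range of red running shoes for road and trail.',
});
```

Because these tags are emitted in the initial HTML, Google can read them in the first wave of indexing without waiting for rendering.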
Using your browser’s “Inspect” feature
Once the rendered HTML has been obtained and meets the standard Google expects of a traditional landing page, many of the factors affecting visibility will resolve themselves.
URL Inspection tool in GSC
The URL Inspection tool allows you to analyse a specific URL on your website to understand exactly how Google views it. It provides valuable data on crawling and indexing, plus further information from Google’s index, such as whether indexing succeeded or structured data errors are causing issues.
Improve the page loading speed
- Produces the same result and is supported on all modern browsers
- Allows for a user-friendly, highly interactive build of websites
- Can be used by both front-end and back-end developers
- Site speed
- Search engine crawling and indexing
However, depending on which rendering method you use, you can reduce page load times and ensure content is accessible to search engines for crawling and indexing.
URL Inspection Tool
Found within Google Search Console, the URL Inspection Tool shows information on whether Google was able to crawl / index a page, whether any errors are occurring and why.
Google’s mobile-friendly tester provides information regarding how easy it is for a visitor to navigate your website on a mobile device.
Page Speed Insights
Google’s PageSpeed Insights tool (PSI) details how a page performs on both mobile and desktop devices. In addition, the tool provides recommendations on how performance can be improved.
The site: command is one of the most direct ways to check whether Google has properly indexed your content. To use it, enter the following into Google: site:[your website URL] “text snippet or query”
Diffchecker is a tool that allows you to compare two text files and review the differences between them. It is especially useful for comparing a web page’s original source code against its rendered code, delivering a comprehensive view of how the content has changed after rendering.
Chrome DevTools is a set of tools for web developers built directly into Google’s Chrome browser. DevTools lets you edit pages and make quick style changes without using a text editor, and helps you discover problems fast, which in turn helps you build better websites more quickly.
BuiltWith is a free website profiler that helps you discover which framework a website has been built with. Not only this, but it will also tell you about any third-party integrations the website uses.