In this blog, you will learn how JavaScript can be used to:
- personalise your website
- deliver notifications
- load new content as you scroll down a page
- and much, much more!
To better understand the process, let’s compare it to how Google crawls an HTML page.
First of all, the HTML is downloaded, links are extracted, and CSS files are downloaded as well. These elements are then sent off to Google’s aptly named indexer, Caffeine, which indexes the page. Sounds simple enough.
You can read about Google’s Caffeine indexer in more detail in Google’s official documentation here.
This is even more crucial for eCommerce websites. It would mean that pages with your best-selling products or promotion pages (such as Black Friday or Christmas sales) would be completely left out of the SERPs. Users wouldn’t even have a chance to discover your page, or brand, in the first place. This would have a detrimental effect on revenue targets as well as hamper your presence as an online retailer.
- Carrying out the diagnosis and then troubleshooting of ranking issues for websites, as well as single-page applications
- Making sure priority pages on a website can be discovered by search engines via internal linking best practices
Why SEOs Need to Work with Developers
While plain HTML might not be as appealing, it is more compatible with SEO, as Googlebot, or a crawler from any other search engine, will be able to successfully read the page and render it for the user.
We can also categorise the forthcoming recommendations into four groups, which are:
This should help you understand how each of these tactics can fit into your potential SEO strategy.
To truly understand dynamic rendering, I’ll need to first provide some context and discuss the other two rendering options: server-side rendering (SSR) and client-side rendering (CSR). Let’s begin with server-side rendering.
What is Server-Side Rendering?
However, the main issue that arises here is that server-side rendering can be challenging and complicated for developers. As SEOs, we want to keep our recommendations in line with Google’s guidelines, work collaboratively with developers, and help them create an SEO-friendly website. Therefore, SSR may not be the easiest option from a development point of view.
What is Client-Side Rendering?
A visual representation by Google can be found below:
A simple way to solve this issue is by adding a very basic snippet, which can be found below, to your robots.txt file:
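The snippet itself appeared as an image in the original post; assuming the issue is Googlebot being blocked from crawling JavaScript and CSS resources, a minimal sketch might look like this:

```
User-agent: Googlebot
Allow: .js
Allow: .css
```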
Much like the robots.txt issue, this one also has a simple solution: select the iteration you want to be indexed and set a canonical tag.
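As a sketch, with a hypothetical example.com URL standing in for the version you want indexed, the tag sits in the page’s head:

```html
<!-- Placed in the <head> of each duplicate iteration,
     pointing at the version you want indexed -->
<link rel="canonical" href="https://www.example.com/products/">
```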
4. Optimise Your Error Pages
So, when it comes to optimising error pages, there are two main solutions:
- And the other is to add a noindex tag to the page, along with a message such as “this page could not be found.” Of course, it’s up to you how creative you want to be with your error message. It is important to note that this error page will be treated as a soft 404, as the actual status code will be a 200.
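For the second approach, the noindex directive is a single meta tag in the error page’s head; a minimal sketch:

```html
<!-- Keeps the "not found" page out of the index,
     even though the page itself returns a 200 status code -->
<meta name="robots" content="noindex">
```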
The solution is to use only HTML anchor tags with href attributes, along with descriptive anchor text, when including internal links in your site, so that users and search engines are clearly informed where each link leads.
Example code can be found in the below image:
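To illustrate the point (the URL and script-based link here are hypothetical), compare a crawlable anchor with a JavaScript-only link:

```html
<!-- Crawlable: an anchor tag with an href attribute and descriptive anchor text -->
<a href="/collections/running-shoes">Browse our running shoes</a>

<!-- Not reliably crawlable: no href, the link only works via JavaScript -->
<span onclick="goTo('running-shoes')">Browse our running shoes</span>
```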
To find out more about internal linking, read our blog post, from our senior strategist, Callum Lockwood, on internal linking strategies.
7. Ensure the Use of Pagination
The use of pagination is particularly important for eCommerce websites, especially if you have a large catalogue of products displayed on a page. While endless scrolling may appear useful from a user perspective, it is not the most SEO-friendly approach.
This is because search engine bots cannot scroll down a page the way a user would, or repeatedly click a “view more” button to access additional content. Sooner or later Googlebot, or any other search engine bot, will reach a limit and stop crawling your page. These pages would then be neglected in the search results, since they would not be crawled as often, which would directly harm your rankings on search result pages.
Therefore, pagination is the solution, and the best way to implement it is via href links, which allow Google to access the second page of your paginated product listing.
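As a sketch, assuming a hypothetical /products?page= URL structure, paginated links are plain anchors that Googlebot can follow:

```html
<nav aria-label="Pagination">
  <!-- Each page of the product listing is reachable via a crawlable href -->
  <a href="/products?page=1">1</a>
  <a href="/products?page=2">2</a>
  <a href="/products?page=3">3</a>
  <a href="/products?page=2">Next</a>
</nav>
```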
8. Lazy Loading Your Images is a Must
Googlebot does support lazy loading. However, as mentioned when discussing pagination, Google will not scroll down a page the way a user will. Rather, it simply resizes its virtual viewport to be long enough to view and crawl the content of the web page. This becomes an issue when lazy loading images.
What this means is that Google will never trigger the “scroll” event. For a user, scrolling down the page is what loads the images and content; since Googlebot instead resizes its viewport to see the content all at once, that scroll action is never fired, and the images that would appear gradually never emerge.
You can also see how Googlebot views content on a page differently to a user in the below diagram:
Rather, opt for the IntersectionObserver API, which fires a callback when any observed element becomes visible in the viewport. A callback is preferable to the scroll event, as it is more flexible and robust, and, as mentioned, it fires whenever an observed element enters the viewport, including Googlebot’s resized one. This makes it a much more compatible option for Googlebot.
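A minimal sketch of this pattern, assuming each lazy image carries a hypothetical data-src attribute and a lazy class:

```html
<img class="lazy" data-src="product-1.jpg" alt="Product 1">

<script>
  // Swap data-src into src once an image enters the viewport
  const observer = new IntersectionObserver((entries, obs) => {
    entries.forEach((entry) => {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.dataset.src;
        obs.unobserve(entry.target); // load each image only once
      }
    });
  });
  document.querySelectorAll("img.lazy").forEach((img) => observer.observe(img));
</script>
```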
There is also the ability to lazy load images within the browser directly. This is referred to as browser-level lazy loading or native lazy loading. Chromium-based browsers such as Google Chrome, Microsoft Edge and Opera, as well as Firefox, support native lazy loading directly in the browser.
You can find an example code snippet in the image below, illustrating browser level lazy loading:
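As a sketch (the file name and dimensions are placeholders), browser-level lazy loading is a single attribute on the image tag:

```html
<!-- The browser defers loading this image until it is near the viewport -->
<img src="product-1.jpg" loading="lazy" alt="Product 1" width="400" height="300">
```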
Lazy loading images will be exceptionally helpful for your eCommerce website. For example, on a category page with multiple rows of products, implementing lazy loading, as part of your page’s code, will help to render the product images that much more efficiently, as you scroll down the page. This will directly result in faster site performance for users and also search engine crawl bots.
You can read more about lazy loading here, in this blog post by web.dev.
9. Optimise Page Speed
There are a number of solutions that you can use to overcome this problem. Here are just a few:
10. Optimise Meta Data
Essentially, when users and search engine crawlers follow links to URLs on a website built with the React framework, they are not actually served separate static HTML files. Instead, the components, such as the headers, footers and body content of the root ./index.html file, are simply reshuffled and reorganised to present different content to the user. Hence the name ‘single-page application’: rather than serving multiple pages, the contents of the current page are reorganised to be displayed differently. Examples of single-page applications include sites like Netflix, Gmail and PayPal.
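The idea can be sketched in a few lines of plain JavaScript (the routes and markup here are hypothetical): one HTML shell stays loaded, and navigation just swaps which content is shown.

```javascript
// Hypothetical sketch of SPA routing: one HTML shell, content swapped per path.
const routes = {
  "/": "<h1>Home</h1>",
  "/account": "<h1>Your Account</h1>",
};

function render(path) {
  // In a browser this markup would be injected into the root element
  // of ./index.html; here we simply return it.
  return routes[path] ?? "<h1>Page not found</h1>";
}
```

No new document is fetched on navigation, which is exactly why a crawler that does not execute JavaScript sees only the empty shell.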
- Ensure the use of dynamic rendering
- Use pagination to help Googlebot crawl your products rather than continuous scrolling
- Make sure to use lazy loading in the browser to improve page speed
- Finally, optimise your pages’ metadata
To put it more succinctly, I’ll leave you with some words from Google’s John Mueller:
“The web has moved from plain HTML – as an SEO you can embrace that. Learn from JS devs & share SEO knowledge with them. JS is not going away.”