WHY IS NO ONE TALKING ABOUT DYNAMIC RENDERING?

A Major SEO Policy Change

Think about these statistics for a minute. In 2019, Google drove $335 billion in revenue for 1.3 million American businesses. Meanwhile, approximately 5.6 billion search queries were entered into their search engine every day, at various stages of the customer journey.

More and more of the American economy runs through Google. And while Google’s algorithm is often misrepresented as a “black box”, they’ve actually been quite public about what they want from websites. This is especially true from a technical perspective.

So when they make SEO recommendations, shouldn’t we be listening and acting accordingly? I scratch my head seeing fellow SEOs head in the opposite direction.

In 2018, Google endorsed dynamic rendering as a JavaScript SEO solution, and that's what I want to talk about in this article. In my view, it is one of the biggest policy changes they have made in the last decade. Yet very few people in our industry have acted on it.

Introducing Dynamic Rendering

Dynamic rendering is the process of serving a page differently depending on which client requests it. A common example is viewing a website on your mobile phone and then getting a slightly different experience when you open that same website on a desktop.

For SEO, there is an interesting value proposition. JavaScript-powered websites, for example, are able to serve a buffed-out version to humans while serving a stripped-down, pre-rendered, flat-HTML version to Search Bots. As a result, Search Bots can crawl, understand, and index the content more efficiently.

The whole process is considered dynamic because your website detects whether a human or a Search Bot is requesting the content. You just need to add a tool or step in your server infrastructure to act as the renderer.
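In practice, that step is often just a piece of middleware that inspects the User-Agent header. Here is a minimal sketch, assuming a Node 18+ Express server and a hypothetical prerender service running at a separate URL; the bot pattern and endpoints are illustrative, not a definitive list:

// Minimal dynamic rendering middleware sketch for Express (Node 18+,
// where fetch is built in). PRERENDER_URL points at a hypothetical
// prerender service; the bot pattern below is illustrative only.
import express, { Request, Response, NextFunction } from "express";

const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider|duckduckbot/i;
const PRERENDER_URL = "http://localhost:3001/render"; // hypothetical renderer

const app = express();

app.use(async (req: Request, res: Response, next: NextFunction) => {
  const userAgent = req.headers["user-agent"] ?? "";
  if (!BOT_PATTERN.test(userAgent)) {
    return next(); // humans get the normal JavaScript-powered experience
  }
  try {
    // Search Bots get flat, pre-rendered HTML from the renderer instead
    const target = `${req.protocol}://${req.get("host")}${req.originalUrl}`;
    const rendered = await fetch(`${PRERENDER_URL}?url=${encodeURIComponent(target)}`);
    res.status(rendered.status).send(await rendered.text());
  } catch (err) {
    next(err);
  }
});

app.use(express.static("dist")); // the client-side app for everyone else

app.listen(3000);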

Dynamic Rendering Use Cases

Per Google's guidance, you should implement dynamic rendering if:

- Your website uses JavaScript to generate public, indexable content that changes rapidly, or
- Your website relies on modern JavaScript features that Search Bots don't support.

Google Is Struggling To Keep Up

The internet grows more complex every day. By 2018, there were over 130 trillion documents on the web. Google has limited time and resources to fulfill its mission of organizing and serving the world's information, and this growth has made its job more daunting. They need the process of crawling and indexing the internet to be fast in order to keep the SERPs as fresh as possible.

In tandem with this explosion of content, front-end coding languages have grown more complex as well. Things reached a tipping point in May 2018, when Google just sort of threw up their hands, concluded the front-end was never going to get any simpler, and decided they would need to request separate, SEO-friendly versions of websites to crawl and index instead.

This last point is critical because Search Bots don't crawl and index all web pages in the same way. JavaScript-powered websites, in particular, demand so many resources from Google that crawling and indexing must be performed in a two-wave process.

In the first wave, Google crawls and indexes the initial HTML and CSS. The page then goes into a queue, and the remaining JavaScript-generated content is rendered in a second wave as resources become available. The fallout is that JavaScript-powered websites may not be completely crawled and indexed for days, if not weeks, after being published. And if you have a robust content marketing operation in place, this two-wave process could throw your organic channel into crisis.

The Benefits Of Dynamic Rendering

There are numerous benefits to implementing dynamic rendering and queuing up the perfect crawl experience for Google: JavaScript content gets crawled and indexed in a single pass rather than the two-wave process, new pages reach the SERPs faster, and each page is lighter for Googlebot to fetch, so your crawl budget stretches further.

Our Solution: The Huckabuy Cloud

After watching the Google I/O conference in May 2018, we decided to build a software service called the Huckabuy Cloud that leveraged dynamic rendering to give Search Bots the ultimate crawling experience. It was all part of our plan to queue up what we call “Google’s Perfect World” of websites designed specifically for their search engine. 

Huckabuy Cloud essentially takes websites with a lot of dynamic content, converts their pages into flat HTML, adds structured data markup, and serves them from caching layers so Search Bots can crawl and index the information almost instantaneously.
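To make the structured data step concrete: this means schema.org markup, typically injected into the pre-rendered page as JSON-LD. Here is a minimal sketch of the idea, with placeholder values rather than Huckabuy's actual output:

// Sketch: injecting schema.org structured data into a flat, pre-rendered
// page as a JSON-LD script tag. The values below are placeholders.
function addStructuredData(html: string, data: object): string {
  const jsonLd = `<script type="application/ld+json">${JSON.stringify(data)}</script>`;
  return html.replace("</head>", `${jsonLd}</head>`);
}

const withMarkup = addStructuredData(
  "<html><head><title>Example</title></head><body>...</body></html>",
  {
    "@context": "https://schema.org",
    "@type": "Article",
    headline: "Why Is No One Talking About Dynamic Rendering?",
  }
);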

Effectively, our software strips out all the code bloat and non-essential aspects of a page that Google doesn't care about. That lightens everything up, makes it faster, and then we host it. Our partnership with Cloudflare enables edge delivery of this content, which further improves performance.
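The general pattern of serving cached, pre-rendered HTML at the edge looks something like the Cloudflare Worker sketch below. To be clear, this is an illustration of the technique, not Huckabuy's actual code, and the renderer origin is hypothetical:

// Illustrative edge-caching sketch as a Cloudflare Worker (types from
// @cloudflare/workers-types); a generic pattern, not Huckabuy's code.
const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider|duckduckbot/i;

export default {
  async fetch(request: Request, env: unknown, ctx: ExecutionContext): Promise<Response> {
    const userAgent = request.headers.get("user-agent") ?? "";
    if (!BOT_PATTERN.test(userAgent)) {
      return fetch(request); // humans pass through to the origin site
    }

    // Serve Search Bots straight from the edge cache when possible
    const cache = caches.default;
    const cached = await cache.match(request);
    if (cached) return cached;

    // Cache miss: fetch flat HTML from a hypothetical renderer origin,
    // then store it at the edge for subsequent crawls.
    const rendered = await fetch(
      `https://renderer.example.com/?url=${encodeURIComponent(request.url)}`
    );
    const response = new Response(rendered.body, rendered);
    response.headers.set("cache-control", "public, max-age=3600");
    ctx.waitUntil(cache.put(request, response.clone()));
    return response;
  },
};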

We have a number of enterprise customers, like SAP and EverQuote, who rely on the Huckabuy Cloud. Their websites are often very complicated, with a lot of business requirements and unaccounted-for indexing issues. We make it so Google can cruise through these websites in a short amount of time and find what it needs. As a result, the sites get properly indexed and start commanding the search attention they deserve.

In Conclusion: Technical SEO Has Come Full Circle

As we enter a new decade, SEO is changing in significant ways. The new innovations being rolled out or sitting in the pipeline – things like zero-click searches, voice search, and rich results – are all powered by technical optimizations that improve the communication between your website and Google.

This facet of SEO can no longer be ignored. 

If you aren’t establishing the perfect crawl experience for Search Bots, you will miss out on valuable organic channel opportunities. Get it right, and you will earn due credit with Google and all the associated search exposure you deserve.

Dynamic rendering is a key part of the equation. I hope this is a wake-up call for some.