Improve SEO with Akamai and Prerender.io

Akamai works easily with a prerendering service like Prerender.io. Since search engine crawlers may only see your JavaScript tags and not the rendered content, a prerender service can improve your SEO by executing your JavaScript and serving the resulting static HTML for you.

This article covers the basics of a prerender service and gives you a rundown of how you can get started.

How Does Prerender.io Work?

More and more websites are built with JavaScript frameworks like React, Vue, Angular, and Backbone, and require JavaScript execution to display their content.

Browsers and real users aren’t the only ones consuming your content; search engines and social networks are crawling your site, too. When they crawl your website, they often see only the JavaScript tags and not the content.

Although Google and some smarter bots have improved and can crawl pages more effectively today, there is often still a drop in your SEO score. JavaScript execution is expensive, so pages are added to a queue to be crawled a second time by a JS-enabled crawler. This second pass can come days or weeks later and can therefore have a severe impact on your SEO efforts.

In the SEO world, the trend is to serve pre-rendered HTML pages when a crawler visits the site.

Prerender.io is a leader in prerendering services. The service executes the JavaScript in a browser, saves the resulting static HTML, and returns that static HTML to crawlers.

The service expects the full URL as the path.

Here’s the difference between a standard URL and a pre-rendered URL.

A URL sent to Prerender.io has the full page URL appended to the service’s base URL (illustrated here with example.com):

  • Standard page: https://www.example.com/
  • Pre-rendered page: https://service.prerender.io/https://www.example.com/

Prerender.io and Akamai

Akamai itself does not offer prerendering; however, it works nicely together with a prerender service like Prerender.io in both deployment options (customer origin or SaaS offering).

Here is how you can easily add a service like Prerender.io into your delivery configuration:

  1. Detect crawlers
  2. Send crawler requests to the prerender service
  3. Modify the request path
  4. Do not cache

Step 1: Detect crawlers

Detecting the key crawlers can be done based on the User-Agent request header. (If you use Bot Manager, you’ll have more flexibility and better results.)
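Outside of Akamai, the detection logic amounts to a substring match of the User-Agent header against a list of known crawler tokens. A minimal sketch in Python (the token list below is a hypothetical starting point, not an exhaustive or maintained list):

```python
# Hypothetical sketch of user-agent based crawler detection.
# A production list should be curated and kept up to date.
CRAWLER_TOKENS = (
    "googlebot",
    "bingbot",
    "yandex",
    "baiduspider",
    "facebookexternalhit",
    "twitterbot",
    "linkedinbot",
)

def is_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent header looks like a known crawler."""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)
```

For example, `is_crawler("Mozilla/5.0 (compatible; Googlebot/2.1)")` matches, while a regular desktop browser string does not.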

Step 2: Send crawler requests to prerender service

There’s nothing fancy to do here. Whenever one of these user agents is detected, we forward the request to another origin: either the Prerender.io SaaS endpoint or a customer web server running the open-source Prerender server. Make sure to set these two key settings to Origin Hostname:

  • Cache key hostname: Origin Hostname
  • Forward host header: Origin Hostname

Note: Depending on the setup, a token might need to be sent to the service (this can be done using the Modify Outgoing Request Header behavior).

In practice this means you need to set the X-Prerender-Token header on the forwarded request to the token you got from the Prerender.io website.

Step 3: Modify request path

We still need to change the outgoing request path: it must be the full URL of the page. This can be achieved by using property variables together with the Modify Outgoing Request Path behavior.
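The rewrite itself is simple string construction: the outgoing path becomes the original scheme, host, and path concatenated into one string. A hedged sketch of what the behavior computes (in Akamai this is assembled from property variables, not code):

```python
def prerender_path(scheme: str, host: str, path: str) -> str:
    """Build the outgoing request path expected by the prerender
    service: the full original URL, used as the path."""
    return f"/{scheme}://{host}{path}"
```

For example, a crawler request for `/products/42` on `www.example.com` over HTTPS becomes the path `/https://www.example.com/products/42`.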

Step 4: Do not cache

A natural next step would be to cache the static results on the Edge. However, although crawlers typically generate a lot of traffic, they visit each page only once per crawl. By the time they crawl the same page a second time, the cached version has either been evicted or is no longer up to date.

So checking the edge cache, then the parent cache, and then going forward to origin is slower than going straight to origin. Going directly to origin also gives you the benefits of SureRoute and persistent connections.


This is a very simple way to use Akamai to solve this common problem as people move to JavaScript frameworks. Using Akamai and Prerender.io allows you to quickly improve your SEO without needing to manually manage meta tags or rewrite your code.

We hope this improves your SEO and makes your life a little bit easier!