Single Page Application SEO: a checklist of what you need to be aware of

How to approach the challenge of Single Page Application SEO on a JavaScript website

Blog · Updated on August 7, 2023

John Adam, K&C Head of Marketing



Over the past few years, single-page applications (SPAs) have gained in popularity. But SPA SEO is a challenge. Organisations opt for single-page solutions because the pluses are substantial for both users and web developers. These web development solutions are fast and user-friendly, support RESTful APIs and enable the distribution of processing workload between the server and client. And it is much easier to convert a single-page application into a native mobile app.

But with on-page SEO best practices heavily revolving around internal linking between different pages optimised for different search terms, an alternative approach is needed for single-page applications. The reality is, single-page application SEO is more difficult than for a traditional multi-page web application. But it is still more than possible. Here’s how the K&C team approaches the challenge:


Single Page App SEO 101 – building the sitemap.xml

The sitemaps protocol is vital to SEO regardless of whether the application being optimised is single or multi-page. It informs search engines which pages on our website are available for crawling. That is how Google and other search engines understand what your application is about and which user searches it can answer with valuable information.

A sitemap is an .xml file that lists the URLs for a site. For each URL you can specify metadata: the last modification time, how frequently the page changes and its priority relative to other URLs on the site. Search engine web crawlers like Googlebot read this file to crawl your site in an informed way, making the process more efficient.

You can use crawling tools such as Screaming Frog to help you easily create a sitemap, and AI tools such as ChatGPT can also help you create one quickly without any coding knowledge or previous experience of the process.

Single Page Application SEO Sitemap.xml example:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url><loc>https://example.com/</loc></url>
</urlset>

Creating a sitemap doesn’t guarantee that all the items in it will be crawled and indexed immediately. However, your site will certainly benefit from having one: content will be indexed more quickly and efficiently than it would be without.

The JavaScript accessibility challenge for Single Page App SEO

Single-page applications are often, even typically, built with JavaScript. But because SPAs rely on client-side rendering, and search crawlers like Googlebot don’t always execute JavaScript reliably, crawlers can find it difficult to access and ‘read’ the text content published on SPAs unless the technical set-up lends a helping hand. So how do you help a search engine crawler understand the content published on a single-page application without using an ugly “#!” in its URL?

Historically, you could flag an AJAX-driven single-page application to search crawlers by adding the <meta name="fragment" content="!"> tag at the top of the page, although Google has since deprecated this AJAX crawling scheme. Today, the best way to indicate the pages that should be indexed is a sitemap.xml file. It’s like saying to the search engine: “I’d appreciate it if you could focus on these particular URLs.” And search engine crawlers in turn appreciate polite web developers!

The sitemap allows you to list the canonical URLs of the site (non-canonical URLs should not be included) together with page priority, last modification date, change frequency and, crucially for multi-language websites, <hreflang> alternate links.
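For multi-language sites, hreflang alternates are expressed inside a sitemap URL entry as xhtml:link elements; this requires the xhtml namespace (xmlns:xhtml="http://www.w3.org/1999/xhtml") to be declared on the urlset element. The URLs below are placeholders:

```xml
<url>
  <loc>https://example.com/en/page/</loc>
  <!-- one alternate per language version, including the page itself -->
  <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/page/"/>
  <xhtml:link rel="alternate" hreflang="de" href="https://example.com/de/page/"/>
</url>
```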

It is worth mentioning that sitemaps require some promotion. You want to make sure a search crawler’s first stop is the robots.txt file. Adding a “Sitemap:” line pointing to your sitemap’s location makes sure it is found on a crawler’s initial and all subsequent visits.
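A robots.txt advertising the sitemap’s location might look like this (example.com is a placeholder for your own domain):

```text
# https://example.com/robots.txt
User-agent: *
Allow: /

# Point crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```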

The second step in sitemap promotion is submitting it to the search engines’ webmaster tools, such as Google Search Console. This is a good way to prompt a crawl, which will usually then take place over the next few hours.

Client-side rendering

JavaScript’s client-side rendering also comes with SEO considerations. Google’s Martin Splitt explains:

“If you use a JavaScript framework, the default is client-side rendering. This means you send the bare-bones HTML over and then a piece of JavaScript, and the JavaScript fetches and assembles the content in the browser.”

Botify suggests thinking of client-side rendering like ordering furniture from IKEA. It’s not already assembled but comes flat-packed in parts that have to be assembled.

Client-side rendering in JS apps (Source: Botify)

One drawback of client-side rendering is how search engine bots like Googlebot access content. Googlebot has something called a second wave of indexing: the HTML of a page is crawled and indexed first, and the bots come back to render the JavaScript when resources become available. This two-phased approach means that JavaScript content can sometimes be missed and not included in Google’s index.
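To make the mechanics concrete, here is a minimal sketch of client-side rendering: the server ships a bare HTML shell, and a JavaScript function assembles the markup from fetched data. The names here are illustrative, not taken from any particular framework.

```javascript
// Minimal client-side rendering sketch: JavaScript assembles the markup
// that a crawler only sees after the script has executed.
// `renderArticle` and the article object shape are illustrative assumptions.
function renderArticle(article) {
  return [
    '<article>',
    `  <h1>${article.title}</h1>`,
    `  <p>${article.body}</p>`,
    '</article>',
  ].join('\n');
}

// In the browser this would typically run after a fetch() resolves:
// fetch('/api/article').then((r) => r.json()).then((article) => {
//   document.getElementById('app').innerHTML = renderArticle(article);
// });
```

Until that script runs, the page’s HTML contains none of the article content, which is exactly why Googlebot’s deferred rendering wave matters for SPAs.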

Creating social media meta tags

Integrating a web application with social media is vital in the modern online space. That is true for single-page and multi-page application SEO alike, but it is especially crucial for the former. Protocols like Open Graph optimise and structure the information we want to share on social networks.

Originally created for Facebook, the Open Graph protocol is now widely used to control how third-party content appears when a user shares its URL on social networks.

To integrate Open Graph (OG) into your website, all you need to do is put special <meta> tags into the <head> section of the HTML page you want to make shareable.

OG meta tags are responsible for how your web page will look when shared on social media. When a user shares a URL for the first time, Facebook’s crawler analyses the page, collects information about it and creates a graphical object, which is then shown on Facebook.

There are several required tags for OG:

– og:title – the title (e.g. of the article);

– og:description – a short description of your content;

– og:type – the type of the page content (the default is “website”);

– og:image – the URL of an image to represent the page;

– og:url – the canonical URL of the page.

If the page doesn’t have OG <meta> tags, the Facebook crawler will automatically search for required content and independently decide how best to deliver the information found on your page. This may not correspond to your own preferences.

Setting up Open Graph meta tags on your page is the best way to integrate the website nicely with social networks. This is something that is easy to do if you have previous experience with meta tags.

  <meta property="og:title" content="Some Title"/>
  <meta property="og:description" content="Short description"/>
  <meta property="og:type" content="article"/>
  <meta property="og:image" content="https://example.com/image.jpg"/>
  <meta property="og:url" content="https://example.com/article/"/>
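Because Facebook’s crawler does not execute JavaScript, an SPA usually has to render these tags server-side for each shareable route. A small helper for generating them might look like the following sketch (the page object’s shape is an assumption for illustration):

```javascript
// Sketch: generate Open Graph <meta> tags from a page object.
// The `page` shape ({ title, description, type, image, url }) is an
// illustrative assumption, not a fixed API.
function ogTags(page) {
  const escape = (s) =>
    String(s).replace(/&/g, '&amp;').replace(/"/g, '&quot;').replace(/</g, '&lt;');
  return Object.entries({
    'og:title': page.title,
    'og:description': page.description,
    'og:type': page.type || 'website', // the protocol's default type
    'og:image': page.image,
    'og:url': page.url,
  })
    .filter(([, value]) => value != null) // skip tags with no value
    .map(([property, value]) => `<meta property="${property}" content="${escape(value)}"/>`)
    .join('\n');
}
```

A server-side route handler could call this helper and inject the result into the <head> of the HTML it returns for each URL.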

Adding a rel=canonical link

The rel=canonical link element is an HTML element that helps web developers avoid duplicate content. Using it will improve a single-page application’s SEO as Google’s bots are not keen on identical or very similar content across a website.

The idea is simple: if you have a few similar pieces of content, which can confuse Google as to which to serve (sometimes resulting in none of the pages with duplicate content being very visible in the SERPs – search engine results pages) you choose one version and make it “canonical”. Search engines will then focus on your chosen piece of content, largely ignoring the other duplicate or similar instances.

Choosing a proper canonical URL for every set of similar URLs improves the SEO of your site. Because the search engine knows which version is authoritative, it can consolidate the links pointing at all the different versions and count them as links to that single version.

If you want to use the rel="canonical" link element in a single-page application, you have to generate the URLs dynamically.

Also, keep in mind that the canonical URLs and the URLs listed in sitemap.xml must match!
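As a sketch of dynamic generation, the helper below derives a clean canonical URL from the current route, stripping query strings, fragments and trailing slashes so it matches the URL listed in sitemap.xml. Function and variable names are illustrative:

```javascript
// Sketch: derive the canonical URL for the current route so it matches
// the clean URL listed in sitemap.xml. Names are illustrative assumptions.
function canonicalUrl(origin, path) {
  // Drop query string and hash fragment, then any trailing slashes;
  // fall back to "/" for the site root.
  const clean = path.split(/[?#]/)[0].replace(/\/+$/, '') || '/';
  return origin + clean;
}

// In the browser, after each client-side route change:
// let link = document.querySelector('link[rel="canonical"]') ||
//   document.head.appendChild(Object.assign(
//     document.createElement('link'), { rel: 'canonical' }));
// link.href = canonicalUrl(location.origin, location.pathname);
```

Whether you keep or strip trailing slashes is a site-level choice; the important thing is that the canonical element, the sitemap and your internal links all agree on one form.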

Single-Page Application SEO – challenge accepted

It is safe to say single-page application SEO does present challenges and probably wouldn’t be an SEO consultant’s first choice. However, JavaScript’s default inaccessibility to search engine crawlers can be worked around as explained. The single-page structure does limit the internal linking and content silo approach that SEO experts like to work with.

But it is certainly not impossible to index your single-page app on Google and other search engines and have it rank well for your key terms. However, it does mean some additional effort should be expected if business considerations see you opt for a single-page application.  

K&C – Munich-based nearshore IT outsourcing provider

K&C (Krusche & Company) is a Munich-based IT outsourcing company that recruits tech talent in nearshore and midshore markets. We complement or provide your software development and other IT resources flexibly and conveniently through: 

  • IT team augmentation  
  • Dedicated teams  

You can choose from flexible cooperation models that range from body leasing to full project/delivery management.  

We have physical offices in 4 near and mid-shore locations and a remote presence across many more.  

Our USP is IT outsourcing partnerships lasting 20+ years with several of Germany’s and Europe’s most recognisable brands. We promise to serve your IT outsourcing and staffing needs just as well.

Core competencies: web development, Cloud native development, mobile development, DevOps, QA/testing, UI/UX design, data engineering, blockchain.  

Can We Help You With Your Next JavaScript Project?

Flexible models to fit your needs!