Single Page Application SEO: Tips & Tricks

How to approach the Challenge of Single Page Application SEO on a JavaScript website

Blog · Updated on May 26, 2021


Over the past few years, single-page web applications and their frameworks have gained immense popularity. But single page application SEO is a challenge. Organisations opt for single-page solutions because the pluses are substantial for both users and web developers. These applications are fast and user-friendly, support RESTful APIs and allow the processing workload to be distributed between the server and the client. And it is much easier to convert a single-page web application into a mobile one.

But with on-page SEO best practices heavily revolving around internal linking between different pages optimised for different search terms, an alternative approach is needed for single-page applications. The reality is that SEO for a single page application is more difficult than for a traditional multi-page web application. But it is still more than possible. Here’s how the K&C team approached the challenge for one of our clients.


Single Page App SEO 101 – Building The Sitemap.xml

The Sitemaps protocol is vital to SEO regardless of whether the application being optimised is single-page or multi-page. It informs search engines which pages on our website are available for crawling. That’s how Google and other search engines are able to understand what your application is about and which user searches it can answer with valuable information.

A sitemap is an .xml file that lists URLs for a site. There you can specify information about each page: when it was last updated, how often it changes and how important it is relative to other URLs on the site. Search engine web crawlers like Googlebot read this file to crawl your site more intelligently.

You can use sitemap tools such as Screaming Frog to help you easily create a perfect sitemap.

Single Page Application SEO Sitemap.xml example:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://kruschecompany.com/</loc>
    <xhtml:link
      rel="alternate"
      hreflang="de"
      href="https://kruschecompany.com/de/"
    />
    <xhtml:link
      rel="alternate"
      hreflang="en"
      href="https://kruschecompany.com/"
    />
    <lastmod>2016-05-17</lastmod>
    <changefreq>yearly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>

Creating a sitemap doesn’t guarantee that all the items in your sitemap will be crawled and indexed immediately, as Google’s processes rely on complex algorithms to schedule crawling. However, your site will certainly benefit from having a sitemap, and you’ll never be penalized for having one.
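For a single-page application, the sitemap is typically generated from the app's route configuration rather than maintained by hand. The sketch below shows one way to do that in JavaScript; the route list, base URL and field names are illustrative assumptions, not part of any particular framework's API.

```javascript
// Minimal sketch: build a sitemap.xml string from a list of SPA routes.
// The routes and base URL here are hypothetical examples.
function buildSitemap(baseUrl, routes) {
  const urls = routes
    .map(
      (route) =>
        `  <url>\n` +
        `    <loc>${baseUrl}${route.path}</loc>\n` +
        `    <lastmod>${route.lastmod}</lastmod>\n` +
        `    <changefreq>${route.changefreq}</changefreq>\n` +
        `    <priority>${route.priority}</priority>\n` +
        `  </url>`
    )
    .join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${urls}\n` +
    `</urlset>`
  );
}

// Example usage with made-up routes:
const xml = buildSitemap("https://example.com", [
  { path: "/", lastmod: "2021-05-26", changefreq: "yearly", priority: "1.0" },
  { path: "/blog/", lastmod: "2021-05-26", changefreq: "weekly", priority: "0.8" },
]);
console.log(xml);
```

The generated file would then be served as a static asset at the site root, so crawlers can fetch it at a predictable URL.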

The JavaScript Problem For Single Page App SEO

Single-page applications are built on JavaScript. Because they rely on client-side rendering, their content is less accessible to search crawlers than plain server-rendered HTML. So how do you let a search engine crawler see your single-page application without resorting to an ugly “#!” in your URL?

Historically, you could flag a single-page application to crawlers by adding a <meta name="fragment" content="!"> tag to the <head> of the site, although Google has since deprecated this AJAX-crawling scheme. The best way to indicate which pages should be indexed is a sitemap.xml file. It’s like saying to the search engine: “I’d appreciate it if you could focus on these particular URLs.” And search engine crawlers in turn appreciate polite web developers!

The sitemap allows you to list the canonical URLs of the site (non-canonical URLs don’t belong in it) along with page priority, last modification date, change frequency and, crucially for multi-language websites, alternate hreflang links.

It is worth mentioning that sitemaps require some promotion. A search crawler’s first stop is usually the robots.txt file. Adding the line “Sitemap: https://www.example.com/sitemap.xml” there points the crawler to your sitemap’s location on its initial and all subsequent visits.
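As an illustration, a robots.txt file that allows crawling and points to the sitemap might look like this (the domain is a placeholder):

```text
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```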

The second step of good sitemap promotion is submitting it to the search engines’ webmaster tools (such as Google Search Console). This is a good way to prompt a crawl, which will usually then take place over the next few hours.

The JS client-side rendering issue for SEO

JavaScript’s client-side rendering also comes with SEO considerations. Google’s Martin Splitt explains:

“If you use a JavaScript framework, the default is client-side rendering. This means you send the bare-bones HTML over and then a piece of JavaScript, and the JavaScript fetches and assembles the content in the browser.”

Botify suggests thinking of client-side rendering like ordering furniture from IKEA. It’s not already assembled but comes flat-packed in parts that have to be assembled.

[Image: Client-side rendering in JS apps. Source: Botify]

One drawback of client-side rendering is how search engine bots like Googlebot access content. Googlebot uses a second wave of indexing: the HTML of a page is crawled and indexed first, and the bot comes back to render the JavaScript when resources become available. This two-phased approach means that JavaScript content can sometimes be missed and left out of Google’s index.
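To make the flat-pack analogy concrete, here is a minimal sketch of what client-side rendering does: the server ships an essentially empty container, and JavaScript fetches data and assembles the markup in the browser. The endpoint, data shape and markup are hypothetical.

```javascript
// Hypothetical sketch of client-side rendering. The server sends bare-bones
// HTML such as <div id="app"></div>; this script assembles the real content.

// The "assembly" step: turn fetched data into markup.
function renderArticle(article) {
  return `<article><h1>${article.title}</h1><p>${article.body}</p></article>`;
}

// In a browser this would fetch from a real API and inject into the DOM:
// fetch("/api/article/42")
//   .then((res) => res.json())
//   .then((article) => {
//     document.getElementById("app").innerHTML = renderArticle(article);
//   });

// Until that fetch-and-render step runs, a crawler that does not execute
// JavaScript sees only the empty container — hence the second-wave problem.
const html = renderArticle({ title: "SPA SEO", body: "Rendered on the client." });
console.log(html);
```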

Creating Social Media Meta Tags

Integrating a web application with social media is vital in the modern online space. Again, that is true for both single-page and multi-page application SEO. However, it is especially crucial for the former. Protocols like Open Graph optimize and structure the information we want to share in social networks.

Originally created for Facebook, the Open Graph protocol is now used to control the data a user shares through the URL link to third-party website content.

To integrate OG (Open Graph) into your website, all you need to do is put special <meta> tags into the <head> section of the HTML page you want to make available to share.

OG meta tags are responsible for how your web page will look when shared on social media. When a user shares a URL for the first time, Facebook’s crawler analyzes the page, collects information about it and creates a graphical object, which will then be shown on Facebook pages.

There are several required tags for OG:

– og:title – the name (e.g. of the article);

– og:description – a short description of the page content;

– og:type – the data type of the page content (the default is “website”);

– og:image – the URL of an image to represent the page;

– og:url – the canonical URL of the page.

If the page doesn’t have OG <meta> tags, the Facebook crawler will automatically search for required content and independently decide how best to deliver the information found on your page. This may not correspond to your own preferences.

So, adding Open Graph meta tags to your pages is the best way to integrate the website nicely with social networks. It is easy to do if you have worked with meta tags before.

<head>
  <meta property="og:title" content="Some Title"/>
  <meta property="og:description" content="Short description"/>
  <meta property="og:type" content="article"/>
  <meta property="og:image" content="https://example.com/progressive/image.jpg"/>
  <meta property="og:url" content="https://example.com/current-url"/>
</head>
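Note that Facebook’s crawler does not execute JavaScript, so for a single-page application these tags should be present in the HTML the server returns, not injected client-side on route changes. A hypothetical helper for generating the tags server-side might look like this (the function and its defaults are an illustrative sketch, not a library API):

```javascript
// Hypothetical helper: build OG <meta> tags as a string for server-rendered
// HTML. Facebook's crawler does not run JavaScript, so the tags must be in
// the initial response rather than added by client-side code.
function ogMetaTags({ title, description, type = "website", image, url }) {
  const props = {
    "og:title": title,
    "og:description": description,
    "og:type": type,
    "og:image": image,
    "og:url": url,
  };
  return Object.entries(props)
    .map(([property, content]) => `<meta property="${property}" content="${content}"/>`)
    .join("\n");
}

const tags = ogMetaTags({
  title: "Some Title",
  description: "Short description",
  type: "article",
  image: "https://example.com/progressive/image.jpg",
  url: "https://example.com/current-url",
});
console.log(tags);
```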

Adding The rel=canonical Link

The rel=canonical link element is an HTML element that helps web developers avoid duplicate content. Using it will improve a single page application’s SEO, as Google’s bots are not keen on identical or very similar content across a website.

The idea is simple: if you have a few similar pieces of content, you choose one version and make it “canonical”. You then inform search engines, which will focus on your chosen piece of content and largely ignore the other duplicate or similar instances.

Choosing a proper canonical URL for every set of similar URLs improves the SEO of your site. Because the search engine knows which version is authoritative, it can count links to all the different versions as links to that single canonical version.

If you want to use the rel="canonical" link element for a single-page application, you have to generate the URLs dynamically.

Also, keep in mind that the canonical URLs and the URLs in sitemap.xml must match!
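One way to generate canonical URLs dynamically is to normalise the current location — dropping the query string and hash fragment — before writing the link tag. The sketch below assumes those particular normalisation rules; adapt them to your own URL scheme.

```javascript
// Sketch: derive a canonical URL from the current location by stripping the
// query string and hash fragment. The normalisation rules are assumptions —
// adjust them to match the URLs listed in your sitemap.xml.
function canonicalUrl(href) {
  const u = new URL(href);
  u.search = "";
  u.hash = "";
  // Ensure a trailing slash so the canonical matches the sitemap entries.
  if (!u.pathname.endsWith("/")) u.pathname += "/";
  return u.toString();
}

// In the browser you would then write the tag into <head>, e.g.:
// const link = document.createElement("link");
// link.rel = "canonical";
// link.href = canonicalUrl(window.location.href);
// document.head.appendChild(link);

console.log(canonicalUrl("https://example.com/blog?utm_source=x#section"));
```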

Single Page Application SEO – Challenge Accepted

It is safe to say that single page application SEO does present challenges and probably wouldn’t be an SEO consultant’s first choice. The JavaScript framework’s inaccessibility to search engine crawlers can be worked around, as explained above. But the single-page structure does limit the internal linking and content silo approach that SEO experts like to work with.

But it is certainly not impossible to get your single page app indexed by Google and other search engines. It will, however, require some additional time and effort. And the business considerations behind opting for a single page application may well mean it is worth the extra SEO work.

K&C – Munich-based Web Developers, DevOps and Cloud Services

Krusche & Company was established in Munich over 20 years ago now. A full tech stack IT services company, our agile development teams and consultants have super-charged the business results of some of Europe’s best-known brands as well as top SMEs and highly promising start-ups.

Our German management and nearshored development talent offer the perfect blend of efficiency, reliability and price. If you need outsourced help you can rely on to develop a single page web application or a full enterprise-level portal, or to migrate your digital architecture to the cloud, we’d be delighted to hear from you.
