Planning is the most important part of a re-platforming project.
To ensure things go smoothly, we need to be aware of what is likely to change. Site structure, URLs, HTML markup and schemas, internal linking: a re-platforming can bring a lot of change that, if not handled correctly, can spook the search engines.
From an SEO point of view, unless it has to change, it’s often best for it not to change. Minimising the impact provides a solid base and maintains the authority the site has built up over time. However, we know change is a must for most, so that leads to step 2…
A pre-launch audit also benchmarks your current metrics, giving you something to compare the new site’s performance against. It should cover all of your key performance indicators (KPIs), including traffic, keyword visibility, authority, page speed and indexed pages.
We use three pieces of software to gather this data and then combine their outputs, ensuring all the bases are covered.
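If it helps to picture the output, here’s a minimal sketch of how a baseline could be recorded for later comparison; every figure in it is a made-up placeholder you’d replace with numbers from your own analytics and crawling tools.

```python
# Sketch: record a dated baseline of the KPIs mentioned above, so post-launch
# numbers have something to be compared against. All values are placeholders.
import csv
from datetime import date

baseline = {
    "monthly_organic_sessions": 42_000,
    "keyword_visibility_score": 18.4,
    "domain_authority": 38,
    "average_page_speed_ms": 1450,
    "indexed_pages": 3120,
}

with open(f"seo_baseline_{date.today().isoformat()}.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["metric", "value"])
    writer.writerows(baseline.items())
```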
Using the audit of the existing store, we have a list of the indexable and non-indexable URLs on the current domain.
Next? Match them to their equivalents in the new store, paying close attention to any URL without a pattern match (we’ll deal with those via a 404). Transferring authority in this one-to-one manner via 301 redirects ensures a seamless experience for our customers and for the search engines.
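To illustrate the mapping exercise (rather than describe our exact tooling), here’s a minimal sketch. It assumes the old and new URL lists have been exported from the crawl as plain text files and that the new platform follows a simple /products/ pattern; both assumptions are illustrative, and anything without a one-to-one match is flagged for manual review.

```python
# Sketch: build a 301 redirect map from crawled URL lists.
# old_urls.txt and new_urls.txt are hypothetical crawl exports, one path per line.
import csv

def load_paths(filename):
    with open(filename) as f:
        return {line.strip() for line in f if line.strip()}

old_paths = load_paths("old_urls.txt")
new_paths = load_paths("new_urls.txt")

redirects, unmatched = [], []
for path in sorted(old_paths):
    if path in new_paths:
        # Identical path on the new platform: no redirect needed.
        continue
    # Example pattern match: the new platform prefixes products with /products
    candidate = "/products" + path
    if candidate in new_paths:
        redirects.append((path, candidate))  # clean one-to-one 301
    else:
        unmatched.append(path)  # review manually; may end up as a 404

with open("redirect_map.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["old_path", "new_path", "status"])
    writer.writerows((old, new, 301) for old, new in redirects)

print(f"{len(redirects)} redirects mapped, {len(unmatched)} URLs need manual review")
```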
We’ll also review the content and look for opportunities to add, remove or change it to give us an uplift in natural search rankings. After all, we’re here to improve things, not just move from one store to another.
Take your crawl results from the audit and optimise metadata and alt text wherever it is missing, incorrect or duplicated.
Writing unique, descriptive meta tags and alt text for all of your images will not only benefit your SEO strategy but also improve site accessibility.
Win-win.
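As an example of the sort of check this involves, the sketch below flags missing and duplicated meta descriptions in a crawl export; the filename and column names are assumptions, so adjust them to whatever your crawler actually produces.

```python
# Sketch: flag missing and duplicated meta descriptions in a crawl export.
# The CSV filename and column names are assumptions, not a specific tool's format.
import csv
from collections import defaultdict

pages_by_description = defaultdict(list)
missing = []

with open("crawl_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        description = (row.get("meta_description") or "").strip()
        if not description:
            missing.append(row["url"])
        else:
            pages_by_description[description].append(row["url"])

duplicates = {d: urls for d, urls in pages_by_description.items() if len(urls) > 1}

print(f"{len(missing)} pages with no meta description")
for description, urls in duplicates.items():
    print(f"Duplicated across {len(urls)} pages: {description[:60]}...")
```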
An often overlooked part of the process is updating internal links to point at the new URLs rather than letting the redirects created above handle them.
Leaving old links in your body content results in redirect chains, which Google isn’t a fan of (and which aren’t a great experience for the user, either).
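One quick way to catch these is to follow each old internal link and count the hops, along the lines of the sketch below; it uses the third-party requests library, and the URL list is a placeholder.

```python
# Sketch: detect redirect hops and chains left behind by un-updated internal links.
# Requires the third-party `requests` library; the URL list is a placeholder.
import requests

internal_links = [
    "https://www.example.com/old-category/old-product",  # hypothetical old URL
]

for url in internal_links:
    response = requests.get(url, allow_redirects=True, timeout=10)
    hops = len(response.history)  # each entry is one redirect that was followed
    if hops > 1:
        chain = " -> ".join(r.url for r in response.history) + " -> " + response.url
        print(f"Redirect chain ({hops} hops): {chain}")
    elif hops == 1:
        print(f"Single redirect (update the internal link): {url} -> {response.url}")
```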
It’s pretty likely that somewhere on your site you will have pages targeting the same keywords. Some of these may be duplicate content pieces, often created by mishandled URLs (www and non-www versions, or product variation links).
Now is the time to set your canonicalisation strategy for those pages and to decide how you will handle things such as product variation URLs (handled badly, they can leave you in a bit of a mess).
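As one illustration of the variant problem, the sketch below derives a canonical URL by stripping the query-string parameters a platform might use for variant selection; the parameter names and URLs are assumptions, and the right rule depends entirely on how your new platform builds its URLs.

```python
# Sketch: derive a canonical URL for product variant links by dropping
# variant-selection parameters. Parameter names and URLs are assumptions.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

VARIANT_PARAMS = {"variant", "colour", "size"}  # hypothetical variant parameters

def canonical_url(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in VARIANT_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://www.example.com/products/t-shirt?variant=12345&colour=red"))
# -> https://www.example.com/products/t-shirt
```

The URL that comes out of a rule like this is what you would reference in the rel="canonical" tag on each variant page, so every variation points back at one indexable version.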
The benefit of structured data and specific markup schemas is that the search engine can pull specifics out of your page and use them in rich search results. You’ll have seen this on the search engine results pages: product listings with prices, and perhaps an image or availability attributes, shown directly in the result.
These are beneficial as they tend to drive a higher click-through rate from search engine results.
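For a concrete picture, here’s a sketch that assembles schema.org Product markup as the JSON-LD a search engine would read for those rich results; every product value in it is a placeholder.

```python
# Sketch: assemble schema.org Product markup as JSON-LD.
# All product values below are placeholders, not real data.
import json

product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example T-Shirt",
    "image": "https://www.example.com/images/example-t-shirt.jpg",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "GBP",
        "price": "19.99",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the output inside a <script type="application/ld+json"> tag on the product page.
print(json.dumps(product_markup, indent=2))
```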
Updating your robots.txt and XML sitemaps helps search engines better understand how to crawl and index your site. It is also best practice to list the location of your XML sitemap within your robots.txt file.
It’s important to carry over the settings from your current site: block unnecessary pages from being crawled and exclude any other pages that do not belong in the index.
Platforms such as Shopify and BigCommerce do a good job of this automatically, but there may be other page types you want to block from results.
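As a simple post-launch sanity check, something like the sketch below can read the new store’s robots.txt, confirm the sitemap is listed and check that a page you expect to be blocked really is; the domain and paths are placeholders.

```python
# Sketch: sanity-check the new robots.txt. Domain and paths are placeholders.
from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

# Confirm the XML sitemap location is declared in robots.txt
# (site_maps() needs Python 3.8+ and returns None if nothing is declared).
print("Sitemaps listed:", parser.site_maps())

# Confirm a page that shouldn't be crawled is actually disallowed.
print("Checkout crawlable?", parser.can_fetch("*", "https://www.example.com/checkout"))
```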
As part of our process, we regularly take stock of how the store being developed is shaping up in terms of site speed. Our engineering team monitor this against the project’s performance budget, and it can lead to conversations about balancing real-world performance against the desired feature list.
If *that* feature is damaging the page speed significantly, is it worth it? (Perhaps?!)
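As an admittedly simplified illustration of monitoring against a budget, the sketch below times a handful of key templates and flags anything over a response-time threshold; the URLs and the threshold are assumptions, and a real performance budget would track richer metrics (Largest Contentful Paint, for example) via tools such as Lighthouse.

```python
# Simplified sketch: flag pages whose response time exceeds a budget.
# URLs and the 800 ms threshold are assumptions; requires the `requests` library.
import time
import requests

BUDGET_SECONDS = 0.8
key_templates = [
    "https://www.example.com/",
    "https://www.example.com/collections/all",
    "https://www.example.com/products/example-t-shirt",
]

for url in key_templates:
    start = time.perf_counter()
    requests.get(url, timeout=10)
    elapsed = time.perf_counter() - start
    status = "OVER BUDGET" if elapsed > BUDGET_SECONDS else "ok"
    print(f"{status:>12}  {elapsed * 1000:6.0f} ms  {url}")
```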