Cromwell is Europe’s leading independent supplier of maintenance, repair and operations (MRO) products, with 54 UK branches offering collection services and daily delivery. Their distribution fleet moves products overnight through a central hub in Leicester, allowing many products to be delivered on the day they’re ordered.
Our objectives in working with Cromwell were rooted in the need to resolve the site’s indexing issue, helping it to achieve much higher levels of organic visibility than it had been seeing. We wanted not only to solve the immediate problem but also to ensure that our solution would give Cromwell a platform for long-term organic growth.
As such, our primary ‘audience’ was Googlebot. We had to ensure that our work would make the site’s pages easier for search engines to find and crawl in their entirety. At the same time, we had to bear in mind the needs of Cromwell’s end users: professionals in the UK’s building trade.
It was therefore essential that we understand the full scale of the problem. We knew that around 25% of pages were being indexed, and we needed to know which, and why. From there, we wanted to categorise areas of the site to allow for appropriate prioritisation of workload before making our way through the on-site issues.
Using Screaming Frog, we collated a list of all available pages on the site and categorised them according to the business’ aims. This immediately identified pages that could be noindexed, improving crawl efficiency from the start: removing pages that didn’t need to be indexed left Google more time to access important content.
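The categorisation step can be sketched in code. This is a minimal illustration, not Cromwell’s actual rules: the URL paths and noindex patterns below are hypothetical, and in practice the input would be a Screaming Frog crawl export rather than a hard-coded list.

```python
from urllib.parse import urlparse

# Hypothetical rules: paths worth keeping in the index versus thin or
# parameter-driven pages that can be noindexed to save crawl budget.
NOINDEX_PATTERNS = ("/search", "/basket", "/account", "/compare")

def categorise(urls):
    """Split a crawl export into pages to keep indexed and pages to noindex."""
    keep, noindex = [], []
    for url in urls:
        parsed = urlparse(url)
        # Parameterised URLs and known low-value sections get noindexed.
        if parsed.query or parsed.path.startswith(NOINDEX_PATTERNS):
            noindex.append(url)
        else:
            keep.append(url)
    return keep, noindex

crawl = [
    "https://www.example.com/products/drills",
    "https://www.example.com/search?q=drill",
    "https://www.example.com/basket",
]
keep, noindex = categorise(crawl)
```

Every URL moved into the noindex bucket is one fewer page competing for Googlebot’s limited crawl time.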
We built a script that pulled Cromwell’s current site using Headless Chrome, driven by Puppeteer, so we could view the site as Google’s crawlers do. This enabled us to uncover specific rendering issues within each page and area of the site, and to show the client exactly where problems lay. It also revealed internal linking issues, which could then be addressed.
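The core of that check is a diff between the raw HTML a plain crawler fetches and the DOM Headless Chrome produces after JavaScript runs. Below is a minimal sketch of the comparison step only (the two HTML snippets are illustrative stand-ins for a fetched response and a rendered snapshot; the actual rendering was done via Puppeteer):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# raw_html: what a plain HTTP fetch returns; rendered_html: what Headless
# Chrome produces after JavaScript has run. Both snippets are illustrative.
raw_html = '<html><body><div id="nav"></div></body></html>'
rendered_html = (
    '<html><body><div id="nav">'
    '<a href="/products/drills">Drills</a>'
    '</div></body></html>'
)

# Links present only after rendering are invisible to any crawl that
# does not execute JavaScript.
js_only_links = extract_links(rendered_html) - extract_links(raw_html)
```

A non-empty `js_only_links` set for a page flags exactly the kind of rendering-dependent internal linking that this audit was designed to surface.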
Implement server-side rendering to facilitate effective crawls
We recommended server-side rendering as a solution. With server-side rendering, Google is no longer expected to render the content for itself, putting us in full control of what we present to Googlebot.
This was no small feat given the site’s large number of pages. We worked closely with our client’s development team to implement it, and we’re very proud of what proved to be a very smooth migration.
Update site structure to facilitate effective internal linking
Most recently we’ve been creating the perfect structure across the site for internal linking and removing duplicate content issues.
To do this, we used Screaming Frog to create a granular sitemap, categorising content across the website and identifying which areas of the site were not being crawled and where internal links were lacking.
Through this approach, we have been able to identify ‘hidden’ areas and bring them to the forefront by adding new internal links, and to canonicalise or redirect content where duplication occurred. This further optimised the crawl for Googlebot whilst improving the overall user experience too.
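The deduplication step above can be sketched as a canonical mapping: collapse the URL variants that serve the same content to a single canonical form, then point canonical tags or redirects at it. The normalisation rule below (strip query strings and trailing slashes) is an illustrative assumption, as are the URLs:

```python
from collections import defaultdict
from urllib.parse import urlparse, urlunparse

def normalise(url):
    """Strip query strings and trailing slashes so duplicate variants of
    the same page collapse to one canonical URL (illustrative rule only)."""
    p = urlparse(url)
    path = p.path.rstrip("/") or "/"
    return urlunparse((p.scheme, p.netloc, path, "", "", ""))

def canonical_map(urls):
    """Map every crawled URL variant to its canonical form."""
    groups = defaultdict(list)
    for url in urls:
        groups[normalise(url)].append(url)
    return {url: canon for canon, variants in groups.items() for url in variants}

crawl = [
    "https://www.example.com/products/drills",
    "https://www.example.com/products/drills/",
    "https://www.example.com/products/drills?sort=price",
]
mapping = canonical_map(crawl)
# All three variants map to the same canonical URL.
```

Each entry in the mapping translates directly into either a `rel="canonical"` tag or a 301 redirect, so Googlebot spends its crawl budget on one version of each page rather than three.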
The following graph shows how organic visibility has grown as a result of our work:
Search visibility from 26th May 2018 - 20th September 2018 vs previous period. Organic impressions taken from Google Search Console.
Your digital strategy should be having a direct impact on your business bottom line. If you’d like to find out how we can help you achieve better results, get in touch today! Simply fill in the form below and a member of our team will be in touch. Alternatively, give us a call using the number at the top of the page.