The client

Cromwell is Europe’s leading independent supplier of maintenance, repair and operations (MRO) products, with 54 UK branches offering collection services and daily delivery. Their distribution fleet uses a central hub in Leicester to move products overnight, allowing many products to be delivered on the day they’re ordered.



The challenge

Our client approached a number of agencies seeking a solution to an indexing issue which saw just a quarter of their pages being indexed by Google. Impression was the only agency approached that identified the root cause of the issue: Cromwell’s use of the React JavaScript framework.

Our objectives in working with Cromwell were rooted in the need to resolve this indexing issue, helping the site to achieve much higher levels of organic visibility than it had been seeing. Not only did we want to solve the immediate problem, we also wanted to ensure that our solution would provide Cromwell with a platform for long-term organic growth.

As such, our primary ‘audience’ was Googlebot. We had to ensure that our work would make the site’s pages easier for search engines to find and crawl in their entirety. At the same time, we had to bear in mind the needs of Cromwell’s end users: professionals in the UK’s building trade.

The strategy

The first step of our strategy was to identify the full scope of the JavaScript issue. This would enable us to propose and implement an appropriate solution. Following this diagnosis, there were several steps to our strategy:

  • Appraise various server-side rendering options, including on-server and off-server frameworks and costings.
  • Consult on the correct implementation of server-side rendering to facilitate effective crawls.
  • Update site structure to facilitate effective internal linking.
  • Begin to address wider technical issues for optimal search success, including indexed search pages, canonicals, pagination, crawl depth and content.

Identify the full scope of the JavaScript issue

Our initial analysis had shown us that the reason Cromwell’s site was not being indexed fully was their use of the React JavaScript framework. Google could not access content that was only visible with JavaScript enabled, so while the framework gave our client the functionality they desired in their UI, it made much of the site impossible for Google to crawl.

It was therefore essential that we understand the full scale of the problem. We knew that around 25% of pages were being indexed, and we needed to know which, and why. From there, we wanted to categorise areas of the site to allow for appropriate prioritisation of the workload as we made our way through the on-site issues.

Using Screaming Frog, we collated a list of all available pages on the site and categorised them according to the business’s aims. This immediately identified pages that could be no-indexed, improving crawl efficiency from the start: removing pages that didn’t need to be indexed left Google more time to access important content.
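As a simplified illustration, the triage logic worked along these lines. The URL patterns and rules below are hypothetical, not Cromwell’s actual scheme:

```javascript
// Hypothetical sketch: categorise crawled URLs and flag noindex candidates.
// Internal search results add no value in Google's index, so they are the
// first candidates for noindexing to free up crawl budget.
function categoriseUrl(url) {
  const { pathname, searchParams } = new URL(url);
  if (searchParams.has('q') || pathname.startsWith('/search')) {
    return { category: 'internal-search', index: false };
  }
  if (pathname.startsWith('/products/')) {
    return { category: 'product', index: true };
  }
  if (pathname.startsWith('/categories/')) {
    return { category: 'category', index: true };
  }
  return { category: 'other', index: false };
}

const urls = [
  'https://www.example.com/products/safety-gloves',
  'https://www.example.com/search?q=drill',
];
console.log(urls.map(categoriseUrl));
```

In practice the category rules came from the business’s aims rather than URL paths alone, but the principle is the same: every crawled URL gets a category and an index/noindex decision.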

We built a script that pulled Cromwell’s current site through Headless Chrome, Google’s headless browser, driven via Puppeteer, so we could view the site as Google’s crawlers did. This enabled us to uncover specific rendering issues within each page and area of the site, and to show the client where the problems lay. It also revealed internal linking issues, which could then be addressed.
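The core of that comparison can be sketched as follows. In practice the rendered HTML would come from Headless Chrome via Puppeteer; here the two snapshots are inlined for illustration:

```javascript
// Sketch of the comparison step: links present only in the rendered DOM are
// invisible to a crawler that does not execute JavaScript.
function extractHrefs(html) {
  const hrefs = [];
  const re = /href="([^"]+)"/g;
  let match;
  while ((match = re.exec(html)) !== null) hrefs.push(match[1]);
  return hrefs;
}

// Links that appear only after JavaScript has run.
function jsOnlyLinks(rawHtml, renderedHtml) {
  const raw = new Set(extractHrefs(rawHtml));
  return extractHrefs(renderedHtml).filter((href) => !raw.has(href));
}

// Raw HTML as served: an empty React root plus one static link.
const rawHtml = '<div id="root"></div><a href="/about">About</a>';
// The same page after JavaScript has rendered the React app.
const renderedHtml =
  '<div id="root"><a href="/products/drills">Drills</a></div><a href="/about">About</a>';

console.log(jsOnlyLinks(rawHtml, renderedHtml)); // → [ '/products/drills' ]
```

Any URL surfacing in that diff is a page Google may never discover without JavaScript rendering, which is exactly where the internal linking issues showed up.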

Implement server-side rendering to facilitate effective crawls

We recommended server-side rendering as the solution. With server-side rendering, Google is not expected to render the content for itself, so we are in full control of what we present to Googlebot.
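The difference can be shown in miniature. This is a hand-written sketch, not Cromwell’s actual stack; the markup a real React `renderToString` call would produce is simulated here:

```javascript
// Client-side rendering: the server sends an empty shell and the content only
// exists after the JavaScript bundle runs in the browser.
function clientSideShell() {
  return '<div id="root"></div><script src="/bundle.js"></script>';
}

// Server-side rendering: the HTML Googlebot receives already contains the
// content, so no JavaScript execution is needed to crawl it.
function serverSideRender(product) {
  return `<div id="root"><h1>${product.name}</h1><p>£${product.price}</p></div>`;
}

const product = { name: 'Safety Gloves', price: '4.99' };
console.log(clientSideShell().includes('Safety Gloves'));        // → false
console.log(serverSideRender(product).includes('Safety Gloves')); // → true
```

The point of the exercise: with server-side rendering, what the crawler fetches is what the crawler indexes.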

This was no small feat given the site’s large number of pages. We worked closely with our client’s development team to implement it and are very proud of what was a very smooth migration.

Update site structure to facilitate effective internal linking

Most recently, we’ve been refining the structure across the site to support internal linking and to remove duplicate content issues.

To do this, we’ve used Screaming Frog to create a granular sitemap, categorising content across the website and identifying which areas of the site were not being crawled and where internal links were lacking.

Through this approach, we have been able to identify ‘hidden’ areas and bring them to the forefront through the addition of new internal links, and to canonicalise or redirect content where duplication occurred. This further optimised the crawl for Googlebot whilst improving the overall user experience too.
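The de-duplication rules can be sketched like this. The parameter list and normalisation choices below are illustrative assumptions, not Cromwell’s actual rules:

```javascript
// Hypothetical sketch: tracking parameters and case/trailing-slash variants
// collapse to one canonical URL, so duplicates can be canonicalised or
// redirected consistently.
const TRACKING_PARAMS = ['utm_source', 'utm_medium', 'utm_campaign', 'ref'];

function canonicalise(url) {
  const u = new URL(url);
  TRACKING_PARAMS.forEach((p) => u.searchParams.delete(p));
  u.hostname = u.hostname.toLowerCase();
  // Lowercase the path and strip trailing slashes (keeping "/" for the root).
  u.pathname = u.pathname.toLowerCase().replace(/\/+$/, '') || '/';
  return u.origin + u.pathname + (u.searchParams.toString() ? '?' + u.searchParams : '');
}

console.log(canonicalise('https://www.example.com/Products/Drills/?utm_source=email'));
// → https://www.example.com/products/drills
```

Every duplicate URL that maps to the same canonical form then either carries a matching rel=canonical tag or 301-redirects to it, so Googlebot stops wasting crawl budget on variants.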

The following graph shows how organic visibility has grown as a result of our work:

[Graph: Cromwell case study, organic visibility growth]

The results

Search visibility from 26th May 2018 - 20th September 2018 vs previous period. Organic impressions taken from Google Search Console.

  • Increase in organic impressions
  • Increase in pages indexed
  • Increase in organic traffic

Find out what we can do for you

Your digital strategy should be having a direct impact on your business bottom line. If you’d like to find out how we can help you achieve better results, get in touch today! Simply fill in the form below and a member of our team will be in touch. Alternatively, give us a call using the number at the top of the page.
