07.04.2017

BrightonSEO: Philip Gamble on Technical SEO beyond the initial audit

This article was updated on: 07.02.2022

Philip from Found discussed how we can continue to monitor SEO performance following an initial technical on-site audit.

Philip started by saying that clients will typically make changes, sometimes without our knowledge, and these changes often bring SEO challenges.

Philip broke this down into three sections:

  • Examples of performance impact
  • How to identify issues quickly
  • How to produce swift resolutions

Philip’s first example was where a client relaunched their website (without telling the agency) and left a noindex directive in the robots.txt file. As expected, pages started dropping out of the index and traffic reduced, followed by revenue. The agency asked the developers to remove the noindex directive and then submitted all pages via Google Search Console.

The key learning from this was to set up methods of picking up these errors before they impact SEO performance. Tips included: monitoring your robots.txt file, monitoring your metadata and tracking keyword performance on a daily basis, with alerts set up for significant changes.
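As a rough illustration of the robots.txt monitoring tip, here is a minimal Python sketch (our own, not from Philip’s talk) that fetches a site’s robots.txt, flags a blanket Disallow or noindex directive and diffs the file against a stored copy. The domain, baseline file path and alerting approach are placeholder assumptions.

```python
# robots_check.py - minimal sketch of a daily robots.txt monitor (placeholder URL and path)
import difflib
from pathlib import Path

import requests

SITE_ROBOTS_URL = "https://www.example.com/robots.txt"  # placeholder domain
BASELINE_FILE = Path("robots_baseline.txt")              # last known-good copy


def fetch_robots() -> str:
    response = requests.get(SITE_ROBOTS_URL, timeout=10)
    response.raise_for_status()
    return response.text


def check_robots() -> list[str]:
    """Return a list of human-readable warnings; an empty list means all clear."""
    current = fetch_robots()
    warnings = []

    # Flag directives that could block or deindex the whole site.
    for line in current.lower().splitlines():
        stripped = line.strip()
        if stripped == "disallow: /" or stripped.startswith("noindex"):
            warnings.append(f"Risky directive found: {stripped}")

    # Diff against the stored baseline so any change at all is surfaced.
    if BASELINE_FILE.exists():
        diff = list(difflib.unified_diff(
            BASELINE_FILE.read_text().splitlines(),
            current.splitlines(),
            fromfile="baseline", tofile="live", lineterm=""))
        if diff:
            warnings.append("robots.txt changed:\n" + "\n".join(diff))
    else:
        BASELINE_FILE.write_text(current)  # first run: store the baseline

    return warnings


if __name__ == "__main__":
    for warning in check_robots():
        print(warning)  # in practice, send these to email/Slack instead of printing
```

Run on a schedule (a daily cron job, for example), this kind of check surfaces exactly the sort of accidental deindexing described above before it starts to hit traffic and revenue.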

At Impression we use a suite of tools, but particularly relevant to this situation is OnPage (a crawler), which monitors robots.txt files and alerts us by email when they change. We also use STAT, which provides daily keyword tracking, with alerts set up for significant keyword changes.
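In the same spirit, here is a small generic sketch of what a threshold-based keyword alert could look like, assuming daily rank data exported to CSV files with "keyword" and "rank" columns. This is an illustration only, not STAT’s actual API or export format.

```python
# rank_alerts.py - sketch of threshold alerts on daily rank exports (hypothetical CSV layout)
import csv


def load_ranks(path: str) -> dict[str, int]:
    """Read a CSV export with 'keyword' and 'rank' columns into a dict."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row["keyword"]: int(row["rank"]) for row in csv.DictReader(f)}


def rank_alerts(yesterday_csv: str, today_csv: str, threshold: int = 5) -> list[str]:
    """Flag keywords whose rank moved by more than `threshold` positions."""
    yesterday = load_ranks(yesterday_csv)
    today = load_ranks(today_csv)
    alerts = []
    for keyword, old_rank in yesterday.items():
        new_rank = today.get(keyword)
        if new_rank is None:
            alerts.append(f"'{keyword}' dropped out of the tracked results")
        elif abs(new_rank - old_rank) > threshold:
            direction = "down" if new_rank > old_rank else "up"
            alerts.append(f"'{keyword}' moved {direction}: {old_rank} -> {new_rank}")
    return alerts


if __name__ == "__main__":
    for alert in rank_alerts("ranks_yesterday.csv", "ranks_today.csv"):
        print(alert)  # in practice, route these to email/Slack
```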

So, now that we understand the impact these changes can have, how can we work to ensure these don’t happen in the future?

  • Demand sign-off from the client, particularly during a relaunch, but also for new pages, URL updates and product categorisation changes
  • Ask for access to any development timelines, plans and ticketing systems if possible, so you can pre-empt changes before they’re made
  • Communicate! Agency/client communication is key, so ensure you’re setting up regular update calls

Philip ran through a few other examples, which you’ll be able to view on his slides below.

A key takeaway from the talk was to work out what you need to track, automate this where possible and set up alerts for changes at appropriate thresholds. Ideas of what you could monitor include:

  • Robots.txt
  • Website uptime
  • Crawl data (404s etc)
  • Page speed
  • Indexation
  • Content changes

For the above you can use third party tools such as STAT, OnPage, Screaming Frog, Pingdom, Google Analytics and Google Search Console.
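Third-party tools cover most of this, but as a sketch of how a basic uptime and status-code check over a handful of key URLs could be scripted yourself (the URLs and alerting route below are placeholders, not from the talk):

```python
# uptime_check.py - sketch of a simple uptime / status-code check for key URLs (placeholder URLs)
import requests

KEY_URLS = [
    "https://www.example.com/",           # placeholder homepage
    "https://www.example.com/category/",  # placeholder key landing page
]


def check_urls(urls: list[str]) -> list[str]:
    """Return warnings for URLs that are unreachable or returning error codes such as 404s."""
    warnings = []
    for url in urls:
        try:
            response = requests.get(url, timeout=10, allow_redirects=True)
        except requests.RequestException as exc:
            warnings.append(f"{url} is unreachable: {exc}")
            continue
        if response.status_code >= 400:
            warnings.append(f"{url} returned HTTP {response.status_code}")
    return warnings


if __name__ == "__main__":
    for warning in check_urls(KEY_URLS):
        print(warning)  # in practice, schedule this (e.g. cron) and alert via email/Slack
```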

Philip finished by saying that this kind of setup at the start of a campaign will take some investment, but it’ll be worth it in the long run through time savings and better SEO performance.