Get in touch with our team

Podcast

SEO Testing – RankUp with Nick Swan

Subscribe to RankUp on Spotify | Apple Podcasts | Google Podcasts | RSS

SEO testing is the process of measuring whether changes made to a website have had a significant effect. It’s always been possible if you’ve had access to good data and some time on your hands, but tools are now appearing that automate the process.

We spoke to the creator of one of the first and only dedicated SEO testing platforms, Nick Swan, who launched and runs seotesting.com. The tool uses Search Console data to measure the significance of changes over a 2-6 week period after a set date, and can also accommodate A/B tests.

In this episode, Edd and Ben wanted to get to the bottom of what SEO testing means, why it is important, and how to make sure the data you get is useful. Join us as we hear the answers from one of the industry’s leaders on the subject.

As always, you can listen to the full conversation right here on this page, or on your podcast player of choice. If you’re tight on time and just want to catch some of the highlights, keep reading for a transcript of some of Nick’s key points. Listeners of the podcast can also use this link to access an extended free trial for seotesting.com.

If you want to follow Nick to hear more about SEO testing and the rest of his work, you can find him on Twitter at @nickswan. Twitter is also the best place to find the rest of the RankUp team: Edd (@EddJTW), Ben (@BenJGarry) and Liv (@seoliviamae).

Introducing Nick

Ben: How did you get to where you are in SEO today?

Nick: I’ve been doing SEO since about 1998. Back then, I was about 17/18, about to go to university. My older brother had a business selling car parts. I was already getting into computer programming, so he said to me, “Build me a website and get me on the search engines, and I’ll pay your accommodation rent for you.”

So that’s what got me into building websites – just in HTML, with keyword stuffing and all the things you could get away with in those days. It was good fun. SEO was easier back then, and link building was easier, but a lot has changed since then.

While at university I studied software engineering for four years and did a few jobs, then eventually co-founded my own business selling extensions and tools for Microsoft SharePoint. I was building the tools, but also doing SEO for the business.

For 20-odd years I’ve been involved in SEO in one form or another, but after leaving that business in 2013, I set up a voucher code website. I knew someone who worked in the industry, who told me that the sites ranking at the top were leaning into all of these black-hat practices of buying links, spinning content – everything you can think of – so we took the view that they would get whacked with penalties in the near future.

The hypothesis we had was that if we started a voucher code website and did everything white-hat, then we could hopefully do quite well once these sites got whacked. It took a couple of years for the site to get up and running, but once these sites took penalties we did quite well for a year or so. But then it’s interesting, because Google hasn’t really paid much attention to the voucher code sector for a long time, and a lot of churn and burn sites started to come into the rankings.

You’ll also notice that any site with authority now has a voucher code section. Even sites like NME, which used to be a music magazine, have now got a voucher code section. So I don’t really play in that space anymore, but what got me started in what I’m doing now was that when the voucher code site was doing well, I was doing lots of click through rate testing with page titles and meta descriptions.

From dabbling in SEO testing to a dedicated tool

Nick: If you’re ranking in positions one, two or three and you can boost your click through rate by 1% or 2%, it can make a big difference to monthly revenue. So I started tracking the changes and seeing what the difference was in clicks and position each day.

Once you get past a few tests that you’re running, it takes quite a long time to look everything up in Search Console. Being a software engineer and knowing that there’s a Search Console API, I tried to automate it.

So really, the tool came from scratching my own itch of not wanting to sort through Search Console data manually in spreadsheets. At the time, Search Console also only had three months of data available, which is paltry, so the first version of the tool was also about archiving older data and running tests, and it was called sanitycheck.io.
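As a rough sketch of what that kind of automation looks like: the Search Console API’s `searchanalytics.query` method returns rows of clicks, impressions, CTR and position, which you can aggregate yourself instead of copying into spreadsheets. The `summarise_rows` helper below is a hypothetical name, and the authentication and query-building steps are omitted; only the response-processing part is shown.

```python
from typing import Dict, List

def summarise_rows(response: Dict) -> Dict[str, float]:
    """Aggregate a Search Console searchanalytics.query response into
    totals: clicks, impressions, CTR and an impression-weighted average
    position. (summarise_rows is a hypothetical helper name.)"""
    rows: List[Dict] = response.get("rows", [])
    clicks = sum(r["clicks"] for r in rows)
    impressions = sum(r["impressions"] for r in rows)
    ctr = clicks / impressions if impressions else 0.0
    # Weight position by impressions, mirroring how Search Console
    # averages position across queries.
    position = (
        sum(r["position"] * r["impressions"] for r in rows) / impressions
        if impressions else 0.0
    )
    return {"clicks": clicks, "impressions": impressions,
            "ctr": ctr, "position": position}

# A response in the shape the API returns (the numbers are made up).
sample = {"rows": [
    {"keys": ["seo testing"], "clicks": 120, "impressions": 2000,
     "ctr": 0.06, "position": 3.1},
    {"keys": ["seo split test"], "clicks": 30, "impressions": 1000,
     "ctr": 0.03, "position": 5.4},
]}
print(summarise_rows(sample))
```

Run this once per test window and you have the before/after totals a test comparison needs, without any manual lookups.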

Just before the start of the pandemic, I did a whole bunch of customer analysis on what people were using Sanity Check for, especially the customers that stuck around, and a lot of it was testing, so I decided that testing was something I wanted to focus on.

From a quick search I found that the domain seotesting.com was available, so I gambled, paid a few thousand dollars for the domain, and rebuilt the tool from the ground up.

I released it in April 2020 as a free Beta because businesses were struggling with the pandemic, and started charging in July 2020 once things calmed down a bit, and I’ve been working on it since.

What is SEO testing?

Nick: It’s about making a change to a page, or maybe the whole site, and then tracking the results of those changes in organic search. We’ve all been doing it anyway. Even back in ’98, we were doing it in the sense of optimising something to try and make it rank better in a search engine, so we were making changes and monitoring them in rank trackers or Google Analytics.

What we’re trying to do with SEO testing now is have a more structured approach to making those changes and tracking the results. It’s now about making it automated and easier to track the results to see how our changes affect the website and our rankings.

Types of test

Nick: There are two types of test you can run. There’s time-based testing, which is where you make a change and compare two different time periods. You compare a period of time before the change with the same length of time afterwards – four to six weeks, something like that.

Hopefully, you’ll see an uplift in clicks, click through rate, impressions, or whatever it is that you’re hoping to improve.
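The comparison behind a time-based test is simple to sketch. The example below uses made-up daily click counts and a hypothetical `time_based_uplift` helper (not part of seotesting.com) to compute the percentage change between equal-length windows:

```python
def time_based_uplift(before: list, after: list) -> float:
    """Percentage change in mean daily clicks between two equal-length
    windows either side of a change date."""
    mean_before = sum(before) / len(before)
    mean_after = sum(after) / len(after)
    return (mean_after - mean_before) / mean_before * 100

# Seven days of clicks before and after a page-title change (made-up).
before = [100, 110, 95, 105, 100, 90, 100]
after = [115, 120, 110, 118, 112, 108, 117]
print(round(time_based_uplift(before, after), 1))  # roughly +14% uplift
```

In practice you’d use a longer window (the four to six weeks mentioned above) and check the same calculation for impressions, CTR and position, not just clicks.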

Then there’s also split testing, which is still time-based in SEO. You’re still comparing one period to another, but you’re now comparing two groups of pages: a control group and a test group. Again, you get the data for both groups of pages, then compare that to data for both groups after a change has been made to one of them (the test group, not the control group). It’s a bit more complex to set up a split test, but it will give you a more accurate result.

The difficulty with time-based tests is that there’s so much going on in Google Search that we can’t control, from seasonality to machine learning algorithms. With split-testing, the results will be more accurate, but it’s harder to set up.
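One common way to read a split test, sketched below under assumed numbers, is a difference-in-differences comparison: because seasonality and algorithm updates hit the control group too, subtracting the control group’s change isolates the effect of your edit. The `split_test_effect` function is a hypothetical helper for illustration:

```python
def split_test_effect(control_before: float, control_after: float,
                      test_before: float, test_after: float) -> float:
    """Difference-in-differences: how many percentage points more the
    test group moved than the control group. The control group absorbs
    background effects (seasonality, algorithm updates) shared by both."""
    control_change = (control_after - control_before) / control_before
    test_change = (test_after - test_before) / test_before
    return (test_change - control_change) * 100

# Total clicks for each group of 10 pages over the before/after windows.
effect = split_test_effect(
    control_before=5000, control_after=5250,  # +5% background trend
    test_before=4800, test_after=5520,        # +15% in the changed group
)
print(round(effect, 1))  # the test group outperformed control by ~10 points
```

A naive time-based read of the test group alone would credit the change with the full +15%, when a third of that was a trend affecting every page.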

Things to consider when running tests

Edd: How do you approach creating a test for SEO? Do you have any top-level guidelines for SEOs who want to get involved in testing?

Nick: Split tests work particularly well for sites that have lots of pages of the same type, like products or category pages. What we would do is take 20 product pages, for example, and put 10 in a control group that won’t be tested, and the other 10 in our test group, which we’ll make changes to. Then we’ll compare the two groups over time to see how they perform.

More generally, it depends what you’re testing for. If you want to look at pages that have low traffic but might still be quite valuable, we can test around the average position for the page for a particular keyword, and also for click-through rate. But then, for high traffic pages, you can look directly at clicks.

When looking at the test period, if all of our metrics are green, then that’s good, as long as we’ve accounted for other factors like seasonality and algorithm updates. But sometimes clicks will go up while average position drops because you’ve started ranking for more keywords, or the click through rate goes up because the impressions go down. When you get mixed signals like these, you have to dig into the data to reach your conclusion.

What should you test?

Ben: In an ideal world, should SEOs be running tests on pretty much everything they can implement or change?

Nick: For anything on-page, you can set up a test to track it. I’ve spoken to customers about how they use the tool, and some people have built tests into their standard operating procedure for their content teams, for example. Even if they don’t go in and look at all of the test results, it creates a record of the things they’ve changed on the site, while others are checking tests and iterating on them straight away. So really, when it comes to on-page, I would say to test anything.

There are also certain things off-page that you can run tests around, such as when you get new links, run a campaign, or gain links to a specific page. Even if you’re running traditional PR and have gained a lot of brand mentions, you could test when those are published to see if there’s an uplift in branded traffic. By measuring the test over six weeks or so, you’ll be able to see if those campaigns led to increases and improvements for other areas of the website, and so on.

SEO testing on different size sites

Ben: Is SEO testing better or worse suited to different kinds of sites or industries? Is there a sweet spot, or can any site in any industry be tested to some extent?

Nick: I think any site in any industry can be doing time-based testing, at least, where they take one individual page and create a test to keep track of results.

I do see a lot of content-heavy sites using the tool – a lot of affiliate-based sites, publishing sites, that sort of thing. They’ll take an individual page, refresh the content, then create a test around that URL to track results.

Split testing is more suited to template-based sites, so ecommerce, estate agents, auction sites, listing sites…anything that uses a template across product pages or category pages. If you want to test something, you can create a test around a small batch of those pages to see how the changes affect them. If you see a positive result, you’ve got more confidence to roll the change out to all the product pages, and testing on a small batch first means you don’t run the risk of hurting the entire site by rolling everything out at once.

Ben: Are there any practical considerations when making these changes?

Nick: If you’re just getting started, you can do what I did back in the day and make manual changes with manual test monitoring, using Google Search Console and a spreadsheet to track the difference in clicks, impressions and so on.

If you still want to do manual changes on the site but want to automate the collection of the results, that’s where seotesting.com comes in.

There are now other tools that will use a CDN or JavaScript library to manage the changes for you, and then track the results. So it depends on the approach you want to take – whether you want to install something else on your website, or manage it yourself.

Try seotesting.com and join the conversation

Nick has kindly given listeners of the RankUp podcast an extended free trial for seotesting.com! Use the link https://seotesting.com/partner/rankup to get a 30 day free trial instead of the usual 14 days.

To hear all of the content from Nick’s interview, listen to the podcast episode using the player at the top of this page, or find the RankUp SEO Podcast channel on your podcast app of choice.

Edd and Ben will be back soon with a new episode of the RankUp podcast. In the meantime, you can find us on Twitter at @BenJGarry and @EddJTW.

If you’re interested in being a guest on the show, please reach out to us on Twitter or via email.