
20.06.2019

6 min read

Search Leeds: Rory Truesdale – Intent optimisation – why it matters and how it can improve your SEO results

This article was updated on: 07.02.2022

Rory is interested in understanding search intent and using it to inform his SEO campaigns.

So why does search intent matter? Classic information retrieval is based on people searching for information. But people don’t always use a search engine to find information. Even back in the late ’90s, search engines evolved to account for this.

More recently, Google’s Search Quality Rater Guidelines break intent down into know, do, website and visit-in-person.

But there’s a problem. These buckets are too broad to be meaningful. A while back, Kane Jamison created a much more detailed model, including categories like video, news, answer, local, transaction and many more. This is all based on SERP features.

It’s now much easier to classify the intent of a SERP quickly.

Why is this useful? When we look at the SERPs for a term like ‘coworking space london’, we see mostly informational pages: around eight of the top ten results.

For ‘office space london,’ the intent shifts completely to transactional. Being aware of this means you can tailor your campaign, targeting landing pages to the right type of keywords.
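This kind of feature-based classification can be sketched as a simple lookup. A minimal, illustrative example in Python follows; the feature names and intent buckets are assumptions for demonstration, not the output format of any particular tool:

```python
# Hypothetical sketch: bucket a SERP into an intent category based on
# which features appear. Feature names and groupings are illustrative.

INFORMATIONAL = {"featured_snippet", "people_also_ask", "knowledge_panel", "video"}
TRANSACTIONAL = {"shopping_results", "adwords_top", "local_pack"}

def classify_serp(features):
    """Return a rough intent label from a set of SERP features."""
    info = len(set(features) & INFORMATIONAL)
    trans = len(set(features) & TRANSACTIONAL)
    if trans > info:
        return "transactional"
    if info > trans:
        return "informational"
    return "mixed"

print(classify_serp({"shopping_results", "adwords_top", "people_also_ask"}))
# transactional
```

A real version would read the feature sets from a rank-tracking export rather than hard-coded values.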

So, every searcher has a specific intent and Google wants to align results to this intent. Matching pages to intent will help you see more traffic and a higher quality of traffic, bringing more engagement and conversions.

There are three pillars in applying this to a campaign.

  1. What intent is Google looking for?
  2. How are competitors targeting it?
  3. How can you improve your site?

One step is to use a tool that can analyse SERP features for you (Impression note: we use Ahrefs or Moz’s Keyword Explorer for this).

If you can export SERP feature information, you can start mapping landing pages to keywords on a keyword-by-keyword basis. But this isn’t the only way to analyse intent.

Analysing the language of results pages can be even more powerful.

Start with a custom search engine. For each of your most valuable keywords, use cse.google.com to set up a custom search engine and extract the top 100 ranking URLs using the SERP Redux bookmarklet.

In the settings for your CSE, restrict the search engine to only those top 100 URLs for your queries. This creates an interesting sandbox: you can look at the performance of these pages in isolation, and assess them on relevance alone, independent of other ranking factors.

This environment means that if your page is ranking higher in the CSE, you need to improve your technical performance. If it ranks lower in the CSE, you need to improve your relevance.
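The diagnosis rule above can be written down as a tiny helper. This is just a sketch of the heuristic from the talk, with made-up rank values:

```python
# Compare a page's rank in the relevance-only CSE sandbox with its rank
# in the live SERP. Rule of thumb from the talk: higher in the CSE than
# live suggests a technical problem; lower suggests a relevance problem.

def diagnose(cse_rank, live_rank):
    if cse_rank < live_rank:      # better in the sandbox than in the wild
        return "improve technical performance"
    if cse_rank > live_rank:      # worse once other factors are stripped away
        return "improve relevance"
    return "balanced"

print(diagnose(cse_rank=3, live_rank=9))   # improve technical performance
print(diagnose(cse_rank=15, live_rank=6))  # improve relevance
```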

Within the CSE, analyse the language in metadata to see if any phrases commonly occur. In the office space example, Rory found that the word ‘rent’ appeared commonly. He realised that this was an intent modifier. With this information, he could also carry out a proximity analysis to see how closely ‘office’ and ‘rent’ appeared near each other.

This is important because Google cares about proximity. If it sees patterns in how closely specific words appear together, it will start to form a sense of the relationship between those terms. With a CSE, we can carry out our own analysis and identify proximities.
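A basic proximity analysis like the ‘office’/‘rent’ one can be done by measuring the smallest token distance between two words in each title or description. The sample titles below are placeholder data:

```python
# Minimal proximity check: smallest token distance between two words
# in a piece of metadata text. Returns None if either word is absent.

import re

def min_proximity(text, a, b):
    tokens = re.findall(r"[a-z']+", text.lower())
    pos_a = [i for i, t in enumerate(tokens) if t == a]
    pos_b = [i for i, t in enumerate(tokens) if t == b]
    if not pos_a or not pos_b:
        return None
    return min(abs(i - j) for i in pos_a for j in pos_b)

titles = [
    "Office Space to Rent in London",
    "Rent a Serviced Office | Central London",
]
for t in titles:
    print(t, "->", min_proximity(t, "office", "rent"))
```

Low distances recurring across many results would suggest the two terms act as a modifier pair, as in the ‘rent office’ example.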

To recap, a CSE gives you a controlled environment in which you can analyse performance based on relevance and carry out tests to reveal patterns.

The next workflow is custom SERP extraction. SERP extraction allows us to analyse page copy at scale: you can crawl the results pages themselves and extract data from them. This might upset John Mueller, as it’s a little bit in violation of Google’s terms of service. But if you use proxies… it’s technically not you doing it. Sort of. And remember, Google scrapes sites as well. This is revenge… but good revenge.

To do this, all you need are SERP URLs, which follow a predictable pattern. Rory has a tool that generates these for you. You can upload your SERP URLs into Screaming Frog and pull both page titles and meta descriptions. The XPaths involved tend to change from time to time, so check them if something stops working. The Chrome Scraper extension makes it easier to get the up-to-date XPath.
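Because the URL pattern is predictable, generating a list for a crawler’s list mode is straightforward. In this sketch, `q` and `num` are standard Google query parameters; the keyword list is a placeholder:

```python
# Build Google SERP URLs for a keyword list, ready to paste into a
# crawler in list mode. num=100 requests up to 100 results per page.

from urllib.parse import urlencode

def serp_url(keyword, num=100):
    return "https://www.google.com/search?" + urlencode({"q": keyword, "num": num})

keywords = ["office space london", "coworking space london"]
for kw in keywords:
    print(serp_url(kw))
```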

The result is the titles and descriptions of the top ten results for each of your keywords. Rory has built a formula that looks for the presence of certain words that might indicate intent. When these words appear, the formula assigns an intent. It will need to be modified for your clients, but it does work.

The final piece of the puzzle is another formula that finds the most common intent within the SERPs you’ve analysed.
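The two formulas described above can be sketched together in a few lines. The trigger-word lists here are illustrative assumptions and, as Rory says, would need tuning per client:

```python
# Assign an intent to each scraped title/description based on trigger
# words, then take the most common intent across the results.
# The TRIGGERS word lists are illustrative, not Rory's actual lists.

from collections import Counter

TRIGGERS = {
    "transactional": ["rent", "buy", "price", "hire", "book"],
    "informational": ["guide", "how", "what", "tips", "best"],
}

def assign_intent(text):
    text = text.lower()
    for intent, words in TRIGGERS.items():
        if any(w in text for w in words):
            return intent
    return "unclassified"

def dominant_intent(snippets):
    return Counter(assign_intent(s) for s in snippets).most_common(1)[0][0]

snippets = [
    "Office space to rent in London from £99",
    "Rent flexible offices | book a viewing today",
    "A guide to choosing office space",
]
print(dominant_intent(snippets))  # transactional
```

In a spreadsheet this would be a nested `IF`/`SEARCH` formula plus a mode calculation; the logic is the same.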

This goes beyond features to identify the language of the SERP. It will help you ensure that your landing pages are completely aligned with search intent.

The final workflow involves analysing the displayed meta descriptions. A recent study showed that Google rewrites these descriptions 92% of the time. The fact that Google rewrites descriptions makes Rory think that Google is trying to improve click-through rate. Analysing the descriptions that Google has picked will show us the terms that Google thinks will drive engagement.

Rory used a market share report to pull the top 30 meta descriptions from his chosen keywords. There are a few ways to do this. He then put the descriptions into the Cleanse Stopwords tool, to remove the most common, irrelevant words. He then ran the cleansed text through an n-gram analysis to find co-occurring words. This reveals the words that appear together most often.
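The cleanse-then-count step can be sketched as a simple bigram pass. The stopword list and descriptions below are placeholder data, not the tool’s actual output:

```python
# Strip a small stopword list from each description, then count bigrams
# (pairs of adjacent remaining words) across the whole set.

import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "in", "to", "and", "of", "for", "with", "from"}

def bigrams(descriptions):
    counts = Counter()
    for d in descriptions:
        tokens = [t for t in re.findall(r"[a-z]+", d.lower()) if t not in STOPWORDS]
        counts.update(zip(tokens, tokens[1:]))
    return counts

descs = [
    "Flexible office space to rent in central London",
    "Rent serviced office space in London from £250 pcm",
]
for pair, n in bigrams(descs).most_common(3):
    print(pair, n)
```

Pairs that surface repeatedly (here, ‘office space’) are the candidates worth carrying into your own metadata and copy.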

Why is this interesting? Again, it’s a deeper dive into SERP language and it’s a deconstruction of what Google’s choosing to show on its results pages. Running a simple linguistic analysis gives us obvious clues into what people want to see. You can incorporate these phrases into your metadata, copy, internal linking structure etc.

Finally, Rory wants to touch on phrase-based indexing. This means that Google can rank documents based on the number of related phrases. These phrases can be really useful for making sure that user experiences are meaningful and they can directly influence ranking performance.

Co-occurrence analysis at scale, in the SERPs, can help us pinpoint what users are looking for. This can help us to improve our engagement and ranking positions. If we can find patterns in rankings and co-occurring terms, we’ll end up with really meaningful insights.

Rory is absolutely confident that this works from the results that he has seen.

Key takeaway:

  • SERP language analysis can provide super useful insight into what Google and users are looking for.