
26.03.2020


Quick Wins To Improve Your SEO

This article was updated on: 07.02.2022

Businesses everywhere are facing a period of uncertainty. The Coronavirus outbreak is impacting almost every aspect of our lives and, unsurprisingly, businesses are taking a hit as consumer behaviour changes. With people spending most of their time at home and forced to do the majority of their shopping online, it’s important to ensure that your digital presence is up to scratch to help you weather the economic storm.

Improving the quality and discoverability of your website has suddenly become a much bigger priority, particularly if you’re a business that has felt the impact of a lack of footfall into physical stores. Although SEO is a far-reaching discipline and tending to your organic strategy should be an ongoing focus, we’ve pulled out some quick wins that you can check and optimise to help give your organic presence the boost it needs.

Check your coverage in Google Search Console

As a priority, you should review all areas of the Coverage section in Google Search Console. Other than the obvious Error / Valid with warning pages, there are a few other categories that you should pay special attention to.

Review any pages which flag up as ‘Indexed, not submitted in sitemap’. The two easiest fixes here are:

– Update your sitemap with any pages that you do want to be indexed

– Apply noindex tags to any pages which should not be appearing as a search result
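
To illustrate the second fix: a noindex directive is simply a robots meta tag placed in the page's `<head>` (the page below is a hypothetical example, such as an internal search results page you don't want indexed):

```html
<!-- Hypothetical example: keep an internal search results page out of Google's index -->
<head>
  <meta name="robots" content="noindex">
  <title>Search results</title>
</head>
```

Once Google recrawls the page and sees the tag, it will drop the URL from its index over time.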

Take a look through the pages which are ‘Excluded’ too. There will be many pages in here which you have intentionally excluded e.g. ‘Excluded by noindex tag’, but it’s worth checking which pages are falling into categories such as ‘Discovered – currently not indexed’, ‘Soft 404’, and ‘Crawl anomaly’, amongst others. You may well find that some pages that you want Google to index aren’t currently making the grade, so you can look to optimise these pages as a priority.

The ‘Crawled – currently not indexed’ category is usually a red flag that deserves immediate attention, as it typically contains thin or low quality content that Google has crawled but decided not to index. At scale, this issue can drag down site-wide organic visibility, so review these pages and decide whether each one genuinely needs to be indexed.

Excluded pages in Google Search Console

Review and update your robots.txt file

It’s worth reviewing your current robots.txt file to make sure you have all the necessary controls in place to dictate how you want Google to crawl your site. Particularly if you have a lot of tag pages or user-generated content, look to update your rules as necessary to limit areas of crawl waste. Also, be sure to link your sitemaps here so Google can find the pages you want to index more easily.

Most commonly, the types of pages we recommend excluding are site search, blog tags, and any area that generates thin content that isn’t beneficial to users. We would always recommend reviewing current organic traffic to any areas you’re planning to block, along with any backlinks pointing to them.
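
As an illustration (the paths and domain here are made up), a robots.txt that blocks site search and tag pages while pointing crawlers at the sitemap might look like this:

```
# Hypothetical robots.txt limiting crawl waste on thin-content areas
User-agent: *
Disallow: /search/
Disallow: /tag/

# Help crawlers find the pages you do want indexed
Sitemap: https://www.example.com/sitemap.xml
```

Remember that robots.txt controls crawling, not indexing – a blocked URL can still appear in results if it’s linked from elsewhere, which is why the noindex checks above still matter.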

Robots.txt file featuring a link to a sitemap

Identify quick win keywords to target

You’ll likely already know which keywords you want your site to rank for but, right now, you’ll need to prioritise. Finding the quick win keywords that you’ll be able to make the most headway with is a good place to start before making any on-page changes.

We’ve created a tool that will give you a quick overview of where your biggest opportunity lies – all you need is access to Ahrefs (which offers a cheap 7-day trial) and your Google Search Console account (free). This tool uses the search volume, keyword difficulty, and your current ranking for each individual keyword to assign a score between 1 and 4, with 1 being the lowest opportunity and 4 the highest. Straight away you’ll be able to see where it’s worth concentrating your efforts to make the quickest gains.

Keywords ranked by opportunity using Impression’s spreadsheet tool

To use the sheet, simply export a list of your current rankings from Ahrefs, along with an export of your search queries from Google Search Console, and import them into the relevant tabs. The ‘Opportunity’ tab will then auto-populate with all the data you need and will rank each keyword by the opportunity it presents.
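
For a rough sense of how such a score can be derived (the exact formula in the spreadsheet differs – the thresholds and keyword data below are purely illustrative assumptions), here’s a minimal sketch in Python:

```python
# Illustrative only: the spreadsheet's real scoring formula is different,
# and these thresholds are invented to demonstrate the idea.
def opportunity_score(volume, difficulty, position):
    """Assign a 1-4 opportunity score (4 = biggest quick win)."""
    score = 1
    if volume >= 500:          # enough search demand to be worth chasing
        score += 1
    if difficulty <= 30:       # low enough competition to move quickly
        score += 1
    if 4 <= position <= 20:    # close enough to page one for quick gains
        score += 1
    return score

# Hypothetical (keyword, volume, difficulty, current position) rows
keywords = [
    ("seo quick wins", 900, 25, 8),
    ("what is seo", 40000, 85, 45),
]
ranked = sorted(keywords, key=lambda k: opportunity_score(*k[1:]), reverse=True)
```

Sorting by the score puts the highest-opportunity keywords at the top, mirroring what the ‘Opportunity’ tab does for you automatically.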

To access the spreadsheet, you can view the template here and make a copy for your own use.

Review title tags and H1s

Title tags and H1s are some of the most important on-page aspects to optimise when trying to rank for your target keywords. Now is as good a time as ever to make sure every page you want to be indexed has a unique, keyword-targeted title tag and H1 that accurately describe the purpose of the page. This helps to demonstrate to Google the relevance of the page that you want to rank, as well as helping to avoid any keyword cannibalisation issues. If you have identified some quick win keywords to target, then you can use this research to prioritise which pages to optimise first.
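
As a sketch (the keyword and store name are hypothetical), a unique, keyword-targeted title tag with a matching H1 might look like:

```html
<!-- Hypothetical category page targeting "men's running shoes" -->
<head>
  <title>Men's Running Shoes | ExampleStore</title>
</head>
<body>
  <h1>Men's Running Shoes</h1>
</body>
```

Keeping the title and H1 aligned on one clear keyword theme per page is what helps avoid cannibalisation between similar pages.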

Keyword targeted title tags for Codecademy pages
Screenshot of webpage outline taken using the Web Developer Chrome extension

Review your backlink profile

Receiving links from other websites is great, and can be a powerful signal of your website’s expertise, authority, and trustworthiness. However, if you’re receiving a lot of links from poor quality, spammy websites, it could be tarnishing how Google views your site’s reputation. Tools like Ahrefs and Kerboo are great for identifying these bad links that might be harming your site.

Distribution of backlinks segmented by URL rating from Ahrefs

A few examples of poor placements that you might want to get rid of:

– Paid for/sponsorship placements

– Low quality guest blogging

– Low quality directories

There are many more and additional examples can be found here – provided by Google.

Once you’ve identified the links you think are harmful you can submit these via Google’s Disavow Tool, which tells Google that you don’t want your site to be associated with them.

Audit your page speed

Page speed is a known ranking factor and, although you might not think it, there are some quick changes you can make to improve your load times. Page speed issues often relate to imagery, so make sure you use a suitable image compression tool and serve your images as efficiently as possible. Run your pages through the PageSpeed Insights Tool to get bespoke recommendations for your website.
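
Beyond compression, HTML itself offers quick wins for image delivery: responsive sizes let the browser pick an appropriately small file, and native lazy loading defers offscreen images (filenames here are hypothetical):

```html
<!-- Serve a smaller file on small screens; defer loading until near the viewport -->
<img src="product-800.jpg"
     srcset="product-400.jpg 400w, product-800.jpg 800w"
     sizes="(max-width: 600px) 400px, 800px"
     alt="Product photo"
     loading="lazy">
```

These are one-line template changes on most sites, which is what makes them a quick win rather than a rebuild.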

Opportunities to improve page speed taken from Google’s PageSpeed Insights Tool

Fix broken links and internal redirects

Not only do broken links affect the quality of experience that users have when using your website, they can also impede Googlebot from successfully crawling your site. Tools like Screaming Frog and DeepCrawl will quickly identify instances of 404 errors for you to fix. You should also look to update any crawled links which are 301 redirecting – you can find the location of these links easily in Screaming Frog by navigating to the Response Codes area, then viewing the Inlinks tab.

Inlinks with 301 status codes identified via Screaming Frog
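
The same triage can be scripted once you have a crawl export; this sketch (the rows are made-up example data) buckets links by status code so 404s can be fixed and 301-ed internal links repointed at their final destination:

```python
def triage_links(crawl_rows):
    """Group (source page, linked URL, status code) rows for follow-up."""
    broken = [(src, url) for src, url, code in crawl_rows if code == 404]
    redirected = [(src, url) for src, url, code in crawl_rows if code == 301]
    return broken, redirected

# Example rows, shaped like a crawler's response-code export
rows = [
    ("/blog/post-1", "/old-product", 301),
    ("/blog/post-2", "/missing-page", 404),
    ("/", "/category/shoes", 200),
]
broken, redirected = triage_links(rows)
```

Fixing the `broken` list removes dead ends for users and Googlebot; updating the `redirected` list saves a redirect hop on every internal click.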

Update internal linking to showcase high performing products

If you know you have products that sell well, be sure to highlight these on your website. Consider adding a ‘Bestsellers’ page or simply update your internal linking to showcase these products more prominently.

To work out which hidden gems you might be sat on, take a look at the Product Performance tab in Google Analytics to see which items perform the best. If you want to be really smart about it, match up the URLs of these products with click depth data from DeepCrawl to see which products are buried deep in your website. We’re working on a tool that will do the heavy lifting for you on this, so check back for an update soon!
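
The matching itself is a simple join; this sketch (with invented revenue and depth figures) flags products that sell well but sit several clicks from the homepage:

```python
# Illustrative exports: product revenue (e.g. from Google Analytics)
# and click depth (e.g. from a site crawler)
revenue_by_url = {
    "/products/widget-a": 12000,
    "/products/widget-b": 9500,
    "/products/widget-c": 300,
}
click_depth = {
    "/products/widget-a": 2,
    "/products/widget-b": 5,
    "/products/widget-c": 6,
}

# Bestsellers buried four or more clicks deep: candidates for more
# prominent internal links or a 'Bestsellers' page
buried_bestsellers = [
    url for url, revenue in revenue_by_url.items()
    if revenue >= 5000 and click_depth.get(url, 0) >= 4
]
```

Any URL this surfaces earns revenue despite being hard to reach, so promoting it in your navigation or internal links is a low-effort, high-return change.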

Finding CRO opportunities using a crawler

Custom extraction using XPath allows you to pull in additional data about each of the pages that you crawl. This can give you really valuable insights to help improve the conversion rate of your existing traffic. For example, you can spot which of your products are out of stock or only available for pre-order, so you might not want to focus your efforts on optimising these products. Marrying this data up with information on which pages have the most traffic and engagement will allow you to see where the biggest demand from your customers is, and you can rise to meet that demand before your competitors do.

Product availability details crawled using custom extraction in Screaming Frog
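
In Screaming Frog this is configured as an XPath expression (for example, `//span[@class="availability"]` against a page where availability is marked up that way). The same extraction can be sketched with the Python standard library, assuming well-formed markup and a made-up product page snippet:

```python
import xml.etree.ElementTree as ET

# Made-up, well-formed product page snippet for illustration
page = """<html><body>
  <h1>Widget A</h1>
  <span class="availability">Out of stock</span>
</body></html>"""

root = ET.fromstring(page)
# Equivalent of the crawler's custom extraction: pull the availability text
status = root.find(".//span[@class='availability']").text
```

Run across a full crawl, this kind of extraction gives you a stock-status column next to every URL, ready to join with traffic data.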

There are loads of use cases where custom extraction can provide you with additional information to make data-driven decisions about which products or services you prioritise in the short term. To learn more about this, we really recommend checking out Luke Carthy’s SearchLove presentation on Finding Powerful CRO and UX Opportunities Using SEO Crawlers.

Conclusion

Ultimately, every business will have to make a unique judgement call about how to handle the next few months. All of the actions we have discussed were important before the Covid-19 outbreak, but they are especially so now. We hope that, using the techniques above, you’ll be able to access valuable data to make informed decisions about how to improve your organic presence and make intelligent tweaks to your website to best serve your users’ needs.

Let us know in the comments if you have any other recommendations for spotting quick wins over the coming weeks!