September was a busy month for Google and the search industry. We had a Google search algorithm update, Ahrefs released a new free tool, and there were plenty of other announcements from Google that have affected the search landscape and digital marketing industry.

Read on for a breakdown of the biggest updates this month and what this means for you.

September Google Search Algorithm Update

There were signs of a Google search algorithm update on the 23rd of September. What we saw was a rise in the number of unrelated results for queries, particularly on mobile. Some sites also saw a significant drop in organic traffic and rankings.

In the below screenshot from SEMrush, we can see that there was a spike in SERP volatility around the 23rd and 24th of September.

Source: SEMrush Sensor

Indexing bugs

Around the same time as the search algorithm update, Google confirmed that there were two indexing issues that had impacted a small number of URLs (0.22% of its index). The two issues, outlined below, meant that some pages were not indexed for a period of time and were not appearing in Google’s search results – or, if they were appearing, a completely wrong URL was being shown in the search results for a given query.

  • The first issue involved mobile indexing: if your page had previously been indexed, it may not have been appearing at all in Google’s mobile search results (affecting 0.2% of Google’s index)
  • The second issue involved canonicalisation: Google was selecting completely different URLs for pages that already had a canonical tag (affecting 0.02% of Google’s index)

In the below Twitter thread, Google Search Liaison confirmed that the issues had been fixed as of 3rd October 2020.

Source: Google SearchLiaison’s Twitter

What this means for you

  • You or a client might have seen rankings and traffic take a steep drop during the last two weeks of September, which may have been influenced by Google’s indexing bugs
  • We would always recommend reviewing your or your clients’ Search Console accounts on a regular basis to monitor any significant changes
  • In this instance, we’d recommend checking your impressions around the 23rd and 24th of September to see if there were any spikes or dips during this time – especially if you’re in the midst of monthly reporting!

Ahrefs launches free Webmaster Tools

In mid-September, Ahrefs launched a free version of their Webmaster Tools (AWT). This tool gives webmasters free access to Site Explorer and Site Audit, but no new functionality that a paid member of Ahrefs would not already have.

In Ahrefs’ words, “there aren’t any new features in AWT at all – we’re just making some parts of Ahrefs available for free.”

The free SEO tool gives you access to the keywords that your website ranks for, backlinks to your site, 404 pages, and an audit of your internal links and page structure.

For existing customers, Ahrefs has outlined the benefits that using AWT has for them, including:

  • Unlimited number of “verified” projects on your dashboard
  • Additional “per project” limits for verified websites (such as an additional 5,000 free monthly crawl credits)
  • Lite members can see the keywords their websites are ranking for in the top 100 positions (rather than just 20), with verified ownership
  • Javascript rendering available for Lite and Standard plans
  • Google Search Console verification

What this means for you

  • When Ahrefs announced the launch of the tool, there was plenty of controversy and discussion around its implications, particularly the ability to aggregate verified sites’ keyword ranking data to the benefit of paying customers
  • We would recommend reading this review of Ahrefs Webmaster Tools by Internet Folks before using the tool

Advancing Natural Language Processing with pQRNN

In the last decade, Natural Language Processing (NLP) and other speech applications have been transformed by deep neural networks. With machine learning ever-evolving to meet new demands within the market and to better understand consumers, Google AI has released an extension to its PRADO model.

The PRADO model architecture learns clusters of text segments from whole words, as opposed to word pieces or characters. The pQRNN extension to PRADO is built from three components: a projection operation that converts tokens in the input text into a sequence of ternary vectors, a dense bottleneck layer that learns a per-word representation relevant to the NLP task at hand, and a stack of QRNN encoders that learn a contextual representation from the input text alone, without requiring any preprocessing. A visualisation can be seen below.

Source: Google AI Blog

The new pQRNN architecture has been reported to nearly achieve BERT-level performance despite having 200,000 fewer parameters per token and being trained on only supervised data. To view the research behind this NLP model, Google AI has open-sourced their PRADO model on GitHub here.
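Part of what makes QRNN encoders fast is that they replace the heavy matrix-multiply recurrence of an LSTM with a simple gated blending step. As a rough illustration only – this is a minimal pure-Python sketch of the QRNN “f-pooling” recurrence, not Google’s released code, and the toy inputs are invented:

```python
def qrnn_f_pooling(z, f):
    """One QRNN "f-pooling" pass: h_t = f_t * h_{t-1} + (1 - f_t) * z_t.

    z: list of candidate vectors (one per timestep); in a real QRNN these
       come from a causal convolution over the input token representations.
    f: list of forget-gate activations in (0, 1), same shape as z.
    """
    hidden = []
    prev = [0.0] * len(z[0])
    for z_t, f_t in zip(z, f):
        # Each hidden unit blends its previous state with the new candidate.
        prev = [ft * p + (1.0 - ft) * zt for ft, p, zt in zip(f_t, prev, z_t)]
        hidden.append(prev)
    return hidden

# Toy run: 3 timesteps, 2 hidden units, constant forget gate of 0.5
states = qrnn_f_pooling(
    [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]],
    [[0.5, 0.5]] * 3,
)
print(states[-1])  # → [3.375, 4.25]
```

Because the gates are computed by convolutions up front, the only sequential work is this cheap elementwise loop, which is why the architecture suits on-device use.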

What this means for you

  • As NLP models continue to advance, this is an exciting opportunity for those who work in search
  • Because pQRNN is small enough to run on-device, it helps protect and maintain user privacy, eliminates network latency and heavily reduces overall operational costs

Google Adds a “Pickup Later” Feature For Local Inventory Ads

With the vast shift in consumer behaviour as we enter the “new normal”, Google has seen a rise in people researching local shops for stock availability prior to their trip. It’s clear that consumers are looking for real-time, accurate information during this time – information that also enhances the safety of those visiting stores and restaurants.

To provide a more relevant experience for users, Google has started to roll out “pickup later” options for local inventory ads, helping you to promote products that aren’t available right now but are guaranteed to be available by a certain date.

Source: Google Ads & Commerce Blog

What this means for you

  • This can help a range of businesses: whether you’re a small business or have a global presence, Google is looking to new updates like this to help fulfil its users’ needs.

New schema support for shipping data for e-commerce sites

Google announced in September that it would now support shippingDetails schema.org markup in Google search results. This new structured data allows webmasters to mark up their products with shipping information, including cost and expected delivery times, and achieve rich results.

Google came to this decision after carrying out research showing that users leave the checkout process if they find unexpected shipping costs. The new structured data is designed to tackle this problem and help retailers convert customers more easily.

As Google said in the blog post announcing this development:

Retailers have always been able to configure shipping settings in Google Merchant Center in order to display this information in listings. Starting today, we now also support the shippingDetails schema.org markup type for retailers who don’t have active Merchant Center accounts with product feeds.
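As an illustration of what this markup can look like (the product name, price and shipping values below are invented for the example, not taken from Google’s announcement), shippingDetails sits inside a product’s Offer as JSON-LD:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Trainers",
  "offers": {
    "@type": "Offer",
    "price": "49.99",
    "priceCurrency": "GBP",
    "shippingDetails": {
      "@type": "OfferShippingDetails",
      "shippingRate": {
        "@type": "MonetaryAmount",
        "value": "3.49",
        "currency": "GBP"
      },
      "shippingDestination": {
        "@type": "DefinedRegion",
        "addressCountry": "GB"
      },
      "deliveryTime": {
        "@type": "ShippingDeliveryTime",
        "handlingTime": {
          "@type": "QuantitativeValue",
          "minValue": 0,
          "maxValue": 1,
          "unitCode": "DAY"
        },
        "transitTime": {
          "@type": "QuantitativeValue",
          "minValue": 1,
          "maxValue": 5,
          "unitCode": "DAY"
        }
      }
    }
  }
}
```

The handling and transit times combine to give the expected delivery window shown in search results, so it’s worth checking any markup you add with Google’s Rich Results Test before relying on it.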

What this means for you

  • If you are an online retailer, this new structured data markup could help drive more clicks to your product pages, as well as helping to avoid customers abandoning checkout when they encounter unexpected shipping costs.

Googlebot will start crawling sites over HTTP/2

Google also announced last month that it would begin crawling some sites over HTTP/2. HTTP/2 is the next iteration of HTTP, the protocol used on the internet to transfer data, and this latest version is “more robust, efficient, and faster than its predecessor, due to its architecture and the features it implements for clients (for example, your browser) and servers”, according to Google.

Google will now begin crawling over HTTP/2 for eligible sites that would benefit from the advanced features of the new protocol, such as request multiplexing. This new feature essentially allows Google to crawl websites more efficiently by transferring multiple files at once rather than establishing multiple connections.

If your server still only works on HTTP/1.1, you will not experience any drawbacks in terms of crawling. The new protocol version is not mandatory.

What this means for you

  • If your website supports HTTP/2 and would benefit from being crawled over the new protocol, Google may decide to start using this new version. You do not need to do anything further.
  • If, for whatever reason, you would not like your site to be crawled over HTTP/2, you can return a 421 HTTP status code whenever Googlebot tries to crawl your site over HTTP/2. There is no strong reason for doing this, as there is no evidence that your website would suffer from being crawled on the new protocol, but you can exclude these requests nonetheless.
  • If you would like to upgrade your server to support HTTP/2, you will need to speak with your server administrator or hosting provider.
  • If you would like to test if your server is eligible for HTTP/2, read this blog post by Cloudflare.
  • Once your site becomes eligible for HTTP/2 crawling, you will receive a notification in Search Console letting you know that some of your crawl traffic will happen over HTTP/2.
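If you do decide to opt out, one place to return that 421 is at the web-server level. The snippet below is a rough sketch only – it assumes an nginx server, must sit inside a server block, and would answer all HTTP/2 clients, not just Googlebot, so adapt it to your own stack before use:

```nginx
# Answer any HTTP/2 request with 421 (Misdirected Request),
# which tells Googlebot to fall back to crawling over HTTP/1.1.
if ($server_protocol = "HTTP/2.0") {
    return 421;
}
```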

Has your website been affected by the latest Google search algorithm update? Learn more about the SEO team’s Penalty and Broad Core Algorithm Recovery services today.

Olivia-Mae Foong

SEO Executive

Liv is an SEO Executive at Impression. As a former Fashion and Beauty PR, she now manages the SEO strategies for a few of our clients whilst supporting the Specialists and Strategists in executing strategies for our larger accounts.

Olivia-Mae has specialist knowledge in SEO and On-Site Content.
