After a particularly busy month in March, April was far quieter in terms of search industry updates.
Google published a blog post reporting that user-centric performance metrics improved by up to 20% in 2018 for the slowest one-third of traffic, indicating that webmasters responded positively to the search engine's announcement that page speed would become a ranking factor for mobile searches.
Google also published a new report in Search Console to help SEOs better understand how searchers find content on their site through Google Discover. This new report provides greater insights into the opportunities available for increasing organic traffic without relying on search queries.
On April 5, many SEOs reported cases of Google de-indexing pages, or removing them entirely from its index, without apparent cause; the issue was later revealed to be the result of an indexing bug. Google consequently paused Search Console's index coverage reports until the data was brought back up to date on April 27.
Finally, the search engine giant also confirmed that it had rolled out a broad core algorithm update on April 16. John Mueller, a prominent Webmaster Trends Analyst at Google, revealed that the update aimed to ensure that searchers continue to get the best answers for their queries, highlighting the importance of overall relevancy as a ranking factor.
Google Recognises Improvements in Page Speed
When Google announced in 2018 that page speed was a ranking factor for mobile searches, the world of SEO took note. In early April, Google took to its Webmasters blog to recognise those who have worked to increase page speed. Key highlights included:
- User-centric performance metrics improved by up to 20% for the slowest one-third of traffic in 2018 (by comparison, no improvement was seen in 2017)
- Speeds improved in over 95% of countries, indicating gains across the whole web ecosystem
- 20% reduction in abandonment rate for navigations initiated from search
- Over a billion PageSpeed Insights audits were run for over 200 million unique URLs in 2018.
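The PageSpeed Insights audits mentioned above can also be run programmatically via Google's public PageSpeed Insights v5 API. The sketch below builds a request URL for that endpoint; the example page and API key are placeholders, and actually fetching the URL requires network access:

```python
from urllib.parse import urlencode

# Google's documented PageSpeed Insights v5 endpoint.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_request(page_url, strategy="mobile", api_key=None):
    """Build a PageSpeed Insights v5 request URL for the given page."""
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    return f"{PSI_ENDPOINT}?{urlencode(params)}"

# Placeholder page URL for illustration only.
request_url = build_psi_request("https://example.com/")
# Fetching this URL (e.g. with urllib.request.urlopen) returns a JSON
# body whose "lighthouseResult" section contains the audit scores.
print(request_url)
```

Running the audit on both `mobile` and `desktop` strategies is worthwhile, since Google's page speed ranking factor applies specifically to mobile searches.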
Google Search Console Adds Discover Report
Google announced on April 10 that it had added a new report to Search Console to help publishers better understand how searchers find content on their site through Google Discover. Google said that it hopes “this report is helpful in thinking about how you might optimize your content strategy to help users discover engaging information – both new and evergreen.”
What is Google Discover?
Google Discover is a feature that helps users find breaking news on their favourite topics, such as sports or TV, without needing a query to find it. It was introduced in September 2018 as an update to the existing Google Feed that was launched the year prior, and is currently used by more than 800 million people every month. Users can access Discover via the Google app or the Google.com mobile homepage.
Search Console now includes a dedicated report that provides webmasters with new information on the traffic they receive from Discover.
The Google de-indexing bug
Around April 5, many SEOs reported that Google was de-indexing or removing web pages from its index, meaning those pages were no longer displayed within the SERPs. For many of these websites, the affected pages helped generate conversions and were essential to generating revenue.
When Google finally acknowledged the indexing bug a couple of days after the initial chatter, it advised SEOs to use the URL Inspection Tool to speed up re-submitting these pages to Google's index. However, this was not a practical solution for sites that had lost thousands of pages.
On April 15, Google announced via Twitter that Search Console was affected by this mass indexing issue and that it would pause all index coverage and enhancement reports until the issue was resolved. Without these reports, and with the URL Inspection Tool potentially no longer reflecting the true live status of webpages, many SEOs were left in the dark about the status of their pages. Search Console data was only brought back up to date on April 27, leaving many webmasters without current information on their site's indexation status until then. With the data complete once again, SEOs can now tackle their index coverage head-on, although many still feel that Google could have provided greater clarity while the issue was ongoing.
Although Google's engineers now appear to have resolved the issue, data from Google Search Console covering the bug period of April 9 to April 25 should be considered unreliable. This is because some manual actions disappeared from Search Console without explanation. If a manual action previously appeared in your Search Console account and has not reappeared since the bug, it is likely still having an impact within the SERPs. So, can we rely on this data? Keep an eye out for inconsistencies and take anything from this period with a pinch of salt.
Mid-April Search Ranking Algorithm Update
It was confirmed on April 16 that Google had rolled out another broad core algorithm update, similar to the one released in March. Though many SEOs hypothesised that the update targeted low-quality pages, John Mueller confirmed that recent updates have focused on the relevance of pages.
“[It] might be something where we don’t think your site is exactly relevant for those specific queries. It doesn’t mean it’s a bad site, it’s just not relevant for those specific queries. So that’s something that just happens to a lot of sites over time. They might be really high-quality content but, over time, they’re just not seen as relevant in the overall picture of the web.” – Mueller
In short, it would appear that Google is adjusting websites’ rankings to ensure that searchers continue to get the best answers for their queries. In a Google Webmaster Hangout, Mueller recommended getting objective user feedback on what your site could be doing differently to improve its user experience and targeting.
For example, if a long-form guide on your site has seen a significant drop in rankings, this is not to say that it is of poor quality. It may instead mean that the long-tail search queries around which you have crafted your heading hierarchy are not well targeted to the questions that your content answers. In this case, it would be wise to edit the titles and headings of your content to ensure that users get the exact answers they want from your content, rather than simply targeting the long-tail search queries with the highest search volume.
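Before rewriting anything, it helps to see the heading hierarchy a page actually presents. A minimal sketch using Python's standard library (a hypothetical helper, not a Google tool; the sample HTML is illustrative):

```python
from html.parser import HTMLParser

class HeadingExtractor(HTMLParser):
    """Collects (level, text) pairs for h1-h6 tags in document order."""
    def __init__(self):
        super().__init__()
        self._current = None   # heading tag currently open, e.g. "h2"
        self.headings = []     # e.g. [("h1", "Guide title"), ...]

    def handle_starttag(self, tag, attrs):
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            self._current = tag
            self.headings.append((tag, ""))

    def handle_data(self, data):
        # Accumulate text only while inside an open heading tag.
        if self._current:
            tag, text = self.headings[-1]
            self.headings[-1] = (tag, text + data)

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

# Illustrative markup; in practice, feed the fetched page source.
html = "<h1>Long-form guide</h1><p>intro</p><h2>What does X mean?</h2>"
extractor = HeadingExtractor()
extractor.feed(html)
for level, text in extractor.headings:
    print(level, text.strip())
# → h1 Long-form guide
# → h2 What does X mean?
```

Reviewing the extracted outline side by side with the questions your content answers makes it easier to spot headings written for search volume rather than for the reader.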
Alongside ensuring that your content is relevant to the search queries it targets, Mueller also recommended checking technical details such as Google's ability to crawl and index all of your content properly, as these fundamentals can be impacted when you make small changes to your website.
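As a rough sketch of that kind of technical check, the hypothetical helper below (not an official Google tool) flags two common reasons a page can drop out of the index: a non-200 HTTP status and a meta robots noindex directive. The robots directives checked are the standard ones Google documents:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.extend(
                d.strip().lower() for d in (attrs.get("content") or "").split(",")
            )

def indexability_issues(status_code, html):
    """Return a list of reasons the page may be excluded from the index."""
    issues = []
    if status_code != 200:
        issues.append(f"non-200 status code ({status_code})")
    parser = RobotsMetaParser()
    parser.feed(html)
    if "noindex" in parser.directives:
        issues.append("meta robots noindex directive")
    if "none" in parser.directives:  # "none" implies noindex, nofollow
        issues.append("meta robots none directive")
    return issues

print(indexability_issues(200, '<meta name="robots" content="noindex, follow">'))
# → ['meta robots noindex directive']
```

A fuller audit would also check robots.txt rules, canonical tags, and X-Robots-Tag headers, but even a simple pass like this catches accidental noindex directives introduced by small template changes.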