Imagine your website ranks 3rd in its respective SERP (search engine results page) vertical. Your page visits, conversions, and sales are through the roof, and achieving your long-term business goals is in sight; it’s all blue skies and sunshine on the horizon. But then overnight, *ominous music plays* Google unveils a new algorithm update, within 24 hours your site falls from grace, and you can’t find it on the SERPs. You’re devastated, with crushed dreams and no flicker of hope for online presence recovery. ‘But wait! There is hope after all’, your SEO specialist declares confidently. They explain that it is in fact not the end of the world as you know it, but simply the result of a Google core algorithm update. With a few SEO strategy changes that align with the new algorithm, recovery is possible, nay, achievable.
This scenario isn’t quite as fictional as one may think. In fact, Google openly states how it typically releases one or more algorithm changes every day to improve its user experience. Some are major (perhaps not as dramatic as the aforementioned scenario…), but most are usually very minor. However, as digital marketers, it’s important to have a holistic view of what’s going on over at Google HQ.
There are several resources online that allow you to keep up with Google algorithm updates. At Impression, we do this by following Google Search Liaison on Twitter, reading Moz’s Google Algorithm Change History resource and by periodically checking in on Algoroo. While keeping up with these resources may help you navigate new algorithm changes, here are five recent updates you might have missed.
1) Mobile-first index; the move to mobilisation
Smartphone browsers, including those on iPhone and Android, can interpret a broad range of HTML5 and other code specifications, with the most significant differences from standard desktop display being a smaller screen and a vertical (portrait) orientation. Since the release and mass adoption of Apple’s iPhone in 2007, mobile SEO optimisation has become increasingly important for all websites. Common strategies to accommodate this development, and adapt to Google’s mobile-first mentality, include (ideally) updating to a responsive layout that will conform to any device display and taking the mobile-friendly test to confirm that your site is up to scratch. In 2015, mobile search queries exceeded computer-based searches, with mobile commerce transactions following suit. As more users than ever are turning to their phones instead of their desktops, Google has taken note and proceeded with the following updates to best follow the trend.
In 2014, Google began labelling sites that were mobile-responsive, in an attempt to enhance the on-the-go user’s experience. This meant that next to every result in a vertical SERP, one could easily identify which sites could be accessed with the same level of usability on a non-desktop device. However, less than two years later Google retired this feature, as over 85% of pages in the mobile search results had updated and improved their mobile experience, and removing the label in turn decluttered organic results. As mobile-friendly labelling was Google dipping its toes in the pool of mobile-optimised search algorithm possibilities, more mobile-focused updates were soon to follow and destined to create a splash (Duh, duh, duhhhh).
In 2015, Mobilegeddon (aka mobilepocalypse, mopocalypse or mobocalypse) was the ominous and exaggerated epithet for Google’s announcement and subsequent move to use mobile-friendliness as a ranking factor in mobile SERPs. This wasn’t a shock to professionals on the pulse, but it did mean that websites large and small had to optimise – and fast. The update was said to impact only ranking potential on mobile searches, to apply to individual pages rather than entire websites, and to be released globally, covering all languages. Digital marketers and webmasters across all industries were to be affected, whether their sites dealt in clothing and e-commerce or were even a Chuck Norris fan page; Mobilegeddon was afoot.
Yet, most can agree that it wasn’t, in fact, the end of the world as prophesied. Google gave professionals ample time to update their sites and get on the bandwagon, creating a mobile-friendly checker tool so that you could evaluate your site, even giving tips and tricks on how to optimise further. The update itself was rolled out slowly, not really coming to fruition until several months later, so as not to break the internet by disregarding all previously indexed desktop-friendly sites.
The mobile-first index means that Google will crawl, index, and rank sites based on their mobile experience instead of their standard desktop experience. The mobile-friendliness of a site goes beyond information architecture: factors like mobile page speed, load times, internal links, and mobile-optimised dynamic elements must also be considered. To see how this is affecting your site, monitor your log files for spikes in Smartphone Googlebot crawl traffic, and check whether the cached version of your site is the mobile-friendly version. Search Console will also inform you when mobile-first indexing has been enabled for your site.
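As a rough illustration of that log-file check, the sketch below tallies smartphone versus desktop Googlebot hits in raw access-log lines. It is a minimal example, assuming a plain-text server log where each line includes the full user-agent string; the function name and log path are hypothetical, so adapt both to your own setup.

```python
import re
from collections import Counter

# Googlebot Smartphone identifies itself with both "Android" and
# "Googlebot" in its user-agent string; desktop Googlebot does not.
SMARTPHONE_UA = re.compile(r"Android.*Googlebot|Googlebot.*Android")
ANY_GOOGLEBOT = re.compile(r"Googlebot")

def googlebot_crawl_counts(log_lines):
    """Count smartphone vs. desktop Googlebot hits in raw access-log lines."""
    counts = Counter(smartphone=0, desktop=0)
    for line in log_lines:
        if SMARTPHONE_UA.search(line):
            counts["smartphone"] += 1
        elif ANY_GOOGLEBOT.search(line):
            counts["desktop"] += 1
    return counts

# Example usage (hypothetical file name):
# with open("access.log") as f:
#     print(googlebot_crawl_counts(f))
```

A rising smartphone count relative to desktop over time is one informal signal that mobile-first crawling has picked up your site.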
Google’s new filing system doesn’t mean that it will only index pages that are mobile-friendly, but rather that it will rank them higher than sites without a mobile experience, slowly under-rewarding desktop-only sites that don’t get with the program. Think of it as ‘spring cleaning’. What do you do when you find something you haven’t worn for a year and is now out of style? The same as when Google finds a site that hasn’t been updated in a year and doesn’t conform to mobile display: throw it to the back of the closet and hope it comes back in style (or, in Google’s case, becomes mobile-friendly).
2) Panda; Command with Creative Content
Since Google’s Panda update in 2011, a shift towards valuing creative and comprehensive content over pandering (or should I say panda-ing), keyword-focused content has been underway. This algorithm penalises sites that feature thin, duplicate, or machine-generated content. To appeal to it, website content should instead focus on quality and substance, differentiate itself from similar sites, and avoid keyword cannibalisation and over-optimisation. Unique content doesn’t just help your page rank; it also increases user understanding, decreases bounce rate, and ultimately builds more trust, as well as the likelihood of being shared on social media.
3) Hummingbird; Google’s Search for meaning in a world of keywords
In 2013, Google’s Hummingbird algorithm introduced semantic search, beginning the long transition from keyword-centric searching to a more complex search for meaning (how deep, Google). This manoeuvre of discerning implicit user intent beyond explicit keywords doesn’t discredit keyword research as an SEO strategy, but merely pushes it down the list of priorities for any digital marketing professional, demoting it to more of a guide. Upon Hummingbird’s launch, Matt Cutts, then head of Google’s webspam team, stated it was to affect 90% of all Google searches, with many updates incorporated since, most notably Penguin.
You can read more about Google Hummingbird here.
4) Penguin; Link Quality over Quantity
Before Google’s Penguin update in 2012, ‘link schemes’, aka backlinking in bulk to manipulate PageRank, could successfully get a web page to the top of the SERPs. This meant that people could essentially pay their way to the top of Google using cheap and unrelated links, but this is no longer the case. Today, sites that are caught partaking in this scam will face harsh penalties, including being dropped from SERPs with a steep ladder to climb back to their previous standing. Acquiring quality links is now more in line with general promotion and digital PR methodologies: building real relationships with publishers, researching where your audience spends time across the web, and contributing relevant content to authoritative and relevant domains.
Penguin 1.0 – webspam algorithm update
In 2012, Google was already fighting a battle against low-quality content and launched an update to penalise and prevent the malpractice of inflating a site’s rank through blackhat link-building. Matt Cutts explained, “It started out with Panda, and then we noticed that there was still a lot of spam and Penguin was designed to tackle that.” It was only after the update that the algorithm change was named ‘Penguin’, and despite multiple theories, it has never been publicly stated where the name originated (Darn Google and their secrets…). This initial sweep affected only 3% of all search queries.
The second update rolled out about a year later and took a more technical approach, auditing links on deeper pages. Webmasters who had found loopholes in the initial Penguin sweep got the full brunt of Google’s second sweep. This update reached 2.3% of all search queries. Webmasters: 1, Google: 2.
In 2014, Google ran a reset of the update to allow recovery for those who had adapted, and misery for those who had previously gotten away with the malpractice. This update affected less than 1% of search queries and was relatively non-impactful across the web.
Finally, in 2016, the final and largest Penguin update took flight (despite being given a misnomer…), and was incorporated as a feature within the core ranking algorithm. This means that backlinking shortcuts are monitored around the clock, and can no longer have a positive effect on any site (hooray, justice is served!). As the Penguin algorithm served to prevent previous blackhat SEO practices, newer updates incorporated into the ranking algorithm looked to the future – such as RankBrain.
5) RankBrain; Artificial Intelligence
Last but not least, this final update falls under the Hummingbird algorithm revamp, adding to the overall goal of interpreting, understanding, and conforming to user intent. Instead of using the formula of a more basic search query algorithm like PageRank, RankBrain incorporates a shiny new interpretation model (which also considers variables such as the user’s location, personalisation, keywords, and search history) to discern true intent and provide the most relevant results.
But how does this work, you ask? Well, sadly, it’s not magic. Google HQ still hasn’t quite managed to make miracles (…yet). RankBrain instead uses AI (artificial intelligence), combining machine learning from existing data with learning from the user. RankBrain is essentially learning, growing, and becoming more specialised with every search; it has already been disclosed as the 3rd most important ranking signal (behind links and content, of course), despite only being around since 2015.
So, how do you use this to your advantage as a digital marketer? Checking the overall health of your site (visitors, CTR, bounce rate, etc.) is key, as is determining whether your page needs a redesign. Along with this, make sure the front end of your page includes value-adding, eye-catching and relevant content. Still not convinced your page has a competitive advantage? Try putting yourself in the shoes (or rather, at the keyboard) of the user. What do you search for, and what’s your intention? Remember, it always comes back to the user.
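For anyone unsure how those health metrics are derived, the snippet below shows the standard CTR and bounce-rate calculations. The raw counts are made-up placeholders, purely for illustration; swap in the figures from your own analytics platform.

```python
# Hypothetical site-health figures - replace with your own analytics data.
impressions = 12000          # times your pages appeared in the SERPs
clicks = 480                 # clicks through to your site from those SERPs
sessions = 450               # total visits to the site
single_page_sessions = 180   # visits that left after viewing one page

# CTR = clicks / impressions; bounce rate = one-page visits / total visits.
ctr = clicks / impressions
bounce_rate = single_page_sessions / sessions

print(f"CTR: {ctr:.1%}")                  # 4.0%
print(f"Bounce rate: {bounce_rate:.1%}")  # 40.0%
```

A low CTR can hint that your SERP snippet isn’t matching searcher intent, while a high bounce rate suggests the landing page itself isn’t delivering on it.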
To put it simply, SEO strategies are constantly evolving. As previous industry hacks like bulky backlinking and keyword padding become ineffective and penalisable in favour of tactics that focus on the quality of content, links, and display adaptation, following the latest SEO trends becomes increasingly important. With 1 out of 5 search queries now happening via voice command, AI (artificial intelligence) advancements in search processing, and other digital developments underway, many more SEO changes are sure to follow; and the saga continues.