How to deal with RankBrain and AI in SEO
Lionel starts with a quick reminder of how a search engine works – discover, organise and rank. AI now adds real-time reranking on top of this. All of that is resource-intensive: Google consumes as much energy annually as San Francisco.
Crawl budget is extremely important to Google – how much will it cost to crawl your website’s pages?
How do you organise a crawl effectively?
Prioritise URLs according to your important pages. Schedule important pages first. Optimise the way that Google will discover and crawl those pages. Desktop and mobile crawlers are not necessarily going to discover your pages in the same way.
Analyse your log files to discover how frequently your site is being crawled – they are far more reliable than third-party tools and data. Create alerts for when crawl frequency drops or spikes. Log files are the only accurate source of crawl data.
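The log-file monitoring described above can be sketched in a few lines – a minimal, hedged example assuming Common Log Format access logs (the sample lines and the alert threshold below are invented for illustration):

```python
import re
from collections import Counter

# Sample access-log lines (invented for the example). Real monitoring
# would stream the server's actual log files instead.
LOG = """\
66.249.66.1 - - [21/Apr/2018:10:00:01 +0000] "GET /products/widget HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [21/Apr/2018:10:05:43 +0000] "GET /blog/post HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
10.0.0.5 - - [21/Apr/2018:10:06:00 +0000] "GET /blog/post HTTP/1.1" 200 2048 "-" "Mozilla/5.0"
66.249.66.1 - - [22/Apr/2018:09:00:00 +0000] "GET /products/widget HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def googlebot_hits_per_day(log_text):
    """Count lines per day whose user agent names Googlebot."""
    counts = Counter()
    for line in log_text.splitlines():
        if "Googlebot" not in line:
            continue
        m = DATE_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

def crawl_drop_alerts(counts, threshold=0.5):
    """Flag days where crawl volume fell below `threshold` of the previous day."""
    alerts = []
    days = sorted(counts)  # lexical sort is fine within one month for this sketch
    for prev, cur in zip(days, days[1:]):
        if counts[cur] < counts[prev] * threshold:
            alerts.append(cur)
    return alerts
```

A real setup would also verify Googlebot by reverse DNS, since anyone can spoof the user-agent string.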
Algorithms use variables to decide how often to crawl – but which ones? Crawl frequency is decided based on technical quality, popularity, semantic scores and user behaviour.
Internal metrics like page quality make a difference, as do external metrics like backlinks.
Once indexed, Google scores your pages to rank them – we know that various factors affect this.
Artificial Intelligence is involved at multiple stages of the process. Machine learning is an iterative process used to improve AI functionality.
What about RankBrain?
A query will match with Google’s knowledge base, but RankBrain determines how satisfied users are with the results that are returned. If users don’t appear to be satisfied, RankBrain will reshuffle the rankings to try and improve satisfaction.
Help Googlebot understand what your site is talking about and help it to see the right pages in the right places.
Page Importance is a query-independent score based on the quality of information within a certain page. Look at the patent to find out more, but it’s basically a score for each and every page. One factor is Google’s PageRank metric.
Internal PR, sitemap inclusion, the file type of the page, quality and spread of anchor text, number of words, duplication ratio and the importance of the parent page are all key factors in determining page importance.
Google doesn’t like to keep digging too far into a site, so you need to position the most important sections of the site closer to the home page.
We also know that higher word counts improve both rankings and Google’s crawl ratio.
We should also pay attention to internal PageRank, of which internal links are an important part. We can use Search Console to give us some information to help us see what Google might consider to be more important.
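Internal PageRank as mentioned above can be estimated yourself with a standard power iteration over your internal link graph – a minimal sketch (the site structure below is an invented example, not a real crawl):

```python
def internal_pagerank(links, damping=0.85, iters=50):
    """links: {page: [pages it links to]} -> {page: score}; scores sum to 1."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank across all pages
                share = damping * rank[p] / n
                for q in pages:
                    new[q] += share
            else:  # each linked page gets an equal share of this page's rank
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
        rank = new
    return rank

site = {
    "/": ["/products", "/blog"],
    "/products": ["/", "/products/widget"],
    "/products/widget": ["/products"],
    "/blog": ["/"],
}
pr = internal_pagerank(site)
```

Pages with unexpectedly low scores are good candidates for more internal links from strong sections.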
Google is also sensitive to click through rates and bounce rates. Lower BR = more bot hits. Better CTR = better ranking positions.
Remember, priority pages should be linked 1-2 clicks away from the home page – no more! Focus on the content and the way it’s being delivered. Your page needs to be informative, developed and FAST. Page performance is key, so do whatever you can to improve it.
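The 1-2 clicks rule above is easy to audit: a breadth-first search from the home page gives every page’s click depth. A minimal sketch (the URLs below are invented for illustration):

```python
from collections import deque

def click_depth(links, home="/"):
    """Shortest number of clicks from `home` to each reachable page (BFS)."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

site = {
    "/": ["/products", "/blog"],
    "/products": ["/products/widget"],
    "/blog": ["/blog/post", "/archive"],
    "/archive": ["/archive/2015/old-post"],
}
depths = click_depth(site)
# Pages deeper than 2 clicks are candidates for better internal linking.
too_deep = [p for p, d in depths.items() if d > 2]
```

Run this over a crawl export and any priority page landing in `too_deep` needs a link from somewhere closer to the home page.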
We know that Google classifies intent as navigational, transactional or informational – and behaves differently for each. For transactional queries, work on brand, product info and informative content on product pages.
Informational pages need to include a lot of good linking signals and clear named entities.
Navigational queries are served with good semantic optimisation, as well as trust and citation flow signals.
Word embedding is a language-modelling and learning technique that forms part of the foundation of RankBrain. It is concerned with the relationships between words, which are mapped onto mathematical vectors. Each entity and piece of content is vectorised – a way of computing these verbal relationships – and Google can then evaluate the distance between two concepts. For each entity, Google knows all the context it should expect to see around it: what words are normally used with it, what sentences it appears in, and so on.
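The distance-between-concepts idea above can be illustrated with cosine similarity between vectors. A toy sketch – the 3-dimensional vectors below are made up for the example (real embeddings have hundreds of dimensions learned from text):

```python
import math

# Invented toy vectors: related terms point in similar directions.
VECTORS = {
    "laptop":   [0.9, 0.8, 0.1],
    "notebook": [0.85, 0.75, 0.2],
    "banana":   [0.1, 0.05, 0.9],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def closest(term, vocab):
    """Return the other vocabulary term whose vector is nearest to `term`'s."""
    return max((w for w in vocab if w != term),
               key=lambda w: cosine_similarity(vocab[term], vocab[w]))
```

Here "laptop" and "notebook" score close to 1.0 while "laptop" and "banana" do not – the same mechanism, at scale, lets a model treat them as the same concept in different wording.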
Google uses concepts to understand concrete queries.
You need to look at semantically related content on your site and how it’s interlinked.
Crawling, indexing, ranking and reranking is all based on ML principles.
Never forget that AI models are still algorithms. You can manipulate them if you know the signals they are looking for. Remember to follow crawl budget with your logs as well!
Make users want to come back by manipulating CTR and BR.
This post is one of 28 in our Brighton SEO 2018 collection
- Brighton SEO: Christoph C. Cemper – 20 Free SEO Tools You Should be Using
- Brighton SEO – Ways to definitely get links for your business
- Brighton SEO: 5 Truths The Gurus Won’t Tell You About Facebook Ads
- Brighton SEO: Alex Major – Comparison Shopping: The Future of Google Shopping Ads
- Brighton SEO Keynote – Live Google Webmasters Hangout with John Mueller & Aleyda Solis
- Brighton SEO – Killing giants and competing in the SERPs
- Brighton SEO: Jeroen Maljers – Hidden Messages: The Psychology Behind PPC & SEO
- Brighton SEO: Arianne Donoghue – The PPC Automation Revolution Is Coming
- Brighton SEO: Laura Hogan – Big Links for £0
- Brighton SEO: Nichola Stott – Speed metrics in context of the UK Top 5,000 websites
- Brighton SEO 2018: We need to talk about competitor campaigns
- Brighton SEO: Bastian Grimm – Web Performance Madness: Critical Rendering Path Optimization
- Brighton SEO: Rob Bucci – Featured Snippets From Then To Now, Volatility, & Voice Search
- BrightonSEO 2018: Fili Wise – Optimising for SearchBot
- Advanced & Practical Structured Data
- Brighton SEO: Gavin Bell – Amplifying Your Content With Facebook Ads
- Brighton SEO 2018: Craig Campbell – Risks and Rewards of PBNs
- Brighton SEO: Chelsea Blacker – Taming the Wild West of ASO
- Brighton SEO: George Karapalidis – Using machine learning and statistical models to predict revenue potential for search
- Brighton SEO: Barry Adams – Technical SEO in the Mobile First Indexing Era
- Brighton SEO: Kaspar Szymanski – Understanding Google Penalties by ex-Googler Kaspar Szymanski
- Brighton SEO: Mark Thomas – How much positive impact can crawl budget optimization have in a mobile first index era?
- Brighton SEO: Chris Liversidge – Using Machine Learning Technology To Build Audience-Led Analytics
- Brighton SEO: Emily Mace – Diagnosing Common Hreflang Tag Issues On Page & In Sitemaps
- Brighton SEO: Steve Rayson & Giles Palmer – How Metrics and Data Drive Advocacy Effectiveness
- Brighton SEO: Tom Anthony – Diving Into HTTP/2 – A Guide For SEOs
- Brighton SEO: Tom Pool – Command Line Hacks For SEO
- Brighton SEO: Eleni Cashell – How to Unleash The Power Of Unique Content