How to deal with Rankbrain and AI in SEO

Lionel starts with a quick reminder of how a search engine works – discover, organise and rank. AI provides real-time re-ranking. Doing this is resource-intensive: Google reportedly consumes as much energy annually as the city of San Francisco.

Crawl budget is extremely important to Google – how much will it cost to crawl your website’s pages?

How do you organise a crawl effectively?

Prioritise URLs according to your important pages. Schedule important pages first. Optimise the way that Google will discover and crawl those pages. Desktop and mobile crawlers are not necessarily going to discover your pages in the same way.

Analyse your log files to discover how frequently your site is being crawled. It is much better to use these than third-party tools and data. Create alerts when crawl frequencies drop or increase. Log files are the only accurate source of data.
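As a rough illustration of the kind of alerting Lionel describes, the sketch below counts Googlebot hits per day from standard access-log lines and flags days that fall below a baseline. The log lines, paths and threshold are invented for the example; a real setup would read your server's actual logs and pick a baseline from historical data.

```python
import re
from collections import Counter

# Hypothetical access-log lines (combined log format); the URLs and
# IPs here are made up for illustration.
LOG_LINES = [
    '66.249.66.1 - - [10/Sep/2018:10:00:01 +0000] "GET /products/widget HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Sep/2018:10:05:42 +0000] "GET /blog/post-1 HTTP/1.1" 200 1024 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [10/Sep/2018:10:06:00 +0000] "GET /products/widget HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]

def googlebot_hits_per_day(lines):
    """Count requests whose user-agent mentions Googlebot, keyed by day."""
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        # Pull the date portion (dd/Mon/yyyy) out of the timestamp field.
        match = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
        if match:
            hits[match.group(1)] += 1
    return hits

hits = googlebot_hits_per_day(LOG_LINES)

# A simple alert when daily crawl frequency drops below a baseline.
BASELINE = 1  # assumed threshold; derive yours from historical logs
for day, count in hits.items():
    if count < BASELINE:
        print(f"ALERT: only {count} Googlebot hits on {day}")
```

In practice you would also verify that "Googlebot" requests really come from Google's IP ranges, since the user-agent string alone can be spoofed.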

Algorithms use variables to decide how often to crawl, but what are they? They decide to crawl based on technical quality, popularity, semantic scores and user behaviours.

Internal metrics like page quality make a difference, as do external metrics like backlinks.

Once indexed, Google scores your pages to rank them – we know that various factors affect this.

Artificial Intelligence is involved at multiple stages of the process. Machine learning is an iterative process used to improve AI functionality.

What about RankBrain?

A query will match with Google’s knowledge base, but RankBrain determines how satisfied users are with the results that are returned. If users don’t appear to be satisfied, RankBrain will reshuffle the rankings to try and improve satisfaction.

Help Googlebot understand what your site is talking about and help it to see the right pages in the right places.

Page Importance is a query-independent score based on the quality of information within a certain page. Look at the patent to find out more, but it’s basically a score for each and every page. One factor is Google’s PageRank metric.

Internal PR, sitemap inclusion, the file type of the page, quality and spread of anchor text, number of words, duplication ratio and the importance of the parent page are all key factors in determining page importance.

Google doesn’t like to keep digging too far into a site, so you need to position the most important sections of the site closer to the home page.
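To see how deep Google has to dig, you can measure each page's click depth from the home page with a breadth-first search over your internal links. This is a minimal sketch using an invented link graph; a real audit would build the graph from a crawl of your site.

```python
from collections import deque

# Hypothetical internal-link graph: each page maps to the pages it
# links to. These URLs are made up for illustration.
SITE = {
    "/": ["/products", "/blog"],
    "/products": ["/products/widget"],
    "/blog": ["/blog/post-1"],
    "/products/widget": [],
    "/blog/post-1": ["/blog/post-1/comments"],
    "/blog/post-1/comments": [],
}

def click_depths(graph, home="/"):
    """Breadth-first search from the home page: each page's depth is
    the minimum number of clicks needed to reach it."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for linked in graph.get(page, []):
            if linked not in depths:
                depths[linked] = depths[page] + 1
                queue.append(linked)
    return depths

depths = click_depths(SITE)
# Pages deeper than 2 clicks are candidates for better internal linking.
too_deep = [page for page, depth in depths.items() if depth > 2]
```

Anything that lands in `too_deep` is a candidate for a link from a section page, or from the home page itself.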

We also know that higher word counts improve both rankings and Google’s crawl ratio.

We should also pay attention to internal PageRank, of which internal links are an important part. We can use Search Console to give us some information to help us see what Google might consider to be more important.
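Internal PageRank can be approximated yourself: the classic PageRank power iteration, restricted to your own internal links. The sketch below uses an invented four-page link graph; the damping factor and iteration count are conventional defaults, not anything Google has confirmed for its own systems.

```python
# Hypothetical internal-link graph: page -> pages it links to.
LINKS = {
    "home": ["products", "blog"],
    "products": ["home", "widget"],
    "blog": ["home", "widget"],
    "widget": ["home"],
}

def internal_pagerank(graph, damping=0.85, iterations=50):
    """Simplified PageRank over internal links only (power iteration).
    Each page shares its rank equally among its outlinks each round."""
    pages = list(graph)
    n = len(pages)
    ranks = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_ranks = {page: (1 - damping) / n for page in pages}
        for page, outlinks in graph.items():
            if not outlinks:
                continue
            share = ranks[page] / len(outlinks)
            for target in outlinks:
                new_ranks[target] += damping * share
        ranks = new_ranks
    return ranks

ranks = internal_pagerank(LINKS)
```

In this toy graph the home page ends up with the highest internal PageRank, because every other page links back to it – which is exactly why linking your priority pages from the home page matters.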

Google is also sensitive to click-through rates (CTR) and bounce rates (BR). Lower BR = more bot hits. Better CTR = better ranking positions.

Remember, priority pages should be linked 1-2 clicks away from the home page – no more! Focus on the content and the way it’s being delivered. Your page needs to be informative, developed and FAST. Page performance is key, so do whatever you can to improve it.

We know that Google classifies intent as navigational, transactional or informational – and it behaves differently for different intents. For transactional queries, work on brand, product info and informative content on product pages.

Informational pages need to include a lot of good linking signals and clear named entities.

Navigational queries are served with good semantic optimisation, as well as trust and citation flow signals.

Word embeddings are a language modelling and learning technique that forms part of the foundation of RankBrain. They are concerned with the relationships between words, which are mapped onto mathematical vectors. Each entity and piece of content is vectorised, which is a way of computing these verbal relationships, and Google can then evaluate the distance between two concepts. For each entity, Google knows all the context it should expect to see around it: what words are normally used with it, what sentences it appears in, and so on.
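The "distance between two concepts" idea can be shown with cosine similarity over toy vectors. The three-dimensional vectors below are invented purely for illustration – real embeddings have hundreds of dimensions and are learned from huge text corpora – but the distance calculation works the same way.

```python
import math

# Toy word vectors, made up for illustration. Real embeddings are
# learned from text, not hand-written.
VECTORS = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: close to 1.0 means
    the words appear in very similar contexts."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

royal = cosine_similarity(VECTORS["king"], VECTORS["queen"])
fruit = cosine_similarity(VECTORS["king"], VECTORS["apple"])
```

Here `royal` comes out much higher than `fruit`, mirroring how a search engine can judge that "king" and "queen" are semantically closer to each other than either is to "apple".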

Google uses concepts to understand concrete queries.

You need to look at semantically related content on your site and how it’s interlinked.

In summary…

Crawling, indexing, ranking and reranking are all based on ML principles.

Never forget that AI models are still algorithms. You can influence them if you know the signals they are looking for. Remember to keep monitoring your crawl budget through your logs as well!

Make users want to come back by improving CTR and reducing bounce rate.

This post is one of 28 in our Brighton SEO 2018 collection

Ben Garry

Content Specialist

Ben is a Senior Content Specialist in Impression’s SEO team. He helps clients stand out through on-page optimisation and original, high-quality content. In his spare time, Ben can usually be found playing a board game or reading a comic.

Ben has specialist knowledge in SEO and On Site Content.