Three steps – audit the mobile site, optimise crawl budget, and grow organic revenue.
The second stage is mostly aimed at sites with 10,000+ pages.
Mark introduces a case study of a site with AMP already implemented across it. The site has already switched to mobile-first indexing and carried out crawl budget optimisation (CBO) afterwards. CBO has been referred to a lot, and it is still worth pursuing post-mobile-first.
Between the ‘internet’ and the SERPs there is a series of steps including crawling, parsing and indexing. The URL universe Google looks at is not just the set of pages on your site today: historic URLs can still be in the equation somewhere along the line – a migration is not the end of the story.
Next is the principle of crawl budget: each site has a time-bound allocation determined by crawl demand, which comes from interest in the site, and by crawl rate limits – practical constraints on how often Google can request information. Every site’s budget is finite, so it has to be used effectively.
Google has told us that the mobile index will become the preferred index going forward. The crawler Google uses in future will be a mobile crawler that requests the mobile version of the page; until now the index has been informed by the desktop crawler. The mobile crawler will be used around 80% of the time.
A key takeaway: we should be crawling our own sites with a mobile user agent from now on.
However, sites are being evaluated and transitioned individually and gradually. Read between the lines and you see that every single SERP is an experiment. In the mid-term there will be a mixture of desktop- and mobile-indexed results. The switch might be complete by July, but we can’t be 100% sure.
John Mueller tells us to monitor server logs to see which agent is crawling your site. You can also watch for Google Search Console notifications, which should arrive once you’ve been migrated. Honestly, you should be looking at your server logs far more in general: see where Google is focusing and where the risks lie.
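As a minimal sketch of that log check – assuming your access logs are in the common Apache/Nginx combined format, and using two illustrative sample lines rather than a real log file – you can classify Googlebot hits by user agent:

```python
from collections import Counter

# Hypothetical sample lines in combined log format. Googlebot Smartphone
# includes "Mobile Safari" in its user-agent string; desktop Googlebot does not.
SAMPLE_LOG = [
    '66.249.66.1 - - [10/May/2018:06:25:01 +0000] "GET /products/ HTTP/1.1" 200 5123 "-" '
    '"Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 '
    '(KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 '
    '(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.2 - - [10/May/2018:06:25:03 +0000] "GET /blog/ HTTP/1.1" 200 8456 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

def classify_googlebot(line):
    """Return 'smartphone', 'desktop', or None for a single log line."""
    if "Googlebot" not in line:
        return None
    return "smartphone" if "Mobile Safari" in line else "desktop"

counts = Counter(c for c in map(classify_googlebot, SAMPLE_LOG) if c)
print(dict(counts))  # {'smartphone': 1, 'desktop': 1}
```

Tracking that ratio over time is one way to spot when Google has shifted your site toward the mobile crawler.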
So what should we look for?
Inspect everything you would normally inspect in your site structure, but crawl with a mobile agent.
You can use Google’s Mobile-Friendly Test, the usability report in Search Console and Chrome DevTools to run various experiments. Don’t forget to test the JS functionality of your mobile site as standard.
Compare the results of a crawl from desktop to mobile, even if you have a responsive site. There can be more changes from one device to another than you realise.
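One concrete way to run that comparison – a sketch using only the standard library, with two hypothetical HTML fragments standing in for real desktop and mobile crawl output – is to diff the sets of internal links each version exposes:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(value)

def extract_links(html):
    parser = LinkCollector()
    parser.feed(html)
    return parser.links

# Hypothetical example: the mobile template drops a footer link.
desktop_html = '<a href="/sale">Sale</a><a href="/sitemap">Sitemap</a>'
mobile_html = '<a href="/sale">Sale</a>'

missing_on_mobile = extract_links(desktop_html) - extract_links(mobile_html)
print(missing_on_mobile)  # {'/sitemap'}
```

Links that exist only in the desktop version are links the mobile crawler will never follow – exactly the kind of structural difference this check is meant to surface.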
- Structural KPIs e.g. load times, depth, internal PageRank, duplicate metadata, structured data etc.
- HTML is the display language: it tells the browser what to display. JS is the behaviour language: it tells the browser how to respond to behaviour on a web page.
Remember, you can plan for the mobile shift. It’s not completely out of your control.
- Prune and assess your content.
- Make sure internal redirects are clean.
- Avoid inconsistent signals, e.g. audit your canonical tags.
- Use robots.txt effectively.
- Improve site structure.
- Watch out for 304 (Not Modified) status codes.
- Use the 410 (Gone) status code to tell Google when content is permanently removed.
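The 404-versus-410 distinction in the list above can be sketched as simple routing logic. This is an illustrative snippet, not any particular framework’s API; `GONE_PATHS` and `status_for` are hypothetical names:

```python
from http import HTTPStatus

# Hypothetical set of URL paths whose content has been deliberately removed.
GONE_PATHS = {"/discontinued-product", "/old-campaign"}

def status_for(path, page_exists):
    """Choose a response status: 410 for removed content, 404 otherwise missing."""
    if path in GONE_PATHS:
        return HTTPStatus.GONE        # 410: tells Google the content is gone for good
    if not page_exists:
        return HTTPStatus.NOT_FOUND   # 404: Google may keep re-crawling the URL
    return HTTPStatus.OK

print(status_for("/discontinued-product", False))  # HTTPStatus.GONE
```

The point of the 410 is crawl-budget efficiency: a 404 leaves Google guessing and re-checking, while a 410 gives it permission to stop spending crawls on that URL.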
Identify how different parts of your site are performing – not every subfolder is going to perform the same. Look where Google is spending a lot of time vs where you’re getting a lot of traffic. You want to match your high traffic areas to the areas that Google is crawling a lot. The key comparison is #Crawls vs #Visits.
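The #Crawls vs #Visits comparison can be computed per subfolder once you have aggregated counts from your logs and analytics. The numbers below are invented for illustration, and `crawl_to_visit_ratio` is a hypothetical helper:

```python
# Hypothetical per-subfolder counts: crawls from server logs, visits from analytics.
crawls = {"/products/": 52000, "/blog/": 31000, "/archive/": 40000}
visits = {"/products/": 18000, "/blog/": 9500, "/archive/": 400}

def crawl_to_visit_ratio(crawls, visits):
    """Flag subfolders where crawl attention far exceeds the traffic they earn."""
    return {
        folder: round(crawls[folder] / max(visits.get(folder, 0), 1), 1)
        for folder in crawls
    }

print(crawl_to_visit_ratio(crawls, visits))
# In this made-up data, /archive/ soaks up heavy crawling for almost no
# traffic, making it a candidate for pruning or robots.txt rules.
```

A high ratio is not automatically bad, but a subfolder that dominates crawl activity while contributing little traffic is where crawl budget is most obviously being wasted.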
Pruning content as it becomes unnecessary means less work for the crawler and more focus on priority pages. Managing your site well can significantly increase the share of current content that Google’s crawlers are looking at.
The end result should be traffic growth!