Sam, with the help of the community, starts his talk by first defining what an enterprise is: a combination of an organisation's large size and the length of its decision-making chain. Analysts often define enterprises simply as businesses with a large number of pages, but in reality, when we look at examples like British Gas and Lloyds Bank, they can sometimes have only around 3,000 indexable pages shared across their whole portfolio of digital properties.
Reflecting on his experience at Deep Crawl, he states that there are four main challenges commonly found across enterprises:
1. Size and scale
2. Auditing and prioritisation
3. Reporting and monitoring
4. Velocity of completion
Size & Scale – Issue amplification
Even for a website of 3,000 pages, the volume of issues surfaced can quickly become unfocused. Analysing every page is resource intensive, and it's not time efficient to extract issues for an entire website. He suggests that segmenting and slicing a website for technical analysis can help keep your work focused and manageable.
How do you create a tactical crawling roadmap?
1. Limit the crawl
First, you must identify the key area of the website you want to focus on and limit the crawl to just this sub-folder or sub-domain.
2. Sampling & slicing with maximum analysis / low resource
Following the completion of the crawl, you must then slice the data to create a sample that maximises analysis at a low resource cost.
3. Benchmark analysis
Finally, you need to benchmark the KPIs related to the issues you’ve identified in a way that can be easily communicated to stakeholders.
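The three steps above can be sketched in a few lines of Python. This is a minimal, illustrative example only: the sample crawl data, URL scheme, and KPI choices (status codes and missing titles) are my own assumptions, not part of Sam's talk or any particular crawler's export format.

```python
import random

# Hypothetical crawl export: (url, status_code, has_title) tuples — illustrative only.
crawl = [
    ("https://example.com/blog/post-1", 200, True),
    ("https://example.com/blog/post-2", 200, False),
    ("https://example.com/shop/item-1", 200, True),
    ("https://example.com/blog/post-3", 404, True),
]

def limit_crawl(pages, subfolder):
    """Step 1: restrict analysis to one key section of the site."""
    return [p for p in pages if subfolder in p[0]]

def sample_pages(pages, size, seed=42):
    """Step 2: slice a manageable sample to keep resource cost low."""
    random.seed(seed)
    return random.sample(pages, min(size, len(pages)))

def benchmark(pages):
    """Step 3: KPIs that are easy to communicate to stakeholders."""
    total = len(pages)
    return {
        "pages": total,
        "pct_200": round(100 * sum(1 for p in pages if p[1] == 200) / total, 1),
        "pct_missing_title": round(100 * sum(1 for p in pages if not p[2]) / total, 1),
    }

blog = limit_crawl(crawl, "/blog/")
print(benchmark(sample_pages(blog, 3)))
```

In practice you would feed this from a real crawler export rather than a hard-coded list, but the shape of the workflow (filter, sample, summarise) is the same.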
It was quite reassuring to hear that the structure Deep Crawl use for their technical audits is in line with how we work here at Impression: detailed analysis of the issues forms the core of the document, followed by recommendations on how to remedy the affected areas, and it is all rounded off with a brief summary of each action point at the end, noting who is responsible for each activity for quick reference.
What are the most common issues enterprise level websites suffer from?
1. The proportion of primary pages – a lot of pages aren't crawled and indexed (high priority)
2. Rendered crawls – e.g. iframe breaking the head
3. Crawl depth – Exceeding 20 steps
4. International SEO – Hreflang implementation
Sam uses the example of a technology business with extensive international pages. The enterprise had created language-specific pages at such a scale that some targeted an audience as small as 200 people. In a world where crawl budget can limit the crawlability and indexability of large websites, it poses the question of whether it is worth the resource.
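A common failure with hreflang at this scale is incomplete reciprocity: every localised page must annotate all of its alternates (including itself) plus an x-default. Below is a minimal sketch of generating those annotations; the domain, URL structure, and locale codes are hypothetical assumptions for illustration, not taken from the talk.

```python
def hreflang_tags(path, locales, default="en"):
    """Build the reciprocal <link> annotations a localised page should carry.

    Every locale page must list *all* alternates plus an x-default —
    a common failure point on large international sites.
    The domain and /{locale}/ URL structure here are illustrative assumptions.
    """
    base = "https://example.com"  # hypothetical domain
    tags = [
        f'<link rel="alternate" hreflang="{loc}" href="{base}/{loc}{path}" />'
        for loc in locales
    ]
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{base}/{default}{path}" />'
    )
    return tags

for tag in hreflang_tags("/pricing/", ["en", "de", "fr"]):
    print(tag)
```

Generating the tags from one source of truth like this, rather than hand-editing each locale's templates, is one way to keep the annotations reciprocal as the number of languages grows.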
Reporting & monitoring
There are, of course, examples of enterprises with large numbers of digital properties. Sam uses the example of a publisher that Deep Crawl helped to consolidate over 300K subdomains, which, I'd agree, without proper planning could be a logistical nightmare.
With a page count of this size, a site migration can be risky, so a robust reporting setup that can benchmark KPIs is a necessity. The communication method that Sam and Deep Crawl have found works best for enterprises is to create a view for each level of the organisation. The reporting requirements of a developer are completely different from those of a board of directors, so why would you consolidate all this information into one view?
For developers, he has written a blog post that can help you to visualise Google Search Console issues to communicate indexation issues effectively.
Some additional examples of dashboards that can work well for enterprises can be found here:
1. Keyword cannibalisation
2. Speed metrics dashboards
Velocity of completion
With enterprises comes a long chain of decision-makers, which can hinder the timeliness of your implementation. He advises that to combat this you must be on top form with your soft skills, and to get real buy-in from stakeholders you must:
– Build a strong internal and external network of people
– Be a hub of influence between these departments
– Prove ROI to score budget and resource
– Lead with revenue KPIs and not SEO KPIs
Overall, I thought the session was well structured and full of actionable tips and ideas for getting things done in an environment that can often come with a lot of red tape. I've definitely taken away some ideas on how I can improve the experience for my own clients.
This post is one of 12 in our Search Leeds 2019 collection
- Search Leeds: Sophie Coley – Search Listening: How and why we should be using Google data way beyond traditional keyword research
- Search Leeds: Jon Greenhalgh – A.I. in Paid Media: strategies you can use tomorrow
- Search Leeds: Stephen Kenwright – Content marketing frameworks that will get you paid more
- Search Leeds: Women in Digital – Confidence in public speaking and the workplace
- Search Leeds: Women in Digital – How to Maintain a Work Life Balance
- Search Leeds: Britney Muller – Machine learning for SEOs
- Search Leeds: Sam Marsden – Overcoming Technical SEO challenges for enterprise sites
- Search Leeds: Emily Potter – Featured snippets – the achievable SERP feature
- Search Leeds: Rory Truesdale – Intent optimisation – why it matters and how it can improve your SEO results
- Search Leeds: Bastian Grimm – Why most SEO audits are sh*t
- Search Leeds: Kirsty Hulse – How science can help you have better ideas