13.03.2018

Engineering your website architecture for great search engine optimisation (SEO)

This article was updated on: 07.02.2022

Being both a web developer and SEO consultant, technical SEO is a real passion of mine. Oftentimes, having a great website architecture can be a make or break situation for your search engine visibility. Whether you’re building a website from scratch or reassessing the existing structure of an enterprise site, the theory is the same: keep your website’s architecture as simple and organised as possible.

The impact of a great site architecture

Before getting started on any on-site optimisation process, it’s important to understand how a search engine works and to understand Google’s mission:

To organize the world’s information and make it universally accessible and useful.

And in order to complete its mission, Google must be able to understand how to navigate around a website and, more importantly, which pages contain the most important information. It does this through something not too far removed from its proprietary algorithm dating back to the mid-1990s: PageRank.

Many organisations have since sought to emulate this, and Moz was perhaps the biggest early contributor in this field with its introduction of Page Authority and Domain Authority. Whichever metric you use, however, the idea of “link juice” / “link equity” flowing from one page to another and aggregating there is common to them all, and it is the key factor to consider when striving to surface the best content within your site’s architecture. I’ll look at how this idea applies to site architecture in more detail later on.

What optimal site architecture looks like

Google likes short URLs, but it also likes a sense of order, so there’s no single right answer here — only guidelines. When structuring a website, we try to keep the pages quite flat and don’t bury them in folder URLs such as /site/, /pages/ or /category/, but if a set of pages shares a common theme, we will generally group them into a virtual folder/directory. You can see this with our digital marketing services, such as SEO (/digital-marketing/seo/) and PPC (/digital-marketing/ppc/).

With most modern content management systems, like WordPress, adding breadcrumbs is simple. Navigation breadcrumbs and URL patterns should match up closely, which is the case if you use Yoast’s Breadcrumbs; these are also great for improving internal linking — more on this later! As a result, we’re happy having our main service pages one level deep, under our digital marketing services page.
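
If you want to see what breadcrumbs look like to a search engine, the sketch below builds schema.org BreadcrumbList markup (the structured data format Google reads) for a nested URL. Plugins like Yoast output equivalent markup for you, so treat this purely as an illustration; the domain and page names are made up.

```python
import json
from urllib.parse import urljoin

def breadcrumb_jsonld(base_url, path_segments):
    """Build schema.org BreadcrumbList JSON-LD for a nested URL.

    path_segments is an ordered list of (name, slug) tuples, e.g.
    [("Digital Marketing", "digital-marketing"), ("SEO", "seo")].
    """
    items = []
    current = base_url.rstrip("/") + "/"
    for position, (name, slug) in enumerate(path_segments, start=1):
        current = urljoin(current, slug + "/")
        items.append({
            "@type": "ListItem",
            "position": position,
            "name": name,
            "item": current,
        })
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": items,
    }, indent=2)

# Hypothetical example mirroring the /digital-marketing/seo/ structure above
print(breadcrumb_jsonld("https://www.example.com",
                        [("Digital Marketing", "digital-marketing"), ("SEO", "seo")]))
```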

Consider, though, a website with a more complex, multi-level architecture.

In a structure like that, the URLs for the important pages — the products — would be buried in at least three virtual directories which add nothing towards the ranking benefit of the page. It’s been shown that having relevant keywords in the path name and page slug/URL is beneficial to ranking the page for a given keyword, but when a website already has, or has the potential to have, an unnecessarily long URL, this should be seen as a search optimisation opportunity.

At Impression, we’re big fans of Writemaps for drawing sitemaps, but there are plenty of other free tools out there such as Google Drawings. As Joost at Yoast points out in his article on the same topic, it’s always a good idea to visualise the structure to ensure the site feels balanced.

Through drawing out a proposed site architecture (and therefore URL structure), you’ll notice any imbalance or pages that might be too deep, allowing you to correct the structure before you start developing the site.
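
If your proposed URLs are already in a spreadsheet or text file, a few lines of code can flag anything that looks buried too deep before any development starts. This is a rough sketch rather than a tool we use; the URL list and the depth threshold are entirely hypothetical.

```python
from urllib.parse import urlparse

# Hypothetical list of proposed URLs from a sitemap planning exercise
proposed_urls = [
    "https://www.example.com/digital-marketing/seo/",
    "https://www.example.com/digital-marketing/ppc/",
    "https://www.example.com/site/pages/category/widgets/blue-widget/",
]

MAX_DEPTH = 3  # arbitrary threshold; adjust to taste

for url in proposed_urls:
    # Depth = number of non-empty path segments in the URL
    depth = len([seg for seg in urlparse(url).path.split("/") if seg])
    if depth > MAX_DEPTH:
        print(f"Too deep ({depth} levels): {url}")
```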

In fact, whether or not you anticipate your site’s structure being complex, creating a sitemap in advance of a new site development or migration is always best practice. Not only does it allow you to identify potential issues with page depth and site imbalance, it also allows you to visualise the grouping of your content to ensure that it is logical and accessible for both search engines and users.

If you’re working on a project where you need sign off from multiple people, a sitemap will also help to ensure that everyone is in agreement on how the new site should be structured before too much time is invested in development.

Logical content grouping

What constitutes the logical grouping mentioned in the last section? Simply put, a site’s architecture should group URLs that have similar or related content. The most obvious example is grouping products within the same category together, e.g.:

  • /category-a/
    • /category-a/product-1/
    • /category-a/product-2/

This is a better alternative to a less clear structure, which could look like /cat/category-a/ and /prod/product-1/ and would fail to communicate the relationship between the category and its products. Unfortunately (from an SEO point of view), some content management systems (CMS) use the less clear structure as their default.

My preferred type of structure can also be used for services and related pages on a service-based lead generation site, with ‘child’ services nested beneath the main service, for example:

  • /business-law/
    • /business-law/business-fraud/
    • /business-law/trading-standards/

A good site structure doesn’t mean that different types of pages have to be separated. As we’ll see in the next section, evidence points to there being significant value to including informational pages within the same folders as service or category pages.

Content silos

Content silos are areas within the site structure dedicated to a particular topic that can include different types of pages, including product, service and informational pages. They’re an alternative to keeping all informational content separate in a blog, allowing search engines and users to find information on certain products and services more easily.

Siloed content could include specific FAQs or guide pages that provide more information than an area’s category content and product descriptions. We’ve seen evidence of this approach significantly improving keyword rankings.

A legal client of ours saw rankings for the primary keyword targeted by one of their key service areas improve from around page eight to hovering around pages one and two, simply by adding informational content to the same folder.

Why are silos so effective? Impression’s Edd Wilson is currently working on content silos for some of our larger ecommerce clients. “I believe sub-pages situated within a silo highlight you as an authority on that topic to Google,” he says. “As long as the silo is search engine friendly, a robot will be able to crawl the main landing page and identify all the content you have supporting it.

“Silos help you to keep your website structure clean and produce the ideal sitemap. They keep your topics perfectly categorised for Google and allow you to produce the pyramid structure that the Impression site is built on. I’ve definitely started to notice a trend where the highest ranking pages come from content silos.”

However, Edd also warns that you have to ensure that the keyword targeting for informational pages in the silo is different enough from the commercial pages that they don’t compete for the same keyword rankings, a situation that often leads to none of the competing pages ranking as well as you’d like. A sitemap makes it easy to map out your keyword targeting structure and make sure that significant overlaps are avoided. If you don’t plan in advance, you could end up with conflict that’s tricky to resolve retrospectively.
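
One simple way to catch that kind of overlap at the planning stage is to keep a page-to-primary-keyword map alongside the sitemap and check it for clashes. A minimal sketch, with hypothetical URLs and keywords:

```python
from collections import defaultdict

# Hypothetical keyword targeting map drawn up alongside the sitemap
targeting = {
    "/business-law/": "business law solicitors",
    "/business-law/business-fraud/": "business fraud solicitor",
    "/business-law/guides/what-is-business-fraud/": "business fraud solicitor",  # clash!
}

pages_by_keyword = defaultdict(list)
for url, keyword in targeting.items():
    pages_by_keyword[keyword].append(url)

for keyword, urls in pages_by_keyword.items():
    if len(urls) > 1:
        print(f"Potential cannibalisation on '{keyword}': {', '.join(urls)}")
```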

Architecture for informational content

Whether you’re siloing all of your content, just some of it, or you’re maintaining a regular blog (which is definitely a viable option, especially if you expect to be writing extensively about a wide range of topics), it’s valuable to have a few pages of content that you consider to be ‘evergreen’ (also called ‘cornerstone,’ ‘10X,’ and a variety of other terms).

Unlike other informational content, especially blog posts, these pages will be updated continuously to ensure that they’re always relevant. One way of thinking about them is that they should contain information that you want every potential customer to read. They should be valuable resources related to your products and services that will have long-term appeal and the ability to gain links over time.

Crucially, evergreen content should be easily accessible within your site structure, whether it’s linked to from the main nav or siloed with related categories or services. It should also be internally linked to heavily, with links coming in from every relevant commercial and informational page. This will help users to find their way there and signal to search engines that these pages are important.

For a site structure to be effectively crawled by search engines and usable for humans, it needs to have a strong internal linking structure. Internal links fundamentally allow search engines to crawl the entirety of the site; a page with no links pointing to it might as well not exist. They also highlight the relationships between different pages, strengthening the connections created by a logical folder structure.

In an article for Search Engine Land, columnist Paul Shapiro discussed the concept of Internal PageRank. This metric is based on PageRank, a well-known core part of Google’s algorithm that scores pages based on the number and quality of inbound links from other sites. Internal PageRank does a similar thing, but instead scores pages by the number and type of inlinks from other pages within the same site. Its aim is to provide a metric for showing how important Google might consider different pages to be relative to the other pages on your site.

Shapiro also describes a process for working out the Internal PageRank of your site using the SEO crawler Screaming Frog and the statistical programming language R. Of course, this will only work on a site that can be crawled, so there’ll be no way to determine Internal PageRank if you’re still at the sitemap stage. However, you could work out the Internal PageRank for your site before it goes live if it’s set up on a staging environment.
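
If R isn’t your thing, the same calculation can be sketched in Python using the networkx library, assuming you’ve exported an edge list of internal links (source URL, destination URL) from your crawler. The file name and column headings below are assumptions; adapt them to whatever your crawl export actually contains.

```python
import csv
import networkx as nx

# Build a directed graph from a crawler export of internal links.
# Assumes a CSV with 'Source' and 'Destination' columns (adjust to your export).
graph = nx.DiGraph()
with open("internal_links.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        graph.add_edge(row["Source"], row["Destination"])

# PageRank over internal links only: a rough proxy for how link equity
# is distributed across your own pages.
scores = nx.pagerank(graph, alpha=0.85)

# Show the twenty most "important" pages by internal link equity
for url, score in sorted(scores.items(), key=lambda item: item[1], reverse=True)[:20]:
    print(f"{score:.4f}  {url}")
```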

If you don’t have the time or resources to dive into R (or Python) and Screaming Frog, you can check the number of internal links pointing to your pages with Google Search Console, where the internal links report can be found in the main menu.

Bear in mind that the number of internal links doesn’t necessarily line up with Internal PageRank, as it doesn’t take into account the type and context of those links, but it can still be a good indication of which pages might be considered more important. If a page you don’t consider a priority is attracting more internal links than your key pages, it’s a sign that your internal linking might be skewed.

Sensible URLs and naming conventions

A good site structure on its own is not enough to fully optimise a site’s architecture for search engines and users, even if it’s supported with well-planned internal links. Alongside these structural considerations, your site architecture also needs to be made up of URLs with readable, targeted names.

Keyword targeting

When you’re planning your site architecture, it’s important to do so with the foundation of up-to-date keyword research. Using tools like Ahrefs, AdWords Keyword Planner and SEMrush, you can work out what people are searching for in relation to any page you create, whether commercial or informational in nature.

Several elements of a page should contain the primary keywords that you want it to rank for, including the copy and the metadata, but it can also be helpful to include the main keyword in the URL. If possible, you should also avoid strings of numbers in a key page’s URL, as this waters down the topical signals you’re sending Google. A keyword-optimised URL is not as powerful a ranking signal as it once was, but it’s still better to include the keyword than not.
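
To illustrate the point, a slug helper might strip out numbers and filler words so that the primary keyword is what remains. This is a sketch of the general idea, not a recommendation of any particular implementation, and the stop-word list is deliberately minimal.

```python
import re

STOP_WORDS = {"a", "an", "and", "the", "of", "for", "to", "in"}

def make_slug(title: str) -> str:
    """Turn a page title into a short, keyword-focused slug."""
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    keep = [w for w in words if w not in STOP_WORDS and not w.isdigit()]
    return "-".join(keep)

# Hypothetical example
print(make_slug("10 of the Best Parka Coats for Winter 2018"))
# -> "best-parka-coats-winter"
```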

User experience

Well optimised URLs can also make your website easier for humans to navigate. The inclusion of the primary keyword in the URL’s string is one important element, helping to communicate the key topic of the page, but a logical folder structure, as described above, is also helpful.

If your folder structure is descriptive (e.g. /coats/parkas/ rather than /category/parkas/), it helps users see which area of the website they’re in and what else there is to explore. In the example given there, the fact that ‘coats’ is present in the URL structure may encourage a user to explore more of your products than if they only focused on parkas.

As with the SEO benefits, the UX benefits of well-optimised URLs aren’t overly impactful, but when you’re designing your site architecture from scratch it makes sense to get all the small details right first time.

Other factors for consideration

There are a few more things to take into account if you want your site architecture to be as well-optimised as possible. Depending on the size and purpose of your site, you might not need to think too much about all of the following, but they’re important to get right for the sites that need them and worth bearing in mind as you plan out your site or add more pages in the future.

Optimising poor performing pages

If you’re redesigning the architecture for an existing site rather than creating a new site from scratch, it’s likely that there’ll be pages that aren’t pulling their weight. These pages could be ‘orphaned’ through a fault in the implementation of the current site architecture, meaning that they don’t have any links pointing to them, or they might simply be getting negligible amounts of traffic and conversions.
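
A quick way to surface orphaned pages is to compare the URLs listed in your XML sitemap against the URLs a crawler actually reached by following internal links. A rough sketch, assuming you already have both lists exported as plain text files (the file names here are made up):

```python
# Compare the URLs you want indexed with the URLs reachable via internal links.
# Assumes one URL per line in each file; adapt to your own exports.
with open("sitemap_urls.txt", encoding="utf-8") as f:
    sitemap_urls = {line.strip() for line in f if line.strip()}

with open("crawled_urls.txt", encoding="utf-8") as f:
    crawled_urls = {line.strip() for line in f if line.strip()}

orphans = sitemap_urls - crawled_urls
for url in sorted(orphans):
    print(f"Orphaned (in sitemap, no internal links found): {url}")
```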

Whether orphaned or simply underperforming, these pages are unlikely to be providing any value to your site. Instead, they’ll just be increasing the number of pages that search engines have to crawl, which could detract from your more important landing pages. To prevent problems occurring in the future, consider whether or not these pages actually need to exist in your new site architecture, or whether any important information could be moved elsewhere.

If you can get rid of them completely, make sure that you redirect the old URL to a better, more useful page. This is better than letting the old URL 404, just in case there are any internal or external links pointing to it. If you decide to keep the page, its poor stats could be a sign that its content needs refreshing or it needs to be placed somewhere else in the site architecture to draw visitors’ attention.
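
When you’re mapping those redirects, it’s also worth checking that you haven’t created redirect chains, where an old URL points at another URL that is itself redirected. A tiny sketch of that check, using a made-up redirect map:

```python
# Hypothetical redirect map drawn up during a restructure: old path -> new path
redirects = {
    "/pages/services/seo.html": "/digital-marketing/seo/",
    "/seo.html": "/pages/services/seo.html",  # chain: points at another redirected URL
}

for old, new in redirects.items():
    if new in redirects:
        final = redirects[new]
        print(f"Redirect chain: {old} -> {new} -> {final}; "
              f"point {old} straight at {final} instead.")
```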

Eliminating duplicate content

Duplicate content on a site, where content appears on more than one URL, is bad for SEO. Google’s Panda algorithm devalues duplicate content, which means that none of the pages it appears on will perform as well as they could organically.

The perfect time to make sure that your site has no duplicate issues is when you’re planning your site architecture. You can make sure that every page you’re planning has a clear, unique function. If two pages look like they’re overlapping, conducting some keyword research is a great way to check if there’s a worthwhile way to make a distinction between the pages. If you can’t find unique keywords for each of them, it’s probably better to cut the less important page.

Duplicate content can also occur on ecommerce sites where products are featured on multiple category pages, or two or more category pages contain similar products. In these instances, planning out your site architecture will help you to determine which pages are more important and what the most logical structure for the site is. If you see that a product will need to be featured in multiple categories, you can identify it now to make sure that the site is built in such a way that the page isn’t duplicated.
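
One common way to handle a product that legitimately sits in more than one category is to choose a single primary category and point the alternative URLs at it with a canonical tag. The sketch below shows that decision logic with made-up data; how the tag is actually rendered will depend on your CMS.

```python
# Hypothetical product-to-categories mapping; the first category is treated as primary.
products = {
    "waxed-jacket": ["coats", "new-in"],
    "parka": ["coats"],
}

def canonical_url(base, product, categories):
    """All category URLs for a product share one canonical: the primary category path."""
    primary = categories[0]
    return f"{base}/{primary}/{product}/"

base = "https://www.example.com"
for product, categories in products.items():
    canonical = canonical_url(base, product, categories)
    for category in categories:
        page_url = f"{base}/{category}/{product}/"
        print(f'{page_url} -> <link rel="canonical" href="{canonical}">')
```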

User-friendly conversion path

As well as helping your site to rank well, good site architecture should also create a smooth conversion path to take the user from their first landing page through to conversion. A good conversion path could look different depending on what you want your visitors to do, so you should be clear on this from the beginning of the planning process.

Primarily, it must always be easy for users to navigate to your core category and service pages. If there are important pages nested beneath them, as is usually the case with products on an ecommerce website, these also need to be clearly signposted. If visitors can’t find what they’re looking for quickly, it’s unlikely they’ll stick around to dig through pages of irrelevant content.

Every page needs to fit into some kind of conversion path, even if it’s not a product or service page. Blog posts can link to your evergreen content and any relevant categories/services to make sure that customers who are still researching have access to the best information you can offer.

In addition, it should always be easy for users to click on a contact button or to visit your product pages. At this point, we’re straying into the territory of web design rather than site architecture, but it’s always important to think about.

Lessons learned from years of best-practice SEO experience

Impression’s SEO consultants have experience working with clients with a wide range of requirements in all kinds of industries. We’ve seen and planned all sorts of different site architectures, which has led to a variety of insights that we can pass on to you.

Laura: “The ideal site structure is one that empowers the user to move easily through the content they need, and which supports them in completing the goals we as businesses want them to complete. I’ve implemented this through the Impression site and a number of client sites, using the conversion funnel as the framework for page development and implementing CRO techniques to drive that journey. SEO today is much more about user journey optimisation and has to be aligned with overarching business goals if it is to make a real difference to the business’ bottom line.”

Edd: “I’ve seen a huge internal linking benefit from content silos. Calls to action (CTAs) and content areas naturally link to the top-level page and you’ll be able to use a wide range of anchor text across the sub-folder – varied anchor text helps your top-level page to rank for keywords that are related to the primary keyword. Implementing breadcrumbs also helps your internal linking and is recommended by Google.”


Sean: “Having worked on a few sites with some messy structures, I’d always recommend having a content strategy in mind that’s backed up by keyword research, otherwise you run the risk of cannibalisation.

“I’ve worked on sites with great informational pages produced with all the best intentions that were cannibalising the main product pages – this can be avoided by working out in advance what each page should be targeting, whether it’s a product page, a guide or something else.”

Chloe: “You should ensure that blogs are not competing with subcategories and that subcategories aren’t competing with main categories. When a site has been built without targeting in mind, you can see it suffering from fluctuating rankings, with different pages dropping in and out for the same keywords.

“I’ve personally seen this on fast fashion sites where blog posts are basically reviews or descriptions of ‘hot’ products and either outrank the product page or cause both to fluctuate for keywords where the search intent is definitely to buy a product.”

Pete: “It’s worth mentioning as a caveat the implications that changing an existing site’s structure can have on your rankings. While a proper architecture will be better in the long run, you’re likely to see decreases in rankings and traffic for a few months after implementing major changes.

“Ideally, you should roll out a new site architecture in a phased approach to isolate any particular issues that need fixing before moving to the next stage. If you can, you should start out with lower-risk areas of your site before changing the higher-performing pages, allowing those crucial transitions to be as seamless as possible.”