In the modern SEO landscape, practitioners are almost powerless without their tools and the data they provide. Experience is still essential, but we need the data first in order to apply that experience. Only then can we properly interpret shortfalls in the more technical, data-driven optimisations.

Nowhere are SEO tools more rife than in the technical SEO camp. From cloud crawling tools such as DeepCrawl, OnCrawl and Botify to local solutions like Screaming Frog and Sitebulb, it can be an expensive space to operate in.

What’s more, the deeper you dive, the more of a requirement tools become. Log file analysis, for example, can be achieved with a simple Google Sheet and other free solutions, but it’s far easier to interpret server logs using solutions like those mentioned above and more specialist tools like Logz.io, Loggly and Splunk (to name but a few).

The tricky part comes next. Once you have obtained access to these tools, they largely work in silos and are disjointed from the remainder of your workflow. That means tying your efforts back to your core performance-based KPIs can be difficult and take some navigating.

In addition, the tools I noted above typically come with a high barrier to entry, meaning other stakeholders struggle to visualise your efforts in the technical SEO space. This can make some conversations difficult, especially when it comes to signing off budgets and communicating the value of your work.

That’s where Google Data Studio (GDS) can help. With a large library of custom data connectors, you can visualise almost any data you wish to complement another data set and tell a story. At Impression, we typically refer to these datasets as “owned goals” and “shared goals” (first coined by Laura Hampton in a digital PR context, but the methodology can also be applied to a technical SEO-driven search strategy).

Here, “owned goals” refer to the metrics we can directly improve, e.g. crawling and indexing optimisations, while “shared goals” refer to those we share with in-house teams and other SEOs to help influence, i.e. improvements in visibility, traffic acquisition and revenue. With Google Data Studio, we can capture both owned and shared goals and start connecting the dots.

Many data connectors already exist that allow you to include technical SEO data in your reports, but after researching online, I found few (if any) templates that pull this all together. With that in mind, we created a technical SEO dashboard that’s now available for download.

What does the technical SEO dashboard report on?

Technical SEO metrics

Though a work in progress, the dashboard currently captures:

  • Automated data from DeepCrawl, including:
    • Crawling considerations
      • Indexable URLs
      • Non-indexable URLs via:
        • Non-200 status codes: 3XXs, 4XXs, 5XXs
        • Robot Directives: canonicals, meta robots noindex/nofollow tags and disallows via robots.txt
      • Crawl optimisations available through your XML sitemap(s)
    • Indexing considerations
      • Thin URLs
      • Duplicate URLs
      • Orphaned URLs
  • Automated Google Analytics data, including:
    • Page load times
    • Server response times
    • Domain lookup times
      * Available across various dimensions, e.g. site-wide, page, device and user level.
  • Automated Chrome User Experience data, including:
    • First paint
    • First contentful paint
    • DOM content loaded
    • First input delay
      * Using the origin URL specified when setting up the data connector.
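To give a feel for the crawling buckets above, here’s a minimal sketch of how crawl results might be classified into the dashboard’s indexability categories. The field names (`status_code`, `disallowed`, `meta_robots`, `canonical`) are illustrative assumptions, not DeepCrawl’s actual export schema:

```python
# Hypothetical sketch: bucket crawled URLs into the dashboard's
# indexability categories. Field names are assumptions, not
# DeepCrawl's actual export schema.

def classify_url(page):
    """Return the indexability bucket for a single crawled page."""
    status = page.get("status_code", 200)
    if 300 <= status < 400:
        return "non-indexable: 3XX"
    if 400 <= status < 500:
        return "non-indexable: 4XX"
    if status >= 500:
        return "non-indexable: 5XX"
    if page.get("disallowed"):
        return "non-indexable: robots.txt disallow"
    if "noindex" in page.get("meta_robots", ""):
        return "non-indexable: meta robots noindex"
    canonical = page.get("canonical")
    if canonical and canonical != page["url"]:
        return "non-indexable: canonicalised"
    return "indexable"

# A toy crawl to illustrate the buckets
crawl = [
    {"url": "https://example.com/", "status_code": 200},
    {"url": "https://example.com/old", "status_code": 301},
    {"url": "https://example.com/a", "status_code": 200,
     "canonical": "https://example.com/"},
]
buckets = [classify_url(p) for p in crawl]
```

In practice a crawler applies these rules in a similar priority order: a non-200 status code makes the other signals moot, which is why it’s checked first here.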

While the idea is to build on this over time, we consider these to be integral to a technical SEO’s “owned goals” when related to a campaign’s overall progress and success. By adding this dashboard into your current GDS reports, you will be able to supplement the KPIs related to your “shared goals”.

Quarterly progress

The dashboard is only intended to report on quarterly progress. Granted, data over a longer period of time is always more valuable, but with so many individual reports present, we were limited by space. Three months was the trade-off: enough reports to be useful, while showing enough of a date range to still be interpreted in a user-friendly way.

The dashboard is set to report on the previous quarter and should update automatically in the month following each completed quarter, i.e. April for Q1, July for Q2, and so on.
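If you ever need to replicate this “previous completed quarter” window outside GDS (for example, when pulling the same data via a script), it can be computed like so. This is a minimal sketch and not part of the dashboard itself:

```python
from datetime import date, timedelta

def previous_quarter(today):
    """Return the (start, end) dates of the most recently completed quarter."""
    q = (today.month - 1) // 3  # 0-based index of the current quarter
    if q == 0:
        # January-March: the previous completed quarter is Q4 of last year
        return date(today.year - 1, 10, 1), date(today.year - 1, 12, 31)
    start = date(today.year, 3 * (q - 1) + 1, 1)
    # Last day of the previous quarter = day before the current quarter starts
    end = date(today.year, 3 * q + 1, 1) - timedelta(days=1)
    return start, end
```

For example, `previous_quarter(date(2020, 4, 10))` returns 1 January to 31 March 2020, matching the “April for Q1” behaviour described above.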

Getting started

The dashboard’s current iteration only uses three sample data connectors. These are:

  1. The Google Sheets connector
  2. The Google Analytics connector
  3. The Chrome UX Report connector

Before making a copy of the dashboard and getting stuck in, we recommend following these steps to ensure you successfully add your data to the connectors before adding them into the template:

  1. Pull in automated DeepCrawl data to a Google Sheet
    Feeding page 1 of the dashboard, the easiest way to retrieve automated DeepCrawl data is via Zapier, using the methodology Adam Gordon details here. This is still API-driven at its core, but since it leverages Zapier to feed the data from DeepCrawl to Google Sheets, it’s simple to deploy – even for less experienced technical SEOs.
    With Adam’s first point, “Setup the Google Spreadsheet”, all you need to consider is formatting your Google Sheet similar to how we structured ours to pull in the data like-for-like: https://docs.google.com/spreadsheets/d/15nhbIrdxeAFBRAEOB9iB75NIfo2GK0Nk3-HGwmj1Lzw/edit?usp=sharing
    The reports from DeepCrawl then need to correspond to your setup when you configure the trigger via Zapier, i.e. Finished At, Project Link, Primary Pages, Duplicate Pages etc. Ensure your crawls are repeated monthly to get continuous month-on-month data.
    The spreadsheet then needs to be pulled into GDS using Google’s proprietary Google Sheets connector.
  2. Create a Google Analytics Data Connector
    The Analytics data connector feeds the first half of the page speed metrics found on page 2 of the dashboard.
    This needs to correspond with the site you have analysed via DeepCrawl and CrUX, and you need at least “Read & Analyse” permissions to pull your analytics data through to GDS. Goes without saying!
  3. Create a CrUX Data Connector
    This connector feeds the last half of the page speed data found on page 2 of the dashboard.
    The documentation to create the entire CrUX dashboard is detailed here and, again, the exercise is super simple. To make things easier still, all you really need for the purposes of Impression’s technical SEO dashboard is the Chrome UX Report data connector.
    You can only choose one origin URL as part of the CrUX data connector; it can be either the homepage or a landing page that’s representative of your wider technical SEO campaign.
    Please note, historic data via this connector can take longer to become available.
  4. Et voilà!
    With your data connectors now ready, feel free to proceed with making a copy of the technical SEO dashboard for your personal use.
    Upon duplicating the template, just remember to set your data connectors accordingly so your site’s information is pulled in. You’ll be prompted to do so immediately upon making a copy of the dashboard (via Google Data Studio’s “Copy this report” option).
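As a rough illustration of step 1, the row Zapier appends to the Google Sheet could be shaped like this. The payload keys and column order here are assumptions based on the fields mentioned above (Finished At, Project Link, Primary Pages, Duplicate Pages), not Zapier’s or DeepCrawl’s actual schema:

```python
# Hypothetical sketch: map a DeepCrawl crawl summary onto the
# column order used by the Google Sheet. Keys and columns are
# assumptions, not an actual DeepCrawl/Zapier schema.

SHEET_COLUMNS = ["Finished At", "Project Link", "Primary Pages", "Duplicate Pages"]

def to_sheet_row(crawl_summary):
    """Return a list of cell values in the sheet's column order.

    Missing fields become empty cells so the columns stay aligned.
    """
    return [crawl_summary.get(col, "") for col in SHEET_COLUMNS]

row = to_sheet_row({
    "Finished At": "2020-03-31",
    "Project Link": "https://example.com",
    "Primary Pages": 1250,
    "Duplicate Pages": 34,
})
```

Keeping the column order in one place like this is the same discipline the Zapier trigger setup requires: the fields you map must correspond one-to-one with the sheet’s headers, or the GDS connector will pull misaligned data.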

Next steps

This dashboard is only really applicable to DeepCrawl users for now. However, over time, we can look to extend it with data feeds from other technical SEO crawlers like Botify and OnCrawl. In addition, there are still more useful reports available from DeepCrawl itself, such as those related to crawl depth and log file metrics, so we’re looking forward to building on the dashboard over time.

And that’s really our intention: for it to be a work in progress that becomes increasingly valuable. If you have any feedback on things you’d like to see in the technical SEO dashboard, feel free to share those with me via email or Twitter.

Petar Jovetic

Head of SEO

Petar is the Head of SEO at Impression and specialises in content, technical SEO and digital strategy. He's guaranteed to be the only guitar-wielding, digital marketing-talking, Montenegrin you know.
