We all want to increase traffic, but we don't always want to expand into other marketing channels or increase spend. Ad copy testing can be a great place to start.

Ad testing within campaigns is a task that often gets overlooked in audits, yet it can be one of the most effective and easiest ways to improve campaign performance. After all, the ad is the only part of a whole search campaign that a customer actually sees, so don't neglect it.

Reviewing your ad copy can also have powerful effects across your other marketing channels, such as email, social and SEO. By evaluating which ad copy works well for PPC, you can reuse that copy elsewhere and see whether it drives the same improvements in customer engagement. Consistent messaging across multiple touchpoints also helps, as all of the brand's marketing channels are aligned and will resonate with customers.

Best Practices

When running ad copy tests, first check your campaign settings and ensure that ad rotation is set to 'Do not optimise'. This gives each ad fair visibility and removes any bias from Google preferring certain ads over others; you can change this in either Google Ads Editor or the Google Ads interface. It matters because, when you come to review the copy, it will be difficult to spot any significant differences if Google has been favouring certain ads.

When reviewing the tests, the quickest and easiest way to evaluate which ads need a little TLC is to compare the CTR of the ads alongside the number of impressions and clicks each has received. This is best done over at least a three-month period to smooth out anomalies such as trends, traffic spikes and paydays.

Labels

Another function that can make ad copy testing easier is labels. These make reviewing comparable changes simpler and are great for segmenting different ad attributes across your ads and campaigns: for example, one label for generic copy, one for offer copy and one for price-led copy.

For more granular testing, it is also best to vary only one section of the ad copy at a time. For example, using the labels above, you would place the variations in headline 2 only, so that the rest of the ad copy stays the same and headline 2 is the only variable.

When you come to report on your different ad variations, you can filter your ads by label and clearly see which attribute or USP is driving the highest click-through rate (CTR) at a comparable level of impressions and clicks. This report is clearest when run within the Reports section of Google Ads.
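As a hypothetical sketch of the comparison this report gives you, the snippet below tallies CTR per label and ranks the labels, highest first. The label names and figures are illustrative only; in practice the numbers come from your Google Ads report filtered by label.

```python
# Illustrative sketch: rank ad labels by CTR.
# Labels and figures are invented for the example; real data comes
# from the Google Ads report filtered by label.
ads = [
    {"label": "generic copy", "impressions": 12000, "clicks": 300},
    {"label": "offer copy", "impressions": 11800, "clicks": 420},
    {"label": "price-led copy", "impressions": 12100, "clicks": 380},
]

def ctr(ad):
    """Click-through rate = clicks / impressions."""
    return ad["clicks"] / ad["impressions"]

# Rank labels by CTR, highest first.
for ad in sorted(ads, key=ctr, reverse=True):
    print(f'{ad["label"]}: CTR {ctr(ad):.2%}')
```

Because the three labels have a similar level of impressions, a straight CTR comparison like this is a fair first pass at spotting the winning attribute.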

If you then see significant differences in positive metrics (clicks, CTR, conversions etc.), you can keep the top-performing ad copy and move on to testing other parts of the ad, such as headline 3 or the descriptions, following the same process as above.

Something else to consider is when you have launched a new product or range. You may initially start with wording such as 'discover' or 'explore' to build traction and excitement, allowing customers to browse and gain inspiration. Then, once the new product or range is established, switch the inspirational wording to higher-intent wording such as 'shop now' or 'while stocks last' to create urgency. You can then review the copy from the point when traffic and sales really began to pick up: which performed better?

Quick Testing

The ad variation tool within Drafts & Experiments in Google Ads can be powerful but is not widely used. It lets you run experiment campaigns that don't compete with the other campaigns in your account, so spend will not increase, nor will your ads compete with each other.

This section of the platform lets you run tests across the whole account, specific campaigns, or individual ads that you want to improve.

If you want to trial a quick test within the headlines of an ad, or see which landing page converts better, you can simply use the 'find and replace' option. As with setting ad rotation to 'Do not optimise', set these experiments to a 50/50 split so that results are fair and budget is shared evenly.

For one of our clients, we ran a campaign experiment across a few campaigns whose keywords had a low Quality Score. We wanted to test whether changing the ads' landing page would increase the Quality Score. It successfully doubled for those keywords, so we applied the experiment to the original campaigns.

In another experiment, we are running an ad variation test comparing sending users to a specific product description page (PDP) versus a whole-range landing page, with the aim of seeing which landing page achieves a higher conversion rate (CVR).

Reporting

Once you have run the experiment for a length of time (ideally 4+ weeks, to gather enough data), you can see its performance via the reporting section, which is always visible at the top of the experiment page. This shows the percentage differences in metrics between the two variants.
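The article doesn't prescribe a statistical test, but one common way to judge whether the percentage difference between the two variants is meaningful, rather than noise, is a two-proportion z-test on their CTRs. The figures below are invented for illustration:

```python
import math

def two_proportion_z(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test on the CTRs of variants A and B.

    Returns (z, p_value), where p_value is the two-sided p-value
    under the pooled-proportion normal approximation.
    """
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Made-up example: variant B's CTR looks higher; is it significant?
z, p = two_proportion_z(clicks_a=300, imps_a=12000,
                        clicks_b=420, imps_b=11800)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (conventionally below 0.05) suggests the CTR difference is unlikely to be chance, which supports applying the winning variant; if p is large, let the experiment run longer before deciding.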

If the experiment campaigns/ads have performed strongly, you can simply click the 'Apply' button and the changes will transfer into the original campaigns.

These experiments are great for testing keyword Quality Score, landing pages, CVR and CTR across different ad copy, and you can test a whole campaign without affecting your current one.

Don’t Let It Become Like Wallpaper

It is crucial not to let your ad copy become like wallpaper; refresh it to align with trends and seasonality. Customers continually want to see appealing messages from advertisers, so don't fall back on generic messaging for your own ease. Persuade your customers to click through!

Claire Henley

PPC Executive

Claire is a PPC Executive at Impression.

Claire has specialist knowledge in PPC.
