
Running an A/B Test with Google Drafts and Experiments

September 4, 2018

When it comes to PPC campaign management and optimization, A/B testing is key.

In my past life working for agencies and directly with brands, we tested new creatives every two weeks. By constantly running these tests, we were able to pinpoint which wording drove people to click our ads (given all the competition on the SERP) and identify the combination of ad elements that delivered the best CTR and CVR.

To put our plan into action, we went through these phases for every A/B test:

  • Preparation
  • Execution
  • Analysis
  • Preparation for the next test (based on learnings)


Preparation


Measuring Success

It’s crucial to outline measurement KPIs up front so you understand what a successful test looks like. For example, is the end goal to drive incremental conversions and/or revenue, a specific ROI the test campaigns have to hit, or traffic growth (impression share, clicks, etc.)?

Having a clear picture of success will make analysis a lot easier, and help you quickly identify your test winner.
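To make those KPI definitions concrete, here’s a minimal Python sketch (the numbers are made up) of the kind of metrics you might lock in before launch, so control and experiment are judged by exactly the same formulas:

```python
# Minimal sketch: define the KPIs you plan to judge the test on, so both
# campaigns are evaluated with the same definitions. Numbers are hypothetical.

def kpis(impressions, clicks, conversions, cost, revenue):
    """Return the common PPC metrics used to compare control vs. experiment."""
    return {
        "ctr": clicks / impressions,   # click-through rate
        "cvr": conversions / clicks,   # conversion rate
        "cpa": cost / conversions,     # cost per acquisition
        "roas": revenue / cost,        # return on ad spend
    }

control = kpis(impressions=120_000, clicks=4_800, conversions=240,
               cost=9_600, revenue=28_800)
print(control)  # e.g. {'ctr': 0.04, 'cvr': 0.05, 'cpa': 40.0, 'roas': 3.0}
```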

Test Elements

When you create a new test, review what you’ve tried in the past. For creative tests, what worked and what didn’t? For bidding tests, how does performance look right now?

Always test one element at a time. Changing several elements in the same test compromises the results, because the goal is to identify the exact element that takes your performance to the next level.

Execution


Once you’ve solidified your methodology and elements, it’s time to set up the test.

While you have various implementation options at your disposal, one way to run a PPC test is with Google’s drafts and experiments. According to Google, using drafts and experiments “lets you propose and test changes to your Search and Display Network campaigns.”

Drafts and experiments mirrors a selected campaign, creating a complete duplicate (the draft) in which you can change your test variables.

Once you’re happy with the changes and the test element within the campaign, convert the draft into an experiment and set it live.

There are a few things to keep in mind when you’re launching an experiment campaign for A/B tests.

Gather Historical Data

Since experiment campaigns are created from scratch, they won’t have any historical data (e.g., no established Quality Score). So, to make sure you run a fair test against the existing setup, allow at least two weeks for the experiment campaign to gather historical data.

Use the Right Parameters

Depending on the tracking solution you’re using, review the elements you track and attribute on. Experiment campaigns are created by mirroring existing (i.e., control) campaigns, and objects like keywords and creatives will have duplicate publisher IDs.

Some advertisers use the {creative} Google Ads ValueTrack parameter for the creative ID to attribute conversion data at the creative level. In this case, make sure you recreate your ads for the experiment campaign before launch to generate unique publisher IDs.
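As an illustration of why this matters, here’s a small Python sketch (not a Google API call) showing how a {creative}-based tracking template resolves for a control ad versus a recreated experiment ad. The template, URL, and IDs are hypothetical; {lpurl}, {campaignid}, and {creative} are standard ValueTrack placeholders.

```python
# Illustrative only: simulate how ValueTrack-style placeholders resolve.
# If the experiment ad kept the control ad's creative ID, both URLs would
# carry the same adid and their conversions would be attributed together.

TRACKING_TEMPLATE = "{lpurl}?cid={campaignid}&adid={creative}"

def resolve(template, lpurl, campaign_id, creative_id):
    """Substitute ValueTrack-style placeholders with concrete IDs."""
    return (template
            .replace("{lpurl}", lpurl)
            .replace("{campaignid}", str(campaign_id))
            .replace("{creative}", str(creative_id)))

# Hypothetical IDs
print(resolve(TRACKING_TEMPLATE, "https://example.com", 111, 55501))  # control ad
print(resolve(TRACKING_TEMPLATE, "https://example.com", 222, 55502))  # recreated experiment ad
```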

Select the Right Budget Split

Google Ads allows advertisers to select a budget split between their control and experiment campaigns. While many advertisers select a 50/50 split, keep in mind that various factors may affect the actual split during your test.

For instance, impression, click, and cost data will never match the selected budget split exactly, since the setting only splits your spend, not the SERP auction itself. The split also doesn’t cap either campaign’s budget, and in some cases traffic may shift toward one of the tested campaigns.

By way of example: one of our clients tested two different bidding strategies in their account. Although we initially set the budget split at 50/50, traffic (impressions, clicks, and cost) shifted toward the experiment campaign over time, since the LTV assigned to its conversions was much higher. This resulted in higher bid calculations and higher traffic volumes.
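A quick way to keep an eye on this while the test runs is to compare the realized split against the one you selected. Here’s a small sketch with hypothetical numbers; swap in your own downloaded totals:

```python
# Quick check (hypothetical numbers): compare the realized traffic split
# against the 50/50 you selected. Drift like the LTV-driven example above
# shows up here first.

control = {"impressions": 90_000, "clicks": 3_600, "cost": 7_200.0}
experiment = {"impressions": 130_000, "clicks": 5_400, "cost": 11_800.0}

for metric in ("impressions", "clicks", "cost"):
    total = control[metric] + experiment[metric]
    share = experiment[metric] / total
    print(f"{metric}: experiment share = {share:.1%}")
```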

Analysis


Well done, you’re now at the finish line of your first test!

If you prepared well, this step will be nice and easy. You already know which metrics you’re aiming to improve, so simply download data for your control and experiment campaigns and review the results based on your KPIs.
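For instance, here’s a minimal sketch of that comparison, assuming you’ve exported the two campaigns as CSVs with impressions, clicks, conversions, cost, and revenue columns (the file and column names are placeholders). It uses pandas plus a two-proportion z-test from statsmodels to check whether a conversion-rate difference is more than noise:

```python
# Sketch of the analysis step; file names and column names are hypothetical.
import pandas as pd
from statsmodels.stats.proportion import proportions_ztest

control = pd.read_csv("control_campaign.csv").sum(numeric_only=True)
experiment = pd.read_csv("experiment_campaign.csv").sum(numeric_only=True)

# Recompute the KPIs you defined during preparation for each campaign.
for name, row in (("control", control), ("experiment", experiment)):
    print(name,
          f"CTR={row['clicks'] / row['impressions']:.2%}",
          f"CVR={row['conversions'] / row['clicks']:.2%}",
          f"ROAS={row['revenue'] / row['cost']:.2f}")

# Two-proportion z-test on conversion rate: is the lift real or just noise?
stat, p_value = proportions_ztest(
    count=[experiment["conversions"], control["conversions"]],
    nobs=[experiment["clicks"], control["clicks"]],
)
print(f"p-value for CVR difference: {p_value:.3f}")
```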

The next step: take your learnings, prepare your next test, and keep improving your accounts. ;) Ready, set, go!

Testing with Marin


Here at Marin, we’ve built a feature that lets you seamlessly track and accurately attribute conversions at every level, without needing to recreate publisher IDs for any of the tested elements. Contact your Marin Customer Success team to learn more. Or, if you’re new to Marin, just get in touch.

Aleks Nikitina

Marin Software
