
Acknowledgement of Country

Banter Group acknowledges the Gundungurra people as the Traditional Owners of the land on which our agency stands, and extends this respect to all First Nations peoples, including Elders past, present and emerging.

Email marketing continues to deliver the highest return on investment within the digital marketing space. This is an edited transcript of a 30-minute webinar, Optimise your email marketing with brilliant testing techniques.

My favourite subject matter when it comes to email marketing is optimising with brilliant testing techniques. One of the reasons I love testing so much is that you can see the immediate impact of your tests. This is the fourth webinar in the series, so if you’ve missed the earlier sessions, catch up with:

Series 1: First Impressions With Subject Lines

Series 2: Designing Emails That Convert

Series 3: Finding Your Content Niche

Email marketing is all about small steps that make big differences. I’m focused on the two steps highlighted within this process, opening an email and clicking within an email, and how to optimise for opens and clicks.

Why Should You Test Email Campaigns?

Email is your highest return on investment channel within digital, hands down. If you can test and improve your campaigns, you can increase the ROI by making small and impactful changes.

Testing allows you to compare and contrast different elements of your email campaigns to see how they impact your subscribers’ reactions to them.

Creating A Hypothesis For Your Email Campaign Testing

When it comes to optimising and testing techniques, there are two key techniques that you can use.

Your first step is to form a hypothesis about what you think will occur, and then formulate a test to find out what really happens.

A test framework determines what you want to test (the control) and what you want to compare it to (the challenger). Your control then becomes the baseline against which you continue to test, introducing new challengers along the way.

Technique 1:  A/B Testing

The first testing technique is called an A/B test.

On the left you have version A, which is your control, and on the right, variation B, the challenger, to determine which option will drive the greatest result. The key to an A/B test is that you have to test like for like, meaning you should only test one element at a time. In this image, that’s the area marked pink and green, showing the headline or image section. This isn’t testing a headline in control A against an image in challenger B; we’re testing the same area in both. That’s the key to an A/B test: it has to compare apples to apples.

How Does An A/B Test Work?

Starting with your whole database, you will have two campaigns determined: variation A and variation B. Determine what percentage of your database will receive variation A and what percentage will receive variation B. These should be the same percentage. The winning variation will then be sent to the remainder of the database.

In this diagram, 10% of the database is allocated to variation A and 10% to variation B, leaving 80% of the database to receive the winning variation.
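As a rough sketch, the 10%/10%/80% split above can be expressed as a simple random allocation. Everything here is illustrative: the `ab_split` helper and the subscriber list are hypothetical, not part of any particular email platform.

```python
import random

def ab_split(subscribers, test_fraction=0.10, seed=42):
    """Randomly allocate a subscriber list into variation A, variation B,
    and a remainder that will later receive the winning variation."""
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)  # shuffle so the groups are unbiased
    n = int(len(pool) * test_fraction)
    group_a = pool[:n]           # e.g. 10% receive variation A
    group_b = pool[n:2 * n]      # e.g. 10% receive variation B
    remainder = pool[2 * n:]     # e.g. 80% held back for the winner
    return group_a, group_b, remainder

# Made-up database of 1,000 subscribers
subs = [f"user{i}@example.com" for i in range(1000)]
a, b, rest = ab_split(subs)
print(len(a), len(b), len(rest))  # 100 100 800
```

Most email platforms do this allocation for you; the point of the sketch is simply that the two test groups must be the same size and drawn at random from the same pool.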

What Should You Test?

Subject Line Test

The first test that you could run as an A/B test is a subject line test.

There are quite a few variations on what to test, including:

  • Using an emoji
  • Using personalisation such as a name, suburb or previous behaviour
  • Time-sensitive actions required
  • Offers
  • Questions

Button Test

Another test option is the colour, shape and text of call to action buttons.

In this example, variation A uses a text link and variation B uses a coloured button. The text remains exactly the same; it’s just the type of call to action that changes.

This test resulted in a 27% increase in click-throughs by using a button instead of a text link. That’s a significant increase in impact with a very minimal change.

Many people believe that green means go and red means stop. Not for this next test!

In this simple button colour test, the red button increased conversions 34% compared with the control green button over 600 visits.

Landing Page Test

Email campaigns typically send customers to a landing page, usually on the brand’s website, in order to take an action. This next test is one of the best landing page tests I’ve seen, from Macquarie Bank. It’s a simple repetition test – the control landing page repeats the same information multiple times on the page.

In the challenger landing page, the design is reduced, with one set of buttons and the headline moved underneath the bullet points. This simplified challenger actually increased the response rate and the project’s return on investment by a whopping 547%. This is enormous. For banking, where the action is taking out a loan or setting up a trading account, that uplift represents a huge return on investment to the organisation. It’s a very simple test to run for an enormous result.

Distraction-less Pages – Invisible Navigation

Here’s another test, also in the finance industry, using what are called distraction-less pages with invisible navigation. When the user clicked through from the email campaign, the control had the navigation menu visible, while in the challenger there is no navigation menu.

If a user has clicked through from your email, they’re already engaged and looking for the next step to take on the landing page. It means the user is relatively committed to carrying out the action. Looking at the response rate again, a 492% increase in conversion simply by removing the opportunity to click into other areas (or distractions) is a terrific outcome.

Cadence Test: High or Low Frequency

A cadence test is simply a frequency test. How frequently should you be emailing customers, and does it make a difference? In this first cadence test, there were 10 emails in the series (variation A) versus a lower frequency of only six emails, the challenger (variation B).

Fewer emails actually drove a 46% increase in orders in this example.

A lot of businesses feel that if they just keep sending more and more emails, they’ll receive a better response rate. In reality, high frequency can be too much for consumers. Fewer communications also create a fear of missing out: consumers aren’t hit with the same message of desperation as often, and they are more inclined to take action.

Design Test

In this test, variation A is a brightly-coloured and detailed example on the left hand side, compared to variation B, a simple and short, less designed email on the right.

  • There are two very different styles altogether.
  • There’s the same messaging in terms of “let’s party” as a button.
  • Different colours.

Variation A, with its big, designed button, actually drove an increase in clicks and an increase in revenue of 48%. That is significant when you are celebrating a birthday and have a series of design elements that celebrate it as well.

A birthday test with a coupon discount might perform very differently to an informational email that isn’t a celebration or offering a discount.

There could be several things that might change, not just design. It could also be the messaging. In this instance, an A/B test is only testing design vs design.

Technique 2:  Multivariate Test

A multivariate test means you can test multiple elements all at the same time, represented in this image by the colours. There are two different coloured headlines and there are also two different coloured content blocks. Using a mix of four different options, there are a lot of different combinations. Now, depending on how many areas you choose to test, you end up with a significant number of permutations available. This means that when user 1 receives an email, it will look different for user 2, user 3 and so on.
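To see how quickly those combinations grow, here’s an illustrative sketch using Python’s itertools. The headline and content-block variant names are hypothetical stand-ins for the coloured elements in the image.

```python
from itertools import product

# Hypothetical elements under test: two headline variants and two
# content-block variants, as in the two-colour example above.
headlines = ["headline_A", "headline_B"]
content_blocks = ["block_A", "block_B"]

# Every permutation of one headline with one content block
combinations = list(product(headlines, content_blocks))
for combo in combinations:
    print(combo)
print(len(combinations))  # 2 x 2 = 4 distinct email versions
```

Each additional element with two variants doubles the number of versions, which is why multivariate tests need a much larger database than a simple A/B test to give each combination a meaningful audience.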

How Does Multivariate Testing Work?

Using these permutations, the combinations will tell you which of the variables outperformed the rest: image A or image B, headline A or headline B, and which combination of those was most successful.

Multivariate testing uses the same core mechanism as A/B testing, but compares a higher number of variables, and reveals more information about how these variables interact with one another. 

Layout Tests

A combination of layout tests is a great way to discover which elements are working best for you. This example from HubSpot demonstrates four different layouts that build upon each other.

Introducing A Challenger

With each winning test you run, you should continue to build your learnings by introducing a new challenger. Just because one test revealed a winner, doesn’t mean that the same combination or element will continue to perform for your business.

Statistical Relevance

Short-term tests still need to maintain statistical relevance to ensure they are robust. There is no point running a single test for a five-day period and simply declaring it valid. This is where you need statistical relevance. Using a simple statistical calculator and inputting your visitor or database numbers and your conversions, you will understand your conversion rate. This will help you determine when a test has achieved statistical relevance, meaning you can take action with confidence that your test is a valid one.

Some campaigns might need to run for two days or two hours; others might need to run for two weeks or two months. It really depends on the size of your database or site traffic, the volume of conversions you see, and over what period.
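A statistical calculator of the kind described above can be sketched as a two-proportion z-test, the standard way to compare two conversion rates. This is an illustrative example with made-up visitor and conversion numbers, not a tool from the webinar.

```python
import math

def conversion_z_score(visitors_a, conversions_a, visitors_b, conversions_b):
    """Two-proportion z-test: z-score for the difference in conversion
    rate between a control (A) and a challenger (B)."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    # Pooled conversion rate across both groups
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    # Standard error of the difference between the two rates
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    return (rate_b - rate_a) / se

# Hypothetical test: 300 visits per variation, 30 vs 48 conversions
z = conversion_z_score(300, 30, 300, 48)
print(z)  # |z| above 1.96 is significant at the 95% confidence level
```

If the z-score sits below the 1.96 threshold, the honest answer is to keep the test running (or use a bigger split) rather than declare a winner early.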

Testing Options

Deciding what to test is half the fun of the test. Here’s a hit-list you can work through.

  • Frequency
  • Intervals
  • Day of the month
  • Day of the week
  • Time of Day
  • Subject Line
  • Pre-header text
  • Offer
  • Creative and Copy
    • Length
    • Specific or generic
    • Positive or negative language
  • Call to Actions
    • Button colours
    • Button words
    • Text v button
  • Opt-in forms
  • Newsletter templates
  • “From” name
  • Transactional Notices
  • Auto-responders

Top Tips For Successful Tests

  1. Always keep your tests really simple. Start with something that you really want to know will make a difference. I would probably start with a subject line test, then a day of the week test, followed by a time of day test.
  2. Always have a hypothesis. Be clear about what you’re trying to prove or disprove.
  3. Always split your database with a large enough segment to ensure statistical relevance.
  4. Always introduce a new challenger with every test. Keep learning!

I hope this helps build your confidence in running great email marketing test campaigns. Optimising your content for your audience will continue to deliver improved ROI on the best ROI channel in digital. If you need some help getting your campaigns off the ground, or coming up with ideas on what to test, drop us a line. We’re here to help.