Paid Social: Small Variables of A/B Testing Can Drive Higher Conversions
What a difference a small change can make. Higher conversions could be hiding behind your underperforming paid social campaign. A/B testing even small variables can reveal changes that lead to better performance. Here are five angles to consider.
1. Getting Visuals Just Right
The same campaign with a different promotional visual could have vastly different results. While there are plenty of truisms about visual advertising in general, it’s hard to say which particular visual asset will drive the most conversions. Enter A/B testing.
Consider testing a full-frame visual social campaign against a rich media thumbnail ad preview version. Each has its benefits. The big image will tend to get more eyes on it, but a social post with a thumbnail and link feels more clickable.
Images of a person or group of people sometimes perform better than abstract graphics or a photograph with no humans. It can take some tweaking to get this right, too. What the people are doing or looking at can also make a difference.
2. Experimenting With Word Choice
Words matter almost as much as visuals, and the same ad with slightly different copy could make a difference. Does calling an online freebie an "ebook" get you more email leads than saying it's a "marketing guide"? The only way to know for sure is to test. Run both versions of the ad, then optimize by shifting ad spend to the better-performing version.
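As a rough illustration of that last step, picking the winner comes down to dividing conversions by impressions for each variant. The variant names and numbers below are hypothetical, not real campaign data:

```python
# Hypothetical results for two ad variants after a test run.
# Names and figures are illustrative only.
variants = {
    "ebook": {"impressions": 4800, "conversions": 120},
    "marketing guide": {"impressions": 5100, "conversions": 96},
}

# Conversion rate = conversions / impressions for each variant.
rates = {
    name: v["conversions"] / v["impressions"]
    for name, v in variants.items()
}

winner = max(rates, key=rates.get)
print(f"Shift spend toward: {winner} ({rates[winner]:.2%})")
```

In this made-up example, "ebook" converts at 2.50% versus 1.88% for "marketing guide," so spend would shift toward the ebook framing.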
3. Leading With a Data Point
Advertisers can gain a 50% conversion boost by including a specific data point, such as a statistical figure. (Make sure your stat is real, though; that one was just for show!)
However, for the right campaign, an interesting (and true) statistic or data point can make all the difference. Try A/B testing your next campaign with two versions: one that leads with your stat and another without.
4. Varying Copy Length
Whatever your social or ad network platform, you know there's a character limit for the text portion of your ad. But a 150-character limit shouldn't mean all your ads run 149 characters. Varying the length could reveal how to boost your performance.
There’s no hard and fast way to tell beforehand whether short, medium, or long will be best. That means you’ll need to try an A/B test to see which ad performs better.
5. Varying the Image Text
Many digital ads today overlay ad copy on the image asset itself. The same tweaks apply to this overlay text: the wording, the inclusion of a data point, even its position on the image can make a difference.
Choose one and only one variable at a time to play with and split test. If one version of the ad performs better, try another variation against it to see if you can boost the conversion rate even more. Several iterations may be needed with a new campaign to get the most from an ad.
Other A/B Testing Tips
The variables possible for imagery and copy can make a big difference for your conversion rate, but make sure to set your split test up for success. To effectively test your variables, you need some other elements to be carefully controlled.
Only alter one variable at a time. For instance, change a few words of the copy, but leave the rest of the copy and the imagery the same.
Run both ad versions at the same time. If run in sequence, your A/B test results could be skewed by other factors: differing audiences, audience behavior patterns, and so on.
Keep the bid rates comparable. There may end up being some variation in bid prices, but aim for comparable numbers.
Lastly, run your split test long enough to establish a benchmark, usually two weeks or more. If your campaign is too short, your results could be thrown off by holidays or other uncontrolled variables.
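Once the test has run its course, it helps to check that the apparent winner is not just noise. One common option is a two-proportion z-test on the final counts. The sketch below uses only Python's standard library, and the impression and conversion numbers are made up for illustration:

```python
from math import sqrt, erfc

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal distribution
    return z, p_value

# Hypothetical two-week test: variant A vs. variant B.
z, p = two_proportion_z_test(conv_a=100, n_a=5000, conv_b=150, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a p-value below 0.05 suggests a real difference
```

A p-value below your chosen threshold (0.05 is a common default) gives you some confidence the winning variant is actually better, not just lucky.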