Sometimes the littlest things make the biggest difference. It’s an adage our customers have proven over and over again by multivariate testing their ad creative on our platform.
By mixing and matching a set of creative variables within a modular ad design, and testing every combination of those variables, they’re able to discover incremental changes that add up to huge performance gains. (Some of them are doing it to the tune of 30% YOY growth.)
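To see why "every combination" adds up so quickly, here is a minimal sketch with hypothetical creative variables (the variable names and values are illustrative, not from any customer's actual test):

```python
from itertools import product

# Hypothetical creative variables in a modular ad design.
images = ["model", "no_model"]
headlines = ["Grow your career", "We're hiring"]
dividers = ["wavy", "straight"]

# Multivariate testing builds one ad variant per combination.
variants = list(product(images, headlines, dividers))
print(len(variants))  # 2 x 2 x 2 = 8 ad variants
```

Adding even one more variable, or one more value per variable, multiplies the variant count, which is how a handful of small creative choices can yield dozens or hundreds of testable ads.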
Minute changes like this work best once you’ve found an ad that’s already performing well with its intended audience. Once you have that high performer, subtle changes can help boost conversion rates even more.
Take the four sets of ads below, for example — all real ads our customers tested on our platform. The ads in each set are nearly identical but had hugely different conversion rates. And each set illustrates a small but powerful (and proven!) tweak you can add to your own creative testing plan.
Shape or angle of text box
The field of color that contains your ad’s text is an easy place to play around with small changes. SolaWave, a direct-to-consumer skincare company, tested eight total ad variants in this experiment. Half of them featured a wavy line down the middle and the other half featured a straight line. The straight line resulted in a 350% increase (!!!) in add-to-cart rate.
Model vs. no model
Human presence, or lack thereof, can have an enormous impact on ad performance. Taylor Stitch, a sustainably made, direct-to-consumer menswear brand, wanted to see which product images performed better with potential customers — those featuring models or those featuring only the clothing. While eleven image variants and 198 ad variants were tested in total, the image and ad without a model shown here proved to be the winner. The result was a 66% decrease in cost-per-purchase.
This type of insight can have greater implications, beyond just ad performance. If images without models perform better than those with models, you might want to rethink how you photograph your products — and how much you really need to budget for talent.
Different calls to action
How you ask your audience to take action has a massive bearing on whether or not they actually will. Testing multiple CTAs can help you find the most effective one.
Acadia, a digital marketing platform for mid-market companies, wanted to understand how different CTAs impacted the performance of their recruitment ads. They tested eight total ad variants, half of which featured “Grow your career” and half of which featured “We’re hiring.” The more concrete “We’re hiring” had a 170% higher click-through rate.
Different graphics
Graphics can help quickly convey what your product or service offers. The key is to try multiple graphics that are similar enough to express the same idea, but different enough to test.
Cactus Credit helps its customers rebuild their credit profiles for a better financial future. Their marketing team wanted to see if a slightly different shape of credit meter graphic would have an impact on lead generation.
Their test contained 12 total ad variants. Half of them featured a half-circle-shaped credit meter, while the other half featured a nearly full circle shape. The full circle graphic generated — wait for it — 53x the leads.
The real kicker with this example is that the ad on the left had been Cactus Credit’s top-performing ad for years. It was easily bested with this one swap, which the Cactus Credit team never would have known about without multivariate ad testing.
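If you want to compute lifts like these for your own tests, the arithmetic is simple: percentage lift is the change in a metric relative to its old value. The numbers below are hypothetical, chosen only to reproduce the kinds of figures quoted above:

```python
def lift(old, new):
    """Percentage change from an old metric value to a new one."""
    return (new - old) / old * 100

# Hypothetical CTA test: click-through rate rises from 1.0% to 2.7%.
print(round(lift(1.0, 2.7)))   # 170  -> "170% higher click-through rate"

# Hypothetical cost metric: cost-per-purchase falls from $50 to $17.
print(round(lift(50.0, 17.0)))  # -66 -> "66% decrease in cost-per-purchase"

# Multiples like "53x the leads" are just the ratio new / old.
print(round(530 / 10))          # 53
```

Note that a negative lift on a cost metric (like cost-per-purchase) is good news, while a negative lift on a conversion metric is not — keep the direction of each metric in mind when reading results.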
It’s hard to believe that improving your conversion rate could come down to something as simple as a straight line or a slightly different CTA. But that’s the magic of multivariate testing — because you can control and test so many variables at once, you uncover creative intelligence you would never be able to surface with traditional testing methods.
The possibilities don’t end with these four examples. Take a look at your current top-performing ad. What kinds of small changes could you test that might boost its ability to convert?
We’ll get you started. Here are a dozen more ideas you can add to your future testing plans:
This might seem like a lot of testing options — and it is. The good news is that you can spread these ideas out over weeks and months of testing and learning.
In general, you’re looking to learn one or two insights with every multivariate test you run on your ad creative. Strategically apply those learnings to future tests and, little by little, you’ll evolve your ad creative into a conversion-driving powerhouse.