
How to Increase Ad Conversion Rate with One Small Change

Discover the power of one small change and learn how it can help you increase your ad conversion rate. Get tips, strategies, and best practices here!
Jess Cook

Sometimes the littlest things make the biggest difference. It’s an adage our customers have proven over and over again by multivariate testing their ad creative on our platform.

By mixing and matching a set of creative variables within a modular ad design, and testing every combination of those variables, they’re able to discover incremental changes that add up to huge performance gains. (Some of them are doing it to the tune of 30% YOY growth.) 
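
To make the "mix and match" idea concrete, here is a minimal sketch of how a modular ad design expands into a full set of testable combinations. The variable names and values are hypothetical, purely for illustration; they are not a specific platform's API.

```python
# Minimal sketch of full-factorial ("mix and match") variant generation.
# The variable names and values are hypothetical, purely for illustration.
from itertools import product

creative_variables = {
    "headline": ["Grow your career", "We're hiring"],
    "image": ["model", "no model"],
    "text_box": ["straight line", "wavy line"],
}

# Every combination of every variable value becomes one ad variant to test.
variants = [
    dict(zip(creative_variables, combo))
    for combo in product(*creative_variables.values())
]

print(len(variants))  # 2 x 2 x 2 = 8 variants
for variant in variants:
    print(variant)
```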

Minute changes like this work best once you’ve found an ad that’s already performing well with its intended audience. Once you have that high performer, subtle changes can help boost conversion rates even more.

Take the four sets of ads below, for example — all real ads our customers tested on our platform. The ads within each set are nearly identical, yet they had hugely different conversion rates. And they all illustrate a small but powerful (and proven!) tweak you can add to your own creative testing plan.

Shape or angle of text box

The field of color that contains your ad’s text is an easy place to play around with small changes. SolaWave, a direct-to-consumer skincare company, tested eight total ad variants in this experiment. Half of them featured a wavy line down the middle and the other half featured a straight line. The straight line resulted in a 350% increase (!!!) in add-to-cart rate.
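
For context on what a figure like that means: the percentage is the relative lift of the winning variant's add-to-cart rate over the baseline's. Here is a tiny sketch of the arithmetic, using hypothetical numbers and assuming add-to-cart rate is measured per click.

```python
# How a relative lift like "a 350% increase in add-to-cart rate" is computed.
# The counts below are hypothetical, only to illustrate the arithmetic;
# add-to-cart rate is assumed here to be measured per click.
def add_to_cart_rate(adds: int, clicks: int) -> float:
    return adds / clicks

baseline = add_to_cart_rate(adds=20, clicks=1000)  # 2.0%
winner = add_to_cart_rate(adds=90, clicks=1000)    # 9.0%

lift = (winner - baseline) / baseline
print(f"{lift:.0%}")  # 350%
```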

Model vs. no model

Two nearly identical Taylor Stitch ads

Human presence, or lack thereof, can have an enormous impact on ad performance. Taylor Stitch, a sustainably made, direct-to-consumer menswear brand, wanted to see which product images performed better with potential customers — those featuring models or those featuring only the clothing. Of the eleven image variants and 198 ad variants tested in total, the image and ad without a model, shown here, proved to be the winner. The result was a 66% decrease in cost-per-purchase.

This type of insight can have greater implications, beyond just ad performance. If images without models perform better than those with models, you might want to rethink how you photograph your products — and how much you really need to budget for talent.

Different calls to action

Two nearly identical Acadia ads

How you ask your audience to take action has a massive bearing on whether or not they actually will. Testing multiple CTAs can help you find the most effective one.

Acadia, a digital marketing platform for mid-market companies, wanted to understand how different CTAs impacted the performance of their recruitment ads. They tested eight total ad variants, half of which featured “Grow your career” and half of which featured “We’re hiring.” The more concrete “We’re hiring” had a 170% higher click-through rate.

Interchangeable graphics

Two nearly identical Cactus Credit ads

Graphics can help quickly convey what your product or service offers. The key is to try multiple graphics that are similar enough to express the same idea, but different enough to test.

Cactus Credit helps its customers rebuild their credit profiles for a better financial future. Their marketing team wanted to see if a slightly different shape of credit meter graphic would have an impact on lead generation.

Their test contained 12 total ad variants. Half of them featured a half-circle-shaped credit meter, while the other half featured a nearly full circle shape. The full circle graphic generated — wait for it — 53x the leads.

The real kicker with this example is that the ad on the left had been Cactus Credit’s top-performing ad for years. It was easily bested with this one swap, which the Cactus Credit team never would have known about without multivariate ad testing.

Adding subtle creative changes like these to your ad testing plan

It’s hard to believe that improving your conversion rate could come down to something as simple as a straight line or a slightly different CTA. But that’s the magic of multivariate testing — because you can control and test so many variables at once, you uncover creative intelligence you could never surface with traditional testing methods.

The possibilities don’t end with these four examples. Take a look at your current top-performing ad. What kinds of small changes could you test that might boost its ability to convert?

We’ll get you started. Here are a dozen more ideas you can add to your future testing plans:

  • Changing background color
  • Trying different headline fonts
  • Trying different font colors
  • Placing copy on left or right side of ad
  • Placing copy on top or bottom of ad
  • Placing brand logo in different corners
  • Increasing or decreasing logo size
  • Using different button shapes
  • Cropping images differently
  • Featuring the same product in different colors
  • Using differently shaped arrows in front of your CTA
  • Using images of the same model with different facial expressions

This might seem like a lot of testing options — and it is. The good news is that you can spread these ideas out over weeks and months of testing and learning. 

In general, you’re looking to learn one or two insights with every multivariate test you run on your ad creative. Strategically apply those learnings to future tests and, little by little, you’ll evolve your ad creative into a conversion-driving powerhouse.
