
Why Creative Testing is Still a Blue Ocean

And Why So Few People Test Effectively
Brett Friedman

Every day, I ask over 800,000 people a new question. On average, I receive 20 responses per question. Today, I got four responses. That's just a fifth of the usual, just 20%.

The question: "Do you run A/B/n or multivariate tests for your ad creative?"

The answers:

Sophia H. Yang, Head of Marketing, Fig Loans

"Since joining Fig, I have revamped our branding and advertising strategy, including the creation of our A/B/n testing of creative assets, copy, and landing pages for our ads. We actively test several variations of our ads to see which resonate the most with our potential borrowers, and can bring in the most conversions.

With COVID-19 completely changing how advertisers are approaching online advertising, the most recent standout results we've seen have been with changes to copy.

Copy testing has also been the smallest change that has led to the largest return. We chose to run a couple of our most effective creative ads to the same audiences, only altering the copy to call out coronavirus & stimulus checks. After a few weeks of testing, we were able to achieve nearly 250% of the click-through rate and a 20% reduction in our acquisition cost with the COVID-related copy.

Though some advertisers have been shying away from directly mentioning coronavirus, our testing has proven the benefits of trying out more aggressive copy with your most successful creative assets."

Andrej Bachtin, Digital Marketing Consultant

"A/B testing is incredibly important to me as it allows me to run ads a lot more effectively. For example, I've changed the CTA button on a LinkedIn Lead Gen ad from 'Get a quote' to 'Learn more', which resulted in a 20% increase in Lead Form Submissions."

Jeff Moriarty, Marketing Manager, Moriarty's Gem Art

"All of our ads have at least two tests running against each other at all times. Many times, the one we think would have the best results ends up being the exact opposite. One strategy is using the Keyword Insertion strategy. This always leads to the best click-through rates, but normally poor conversion rates. Our company tested our Countdown option for one of our sales recently. It led to a 4% increase in click-through rates and our conversion rate doubled. Took 2 minutes to set up."

Cameron Dunn, Director, Paid Search, SQRD Media

"I run A/B tests with every campaign I run. Google Ads, Facebook, LinkedIn, etc. I've seen dramatic changes in conversion rate by just tweaking a few words. I've seen messaging that did exceptionally well to a female audience, but performed miserably in front of a male audience. Through it all, I've learned that my instincts aren't always right and that I should let my target audience decide which messaging is best. Data, not gut, should guide direction."

The results are clear, but four people is a small sample size. So take it from Facebook, whose platform reaches 96% of the advertising market: "Testing results in a 26% reduced CPA."

Ad testing has been around for well over a century. Sure, the tech has changed from papyrus to press, from paper to computer, but overall, the process is the same scientific method it's always been. So you're left to wonder: why hasn't it seen mainstream use?

The answer is simple: it's hard. It's hard to design proper experiments, it's hard to know where to start, it's hard to figure out which elements actually improve performance and which hurt... creative testing is hard. Most people aren't willing to do the hard things. We follow the path of least resistance.
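Part of what makes designing a proper experiment hard is knowing whether an observed difference between two ads is real or just noise. As a minimal sketch (with made-up click and impression numbers, not data from any source above), here is one common way to check: a two-proportion z-test comparing the click-through rates of two creative variants.

```python
from math import sqrt, erf

def ab_test_p_value(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test on the CTRs of two ad variants.
    Returns the two-sided p-value: small values (e.g. < 0.05)
    suggest the CTR difference is unlikely to be chance."""
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    # Pooled CTR under the null hypothesis that both variants perform alike
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))
    return 2 * (1 - phi)

# Hypothetical test: variant B's CTR looks higher, but is it significant?
p = ab_test_p_value(clicks_a=40, imps_a=2000, clicks_b=62, imps_b=2000)
print(f"p-value: {p:.4f}")
```

With these made-up numbers the difference clears the conventional 0.05 threshold; with fewer impressions the same CTR gap often would not, which is exactly the kind of judgment that trips people up.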

However, where there's challenge for most people, there's opportunity for others. If you're willing to do the hard work, you'll be rewarded. Generally, marketing success comes when you've already been where people are going. You're in a marketplace before it's competitive. You're in a blue ocean, with no bloody competition to dye the seas red yet. For example, if you posted frequently to Instagram before it exploded, you'd already have followers, your account would get recommended to new users, and your reach would increase exponentially as the platform grew.

That being said, it's not 1879 anymore. Google Optimize and Facebook's Split Testing tools let you quickly and easily set up basic experiments. Has the wave already passed?

No. Both lack the key ingredient for testing success - the real barrier to entry for the average marketer - discrete creative data. Google and Facebook can tell you which ads work best, but not why they work best. With stringent testing procedures, you can deduce the effects of out-of-image copy and even which images outperform others, but overall, you can't learn whether it's the sun and sky in your ad or the model's brown hair, the WordArt-esque font or the actual in-image words.

Fortunately, you're here reading this right now. Today, there's no tool that gives you the data you truly need to make informed creative decisions. But there will be soon. Within the next few months, Marpipe will be released to the world. At first, only a select few will have access to our platform, but gradually advertisers will all become expert testers and audiences will love the ads they're served.

If this sounds like a world you'd like to live in, if you'd like to ride the wave before the beach is crowded with surfers, sign up for our waitlist in the footer below, on our homepage, or the dedicated waitlist page!
