
How to Remove Advertising Roadblocks for Better Ad Testing

Struggling with your ad testing process? Find out how removing common advertising roadblocks can help you achieve better ROI and uncover insights faster.
Pierce Porterfield

Multivariate testing (MVT) is a powerful way to determine your best-performing ad creative and the individual creative assets that drive it. However, its complexity can often be a roadblock for brands that want to take advantage of it. Here are the major hurdles we see brands facing when it comes to adopting MVT, and how Marpipe is making it easier for more brands to benefit from this powerful testing method and the rich creative intelligence it delivers.

The major hurdles brands face when trying to use MVT

We've broken them down into three categories: creative execution, organizational culture, and display-side control.

Creative execution

The first and most obvious hurdle is the creation of ad variants for a proper MVT test. It can be done manually, but it's slow and unwieldy. Marketing teams run into the following execution issues most often.

  • Scaling complexity. Without some sort of automation in place, multivariate testing at any kind of scale is humanly impossible (see the sketch after this list for how quickly the variant count grows). Designing and resizing every ad variant manually is tedious, can lead to burnout, and takes away from more strategic creative thinking and ideation.
  • Modular design. Traditionally, creative teams think of an ad design as one defined unit. The creative assets within — copy, image, etc. — only make sense when paired together. This approach is too inflexible for MVT, in which creative assets must be interchangeable to be tested properly.
  • Asset storage. Creative asset storage tools are incredibly utilitarian and haven't evolved the way other creative automation solutions have. They merely house assets and don't give any insight into how those assets perform — which should be the main factor considered when choosing which elements to use in an ad.
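
To make the scaling problem concrete, here's a rough back-of-the-envelope sketch in Python. The asset lists and counts are invented for illustration; the point is simply how fast a full matrix of combinations outgrows what a team can design and resize by hand.

```python
from itertools import product

# Hypothetical asset lists; a real test would pull these from your own creative library.
headlines = ["Free shipping on every order", "Ships in 24 hours", "30-day returns"]
images = ["lifestyle.jpg", "product_closeup.jpg", "ugc_photo.jpg"]
ctas = ["Shop Now", "Learn More"]
sizes = ["1080x1080", "1200x628", "1080x1920"]

# Every combination of assets is one ad variant, and each variant
# still has to be produced in every placement size.
variants = list(product(headlines, images, ctas))
total_designs = len(variants) * len(sizes)

print(f"{len(variants)} variants x {len(sizes)} sizes = {total_designs} designs")
# 18 variants x 3 sizes = 54 designs
```

Even this modest matrix means dozens of hand-built designs, and every new headline, image, or size multiplies the total again.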

Organizational culture

The second hurdle is the environment and long-held cultural norms that have relegated ad testing to a nice-to-have rather than a business imperative. These are the common roadblocks brands run into in this arena.

  • Misalignment of creatives and marketers. Creatives and marketers too often work in silos. Marketers get very little insight into the creative process and creatives receive very few learnings that can be applied to future ad creative. Goals (e.g. producing big bold ideas vs. producing ads that convert consistently) are often seen as at odds, too.
  • The bias of the traditional creative approval process. Ad creative and design decisions are rarely backed by any valid data; instead, they're fueled by deeply held opinions and "intuition." This is a difficult cultural norm to break, as it concentrates decision-making power in the highest-ranking people on the team, who rarely want to lose that power.
  • The withholding of creative insights. Often, multiple teams from multiple vendors and agencies work on one brand. It's a challenge to ensure that all teams share knowledge about what performs for the brand and what doesn't, so all touch points along the buyer journey can follow those guidelines. It's rare to have one source of truth for creative insights that anyone who touches the brand can tap into.
  • Seeing testing as a short-term investment. Because of the short-term nature of traditional ad testing, it's not a viable form of long-term market research. With A/B testing, for example, each testing cycle is self-contained: no previous learnings are applied, and no new learnings carry forward. This makes continuous investment in testing a hard sell.

Display-side control

The third and final roadblock is the loss of control brands experience at the hands of ad serving platforms. Even when brands prioritize testing their ad creative, the following challenges render the test results difficult to interpret at best, and unusable at worst.

  • Limited line of sight into creative data. Ad serving platforms are notorious for not giving marketers the visibility they need into how their ads are being tested or into the resulting data. This makes it nearly impossible to understand which variables are most likely to affect ad performance, not to mention which changes to make in order to improve performance.
  • "Optimizations" causing uneven spend. In an effort to eke out more performance, ad platforms will automatically put more budget behind an early high-performing ad, resulting in uneven spend across all ads in the test. This can be frustrating for marketers who are trying to control all possible variables for valid test results.

How Marpipe is making it easier for brands to take advantage of MVT

Marpipe was designed for brands to easily implement MVT and reap the benefits of the resulting creative intelligence. Here's how we're breaking down the main barriers they face.

On the creative execution front, Marpipe has developed an automated solution that eliminates the need for manual design and resizing of ad variants. This not only saves time and resources but also allows for more strategic thinking and ideation from creative teams. 

In addition, Marpipe's modular approach to design ensures that creative assets are interchangeable and can be effectively tested. We've built (and continue to build) our platform with an eye on full creative freedom. Features like custom text wrapping, individual edits, and crop variants give creatives the flexibility to design around their assets, and not be constrained by them. 

Finally, every asset within Marpipe is tied to its historical performance data, making it easier to know for sure which images, headlines, colors, and more should be used to de-risk performance.

When it comes to organizational culture, our platform inherently aligns creatives and marketers by giving them a common set of goals and source of creative data — no more working in silos or waiting on performance data to know what to do next. These two once disparate teams can now make sense of all the data that a multivariate test generates, together. Marpipe's platform analyzes and visualizes your data for you (buh-bye mind-melting, manually created spreadsheets) so everyone can quickly identify what's performing and why.

With performance data tied to each creative element, design and messaging decisions can now be driven by data and not fueled by opinions or biases. Debates around which colors, images, and headlines will work best can be a thing of the past. 
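
To illustrate what element-level analysis looks like, here's a minimal sketch (in Python, using pandas) that rolls variant results up to individual headlines and images. The column names and numbers are invented for the example rather than taken from Marpipe's actual data schema; inside the platform, this kind of breakdown is done and visualized for you.

```python
import pandas as pd

# Invented per-variant results, with the assets used in each variant.
results = pd.DataFrame({
    "headline": ["Free shipping", "Free shipping", "Ships in 24 hours", "Ships in 24 hours"],
    "image": ["lifestyle.jpg", "product.jpg", "lifestyle.jpg", "product.jpg"],
    "impressions": [10_000, 9_800, 10_200, 9_900],
    "clicks": [180, 120, 240, 150],
})

# Roll performance up to each creative element rather than the whole ad,
# so headline and image decisions can rest on data instead of opinion.
for element in ["headline", "image"]:
    by_element = results.groupby(element)[["impressions", "clicks"]].sum()
    by_element["ctr"] = by_element["clicks"] / by_element["impressions"]
    print(by_element.sort_values("ctr", ascending=False), "\n")
```

In this toy example, the rollup makes it obvious that "Ships in 24 hours" and the lifestyle image out-pull their alternatives, regardless of which specific ad they appeared in.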

Also, each ad test in Marpipe adds to a library of historical creative data, so long-standing insights can be shared freely between all brand teams. And the performance increases realized from MVT on Marpipe build a case for continuous testing, which effectively becomes a new form of real-time R&D.

Finally, Marpipe puts brands back in full control of their testing campaigns. Our platform provides detailed, transparent data — owned by brands themselves — on every single creative asset within your test: images, copy, CTAs, background colors, offers, and more. 

Plus, we structure your tests so that each ad variant is in its own ad set, which means that we can control for an even spend across all variants.
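
As a rough sketch of that structure, the idea is simply one ad set per variant, each with an identical budget. The class and field names below are illustrative stand-ins, not an actual ad-platform or Marpipe API.

```python
from dataclasses import dataclass

@dataclass
class AdSet:
    name: str
    daily_budget: float  # in account currency
    variant_id: str

def build_even_spend_test(variant_ids: list[str], total_daily_budget: float) -> list[AdSet]:
    # Split the budget evenly so no single variant can absorb a disproportionate share of spend.
    per_variant = total_daily_budget / len(variant_ids)
    return [
        AdSet(name=f"mvt-{vid}", daily_budget=per_variant, variant_id=vid)
        for vid in variant_ids
    ]

ad_sets = build_even_spend_test([f"variant-{i:02d}" for i in range(1, 19)], total_daily_budget=180.0)
print(ad_sets[0])  # AdSet(name='mvt-variant-01', daily_budget=10.0, variant_id='variant-01')
```

Because every variant gets the same daily budget, an early winner can't quietly soak up the rest of the test's spend, and results stay comparable across variants.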

Helping you leapfrog the hardest parts of multivariate testing

While an inarguably powerful tool for marketers, multivariate testing can be difficult to adopt due to current hurdles surrounding creative execution, organizational culture, and display-side control. 

Marpipe is making it easier for brands to take advantage of MVT by removing these obstacles. By doing so, brands can benefit from the rich creative insights MVT provides, moving them lightyears ahead of those that don't.

The importance of MVT to marketing success cannot be overstated. It has become essential for uncovering the best-performing ad creative and assets — and strategically capitalizing on them.

