
Best practices for high-performance creative

A look at the best practices for high-performance creative – and common pitfalls to avoid.
Jess Cook

Performance creative has a singular focus: to drive a desired outcome or behavior from the user. In the world of performance ad creative, that typically means engagement, clicks, or purchases.

Performance creative is usually highly trackable, produces rich data that can be analyzed by marketers, and is switched out frequently to determine what most often gives the desired outcome.

Examples of paid social performance creative

This means that performance creative is always changing and evolving, and that can make it hard to keep up. But never fear — we’ve compiled some best practices for high-performance creative, so you can stay ahead of the curve.

Common mistakes to avoid with performance creative

There are a few common mistakes that marketers make with performance creative, which can really hurt the overall effectiveness. Here are a few to avoid:

Making decisions based on assumptions

This is probably the number one mistake. Marketers often make decisions about what will work best without actually testing it first. Just because something worked in the past, or you think it should work, doesn’t mean it actually will.

Studies show that marketers can only predict winning ad creative 52% of the time. So while using your creative instincts can be helpful, they shouldn’t be the sole basis for creative decisions.

Using inadequate testing techniques

Even if you are testing, you might not be doing it effectively. Make sure you’re using the right methodologies — such as sequential or multivariate testing — instead of relying on A/B testing alone.

A/B testing, which measures the performance of two or more markedly different creative concepts against each other, leaves a lot to be desired in today’s advertising landscape. Audience targeting has become diluted, social platform and ad network algorithms change sporadically, and brands and agencies are constantly being pushed to do more with less.

Better performance creative — nay, the best possible performance creative — is the fastest way around these major market shifts. So while A/B testing isn’t obsolete (any testing is better than none at all), it does not provide the level of data marketers need to build “one ad to rule them all.”

Testing creative sporadically

Testing should be an ongoing process, not something you only do every once in a while. The more you test, the more data you’ll have to work with, and the more insights you’ll gain — huzzah!

For example, let’s say you find that images of women increase ad performance. You could then test women of different ages to see if any of them give you an incremental boost. And then you could test women of the winning age group across many different ethnicities.

Probing your winning performance creative further can help you unlock conversion rate increases you didn’t even know were there.

Best practices for high-performance creative

Now that we’ve gone over some mistakes to avoid, let’s talk about best practices. Here are a few things you can do to make sure your creative is set up for success.

Know your audience

This one should be a no-brainer, but you’d be surprised how many marketers don’t really know who they’re marketing to. Take the time to understand your audience (or audiences) in terms of demographics, pain points, needs, and desires.

Let these insights drive the choices you make about the assets (images, colors, copy, etc.) you include in your performance creative.

Start with a hypothesis

Asking yourself, “What do I want to learn here?” is the very first step in kicking off a creative test. A well-thought-out hypothesis will help inform which assets — images, headlines, calls to action, etc. — should be tested in your ad creative.

Without a hypothesis, you lack a rationale for which assets to test, leaving you unable to categorize your chosen images in a meaningful way. Your test will lack focus, and the resulting data won’t be as powerful or meaningful as it could be.

Automate strategically

Using automation can help you build high-performance creative faster than ever before. There are tools that speed up the performance creative design process, tools that launch structured ad tests, tools that help you analyze your creative data — and some that do all of the above so you can build, test, and analyze 3x faster.

Automating your performance creative workflow can take your ad optimization to the next level.

Have — and test — clear offers

The offer is the heart of any piece of performance creative, so it’s essential that it be crystal clear. If users don’t know or like what they’re getting, they’re less likely to convert.

You also have to understand which kinds of offers appeal to which audiences. (See “Know your audience” above.) % off, $ off, BOGO, and free shipping are great places to start.

Keep on testing

Continuous testing leads to continuous learning. This arms you with data about exactly which assets historically work, and which don’t. You can apply learnings not only to future ad creative but to every other brand touchpoint along the customer journey — billboards, audio ads, site design, packaging, etc. — increasing performance and conversion rates holistically.

Marpipe’s approach to performance creative

Marpipe takes the guesswork out of performance creative. Instead of the A/B testing scenario where you only learn the overall ad winner, Marpipe helps you see the data at an asset level.

When you build an ad in Marpipe, everything you use to create your ad has performance data tracked at that asset level. At the end of the test, you’ll not only know which ad won, but you’ll also know exactly which pieces of the ad were winners, too. Marpipe aggregates all the data for each of those assets, so no matter what they’re paired with, you will know how they performed overall. 
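To make the idea of asset-level aggregation concrete, here’s a minimal sketch of how results could roll up from individual ads to the assets inside them. The data, asset names, and metrics are hypothetical — this illustrates the concept, not Marpipe’s actual implementation.

```python
# Hypothetical sketch: roll each ad's results up to the individual assets
# it used, so every asset gets an overall score across all its pairings.
from collections import defaultdict

# (assets used in the ad, clicks, impressions) -- illustrative test results
ad_results = [
    ({"image": "lifestyle", "headline": "Save 20%"}, 90, 1000),
    ({"image": "lifestyle", "headline": "Free Shipping"}, 60, 1000),
    ({"image": "product", "headline": "Save 20%"}, 50, 1000),
    ({"image": "product", "headline": "Free Shipping"}, 40, 1000),
]

totals = defaultdict(lambda: [0, 0])  # asset -> [clicks, impressions]
for assets, clicks, impressions in ad_results:
    for asset in assets.values():
        totals[asset][0] += clicks
        totals[asset][1] += impressions

for asset, (clicks, impressions) in totals.items():
    print(f"{asset}: {clicks / impressions:.1%} CTR")
```

Notice that the “lifestyle” image wins overall (7.5% CTR vs. 4.5%) no matter which headline it was paired with — that’s the kind of insight a plain A/B test of whole ads can’t surface.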

Here are some of the features that set our platform apart:

Modular design approach

Modular design is an approach that uses placeholders within a template to hold space for creative elements to live interchangeably. It's a foundational pillar of designing ads at scale on Marpipe, and what allows each design element to be paired with all other design elements programmatically. This gives you total control over all your variables for an effective test.

Example of a modular ad template with placeholders for image, logo, headline, CTA, and button.

Within Marpipe, you can break your placeholders down into two types:

  • Variable: Any placeholder for an asset you’re going to test. These assets should be interchangeable with one another. In other words, you can swap any asset of the same type into the placeholder and the design still works.
  • Fixed: Any placeholder staying the same across every ad variation. These assets should work cohesively with your variable assets that are being swapped in automatically. One common example of a fixed asset is your brand logo. It typically stays the same size, color, and location in each variation.
Modular ad design with fixed and variable elements labeled
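The pairing logic behind modular design can be sketched as a Cartesian product of the variable placeholders, with fixed assets merged into every combination. The asset names and template structure below are illustrative, not Marpipe’s API.

```python
# A minimal sketch of how modular placeholders expand into ad variants.
from itertools import product

# Variable placeholders: each holds interchangeable assets to test
variable_assets = {
    "image": ["lifestyle_photo", "product_shot", "ugc_clip"],
    "headline": ["Save 20% Today", "Free Shipping on All Orders"],
    "cta": ["Shop Now", "Learn More"],
}

# Fixed placeholders stay identical across every variation
fixed_assets = {"logo": "brand_logo.png"}

# Pair every variable asset with every other, programmatically
variants = [
    {**fixed_assets, **dict(zip(variable_assets, combo))}
    for combo in product(*variable_assets.values())
]

print(len(variants))  # 3 images x 2 headlines x 2 CTAs = 12 variants
```

Every variant carries the fixed logo, and each variable slot accepts any asset of its type — which is exactly why interchangeability within a placeholder matters.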

Placement Variants

Designing ad creative to appear in every possible placement size — while still keeping its intrinsic content intact — is an extremely time-consuming process when done manually. 

Placement Variants lets you design ad variations in multiple sizes all at once. All of the placement variations you edit will be available when launching your test. 

Placement variants demo

This is a huge timesaver for creative teams. By creating one ad, you’re actually creating ads in every size you could possibly need — without having to constantly start from scratch.

Built-in Confidence Meter

Many advertisers prefer their ad tests to reach statistical significance — or stat sig — before considering them reliable. (Stat sig refers to data that can be attributed to a specific cause and not to random chance.)

Marpipe is the only automated multivariate creative testing platform with a built-in live statistical significance calculator. We call it the Confidence Meter.

In real time, it helps you understand:

  • whether or not a variant group has reached high confidence
  • if further testing for a certain variant group is necessary
  • whether repeating the test again would result in a similar distribution of data
  • when you have enough information to move on to your next test
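For intuition, here’s a rough sketch of the kind of check a significance calculator performs. Marpipe’s exact method isn’t public; this uses a standard two-proportion z-test purely as an illustration, with made-up numbers.

```python
# Illustrative two-proportion z-test comparing two variants' conversion rates.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Variant A: 120 conversions from 2,000 impressions; B: 80 from 2,000
z = two_proportion_z(120, 2000, 80, 2000)
print(abs(z) > 1.96)  # True -> significant at the 95% confidence level
```

A z-score beyond ±1.96 roughly corresponds to “high confidence”; below that threshold, the honest answer is “keep testing” — which is the question the Confidence Meter answers for you continuously.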

Template library

Your ad template is the controlled container inside of which all your variables will be tested. It should be flexible enough to accommodate every creative element you want to test, and yet still make sense creatively no matter the combination of elements inside. (This is where modular design principles become super important!)

Marpipe has more than 130 pre-built modular templates for you to choose from in our library — all based on top-performing ads. Or you can build your own. To make sure your template exhibits the look and feel of your brand, you can upload brand colors, fonts, and logos right into Marpipe for all your tests.

Test budgeting made simple

Your overall ad creative testing budget will determine:

  1. how long you run your test 
  2. your budget per ad group 

The larger the per-test budget, the more variables you can include per test.

Marpipe shows you how your budget breaks down before you launch your ad test. So if you create more variants than you have creative testing budget for, you can simply remove elements and shelve them for a future test. (Vice versa: if you find you haven’t included enough variants to hit your allotted creative testing budget, you can add variables until you do.)

In-platform budget calculator

Marpipe also places every ad variant into its own Facebook ad set, each with its own equal budget. This prevents the platform algorithm from automatically favoring a variant and skewing your test results.
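The budget math above is simple to sketch: split the total test budget equally across variant ad sets, or work backwards from a minimum spend per variant to see how many variants you can afford. The dollar figures and function names here are hypothetical.

```python
# Hypothetical sketch of equal-budget test allocation across variant ad sets.
def budget_per_ad_set(total_budget, num_variants):
    """Equal budget share for each variant's dedicated ad set."""
    return total_budget / num_variants

def max_affordable_variants(total_budget, min_spend_per_variant):
    """How many equal-budget ad sets a given test budget can support."""
    return int(total_budget // min_spend_per_variant)

total_budget = 1200.00   # total creative-testing budget, in dollars
num_variants = 12        # e.g. 3 images x 2 headlines x 2 CTAs

print(budget_per_ad_set(total_budget, num_variants))  # 100.0 per ad set
print(max_affordable_variants(total_budget, 75.00))   # 16 variants max
```

Equal allocation is the key design choice: it keeps the ad platform’s delivery algorithm from piling spend onto an early favorite and skewing the comparison.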

Powering multivariate creative testing is next to impossible without a platform like Marpipe — for serious paid social spenders, it’s the best way to handle creative testing. 

A demo can help answer your questions and show you how it works.
