A/B testing alone doesn’t cut it anymore when it comes to ad creative. Even though it’s automated and typically executed in-market, A/B testing delivers the shallowest level of data available. It can tell you which of two or three ads was the higher performer — but never why.
There’s got to be a better way! *blows hair out of face, infomercial-style.*
In episode 3 of Resting Ad Face, our VP of Performance Marketing, Susan Wenograd, and I talked through four reasons why:
Copywriters, art directors, and designers should want to know why their ads worked or not. And while A/B testing can tell you that an ad did indeed perform well, it can’t deliver the underlying data as to what made it perform well. There’s nothing concrete for creatives to pull forward into future ad concepts.
With multivariate testing, data can be used as inspiration. This removes some of the “fear of the blank page.” Creatives never really have to start from scratch because they have a deep, baseline understanding of what messaging, colors, images, etc. have worked (or not worked) before.
A/B testing often forces creatives and marketers into silos, each side doing the work needed to get to and through the test, but never truly collaborating.
Multivariate testing acts as a forcing function to bring these two sides of the house together. Because this method requires a strategy to be built around which creative assets will deliver solid results, creatives and marketers have to work in lockstep from the outset.
The hypothesis begets the right creative assets, the right creative assets beget the right concepts, the right concepts beget test results, and test results become a shared data set for the entire team — powering both the next round of creative concepts and testing strategy.
There’s often a communication barrier that comes into play when marketers try to relay which ad creative is working or not after an A/B test.
A common interaction is to show the best-performing ad and ask the creative team to “do more of this.” This leaves creative teams asking, “more of what?”
Multivariate testing eliminates this barrier. All creative variables are isolated, so both the media and creative teams understand which elements are boosting performance. This gives both teams a common language to work from moving forward, one built on data and specificity.
For more, watch the full episode — A/B testing ain’t got nothin’ on multivariate — or subscribe to Resting Ad Face on your favorite podcast platform.
[00:00:05] Jess: Hey, everybody. Welcome to Resting Ad Face. I am Jess Cook. I'm the Head of Content at Marpipe. And I am responsible for all things kind of brand content and creative. And I'm here with Susan Wenograd. My work wife.
[00:00:22] Susan: Exactly. My day is not complete unless I hang out with Jess. I'm Susan Wenograd. I am the VP of Performance Marketing at Marpipe. So I take all of Jess's really great creative and figure out ways to promote it and amplify it.
[00:00:38] So today we are gonna talk a little bit about why A/B testing is no longer really the thing that's gonna get you there.
[00:00:48] Jess: Yeah, absolutely. So, you know, I think A/B testing has like been the traditional kind of go-to for, for testing ad creative for a long time.
[00:00:59] [00:01:00] Even before it was automated or digital. But really, today it's just no longer enough. It's not gonna get you that deep level of creative data that you need, especially now that we have so many barriers to targeting in terms of privacy and opt-outs and iOS 14, things like that.
[00:01:21] So kind of the new, modern, richer, deeper way to test ad creative nowadays is called multivariate. It's relatively new for people in terms of using it for ad creative. And so we wanna talk about, you know, the differences between those two things today: why A/B testing alone is no longer enough, and what makes multivariate testing different and better.
[00:01:45] And in order to, to know, you know, why that makes sense, you kind of need to understand one, what is multivariate testing? Like what does that even mean? And, and two, what is the design concept behind [00:02:00] multivariate testing and, and why that's important. It's called modular creative. And we'll talk you through that too.
[00:02:04] So, multivariate testing is a testing method that measures every single element within an ad. So not only are you testing the ad itself, you're testing the elements within it, the headline, the background color, the image, the call to action, the button shape, whatever is in there, right? Whatever your heart
[00:02:25] desires to test. And so what that tells you is not only which ad is the winner, which is what you're getting out of traditional A/B testing, but why, right? You might find that one image, for instance, is always a winner, always gets clicks or drives up purchases, no matter what it's paired with.
[00:02:47] And so that's really the magic of multivariate testing is you're finding those pieces, those creative elements that are the things that people are resonating with — your audience is resonating with — and the things that are [00:03:00] gonna drive performance. Especially now that, you know, targeting is diluted. And so in terms of designing for that, Susan, how, how would someone design for that?
[00:03:11] Because that's very different thinking, right? Mixing and matching things in every possible combination so that you can figure out exactly what's working, you're pairing all variables together. So that nothing is kind of left unaccounted for. Like how would someone design for that?
[00:03:27] Susan: So, you know, traditionally when you'd get an ad, it's just like this one file, you know, and it's like, this is the concept, all these pieces work together.
[00:03:34] And it's viewed as this one holistic thing that you run. When you start thinking about it from a multivariate perspective, that's when it becomes more thinking about it as a group of elements that are working together, as opposed to one thing that just happens to have these elements in it. So what I mean by that is
[00:03:54] you might look at an ad today and say, okay, that's what, that's an ad. That's what it looks like. But if you look at it with a [00:04:00] modular eye, you're like, okay, so they have a headline here. There's a background behind that. There's a model there and there's like a gradient on the background. So you start thinking about it in terms of almost like a template where,
[00:04:12] I always say it's kind of like envisioning Swiss cheese. It's like you have a wireframe and you're just plugging the holes with different versions of roughly the same thing. So it may be, you know, you have six different models, so you might create one layout and the only difference between them is gonna be, you know, those six different models.
[00:04:31] And so traditionally, if you wanted to do this, it was ridiculously manual. Right? So that was why Marpipe was created in the first place. But it was like, you'd have to make six different versions of the ad. You'd need to set it up to make sure that Facebook didn't auto optimize. So you'd have to manually create all that.
[00:04:48] And then just, you know, it was a lot of production work, because you're making six different versions, you're setting up six different ad sets, you're assigning them all a budget. So every time you wanted to test a different element, that's how you would have to set it up.
[00:04:59] So [00:05:00] it'd have to be like, okay, everything's identical, make all the versions with the one thing. Okay. Now we've tested that. So we know that this model wins. Keep that model. Now let's test the headline. So to get multivariate, it was still somewhat an iterative process. It was still kind of sequential testing.
[00:05:16] You could only really multivariate test one thing at a time. But when you're thinking about ads, to get that kind of information, that's where you have to start thinking about it as far as, like, it's not just the model you're testing. It's each thing that's in this wireframe of what your ad's gonna look like,
[00:05:31] those are all testable things. So when you design, you have to keep that in mind and keep it somewhat consistent. So it can't be like, okay, here's my wireframe. And then when you go to create it, you're like, oh, wow, that picture is way small and when I put it in there, it doesn't look right in the template.
[00:05:49] So it requires a little bit more forethought, I guess. Where it's like before you could kind of tweak things to make it look great in that version, like, okay that version's done now. I wanna make something totally different. Now it's like, you [00:06:00] kind of have to think through how, like A, what am I testing?
[00:06:02] What do I wanna learn and how can I make a wireframe where we can make these elements interchangeable so that it's modular in nature. So it's a little bit of a different way of thinking. I think some people are like, that feels a lot less sexy but it's, I mean, that's how you do it efficiently.
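Susan's Swiss-cheese wireframe can be sketched in a few lines of Python. Everything here (the slot names, the asset filenames, the copy) is hypothetical; the point is just that a modular template plus a list of interchangeable assets per slot mechanically yields every testable variant:

```python
from itertools import product

# Hypothetical wireframe: each slot in the template has interchangeable assets.
wireframe = {
    "headline": ["Free shipping", "30-day returns", "New arrivals"],
    "image": ["model_a.jpg", "model_b.jpg"],
    "cta": ["Shop now", "Learn more"],
}

# Every combination of slot values is one testable ad variant.
slots = list(wireframe)
variants = [dict(zip(slots, combo)) for combo in product(*wireframe.values())]

print(len(variants))  # 3 headlines x 2 images x 2 CTAs = 12 variants
print(variants[0])    # {'headline': 'Free shipping', 'image': 'model_a.jpg', 'cta': 'Shop now'}
```

Even this small wireframe produces twelve variants, which is already past what most teams would happily build and traffic by hand, which is why Susan describes the old workflow as "ridiculously manual."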
[00:06:16] Jess: Yeah.
[00:06:17] And, like, having done this on both sides, right? Coming up with one ad concept as a whole unit, you come up with this great headline you're really excited about, and this image, and, you know, it looks really, really nice and your eye is drawn to all these places. And the idea is that modular can still allow you to do that, but you do have to think differently.
[00:06:38] So if you're going to test three different headlines, and three different images, they all have to work together. So I think that's where the forethought comes in, right? What are three different headlines that are gonna work with three different images and all feel cohesive? And so it does, it takes a little bit more planning. You kind of have to have, again, you go back to that hypothesis, you have to have that hypothesis of what you wanna learn [00:07:00] and you have to kind of think about, okay, let's test three different versions of saying the same thing three different ways, right?
[00:07:08] Yeah. Let's do three images that are kind of the same thing, but different subjects. Right? So different looking models or a different colorway of a product. So there is a little bit, again of like that forethought and planning and it all goes back to like that strategy and that hypothesis of making sure that you're learning what you wanna learn and not just testing random things and throwing random stuff.
[00:07:31] Susan: And I think the more you make modular creative, the easier that gets, you know? It's like you kind of refine. Like, you go into it knowing that you have to look for those things. I think people get thrown off the first couple times they do it 'cause they're like, oh, damn, I didn't even think about the fact that image doesn't look right with that. You know, it's like, right, you have to think so many more steps ahead. But after a while it kind of becomes second nature for a lot of people, where they go into it thinking of those things ahead of time and they just train their brains to go that way.
[00:07:57] Jess: Absolutely. Okay. So we know [00:08:00] what multivariate testing is. We're understanding modular design. And so now I think we can talk through, like, we have kind of four main points of, of why A/B testing alone is no longer enough. So this first point we have is: A/B testing can only tell you which ad wins. Multivariate testing can tell you why. And we kind of touched on this already, right? So it can tell you why, because we've broken it into elements.
[00:08:25] We know that, you know, the headline wins no matter what you pair it with. So that's something that you are gonna wanna continue to use and maybe test more versions of. Yeah. So, you know, I think the beauty of Marpipe is like, we always show you, like, here's your winning ad. But beyond that, here's your winning elements as well.
[00:08:45] Susan: So, you know what's interesting about that? I was just talking with engineering yesterday about how sometimes you'll find that your best-performing element isn't always in your winning ad. Yes. And sometimes it's something very small. Like it could just be the [00:09:00] background behind a headline, you know, if you put in an element or something. So it's not anything that's huge. But you're still like, okay, they could turn off the lowest-performing ad, but it has a best-performing asset in it.
[00:09:11] Jess: And we've seen too, we have some losing ads, let's say for CPA, that are winning ads for click-through rate, right? Yeah. And so you do have to pay attention. And losing elements, same thing: losing elements for conversion become winning elements for click-through, right? And so you do have to kind of pay attention to the different KPIs you're measuring to make sure that, you know, you're creating some sort of evergreen campaign from these test ads that has a little bit of both.
[00:09:42] And so I think that's a, that's a big thing to look at too, that again, A/B testing won't tell you.
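Jess's point about CPA losers being click-through winners can be illustrated with a toy ranking. The element names and numbers below are invented; the takeaway is that sorting the same element-level data by different KPIs can crown different winners:

```python
# Hypothetical per-element results from one multivariate test.
elements = [
    {"name": "headline_free_shipping",  "ctr": 0.045, "cpa": 31.00},
    {"name": "headline_new_arrivals",   "ctr": 0.021, "cpa": 18.50},
    {"name": "headline_30_day_returns", "ctr": 0.033, "cpa": 24.75},
]

best_for_clicks = max(elements, key=lambda e: e["ctr"])      # higher CTR is better
best_for_conversion = min(elements, key=lambda e: e["cpa"])  # lower CPA is better

print(best_for_clicks["name"])      # headline_free_shipping
print(best_for_conversion["name"])  # headline_new_arrivals
```

The two rankings disagree, which is exactly why an "evergreen" campaign built from test results should draw on winners from more than one KPI.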
[00:09:46] Susan: Yeah, exactly. I think tied to that too, one of our next points about A/B testing is that it really only delivers short-term performance. And this goes back to something we touched on in our last episode about [00:10:00] how there's not usually a lot of historical data on stuff, so you're just kind of recreating the wheel every time you test something. You're like, okay, I tested these two ads, this one's winning, I'm gonna ride or die with this one until it eventually stops working in Facebook, and then you're like, okay, let's just do it all over again. Right? It's like this endless cycle of these tests that you run, where you're just finding a winning ad and that eventually stops working.
[00:10:21] So you're not building anything for the future. And that's the piece that I love about multivariate: you're getting learnings that you can keep reapplying, so you're not re-guessing. And I feel like there's this notion we've come across sometimes when we talk to prospects or customers with creative teams, who are like, well, I won't be able to be as creative.
[00:10:41] My argument to that is I feel like you can actually get more creative because you're not wasting your time on stuff that isn't gonna work. So it's like, if you know these things work, you can keep those. Here's like all this other stuff that maybe you never thought to test, or you just didn't go that deeply into, because it was like you were more focused on other areas of the ads.
[00:10:57] So it forces you to be creative in different ways. [00:11:00] But that's the data that I feel like saves so much time because you're building up this knowledge about what your brand is when it comes to how people respond to it.
[00:11:09] Jess: I think it allows you, as a creative, to prove your ideas work or not. Yeah. Right? And we've talked about this before: creatives are very emotional people. We have, you know, a concept or a line or some sort of art direction that's like our baby. But we also wanna know if they worked or not.
[00:11:27] 'Cause if they don't work, then what's the point, right? So yeah, you know, again, A/B testing is gonna tell you that one time, that one ad that you created for that one campaign worked, but you're not gonna be able to pull that forward to the next one. And as a creative, that's uninspiring, right? That I don't know anything from the past to be able to use to my advantage to create better creative.
[00:11:51] Susan: It feels like also, it's just so much pressure too, because you're just restarting over and over and over and over. And like, I can't visually design for anything, [00:12:00] like I can write all day long, do not ask me to draw, design, layout. So to me, it's like my brain would look at that and be like, oh God, I gotta restart every single time? Like, yeah, my palms would be sweating, that I had to keep just redoing it over and over and never have any idea, like, great, is this gonna come back to bite me because it doesn't perform?
[00:12:19] Jess: Totally. Creatives always talk about like the fear of the blank page, right? Starting over every time on that new ad, that new idea. This takes some of that out of it. Like if you have that information, you're like, at least I have these Cliffs Notes to go on over here from our testing. And I can just be smarter about it.
[00:12:36] Susan: Yep. Totally.
[00:12:37] Jess: The next one I think is really interesting, and maybe something that doesn't get talked about a lot. It's kind of this just assumed thing that happens because it's the way it's always been done. But A/B testing actually keeps creatives and marketers in silos, like, working separately. Yeah. Multivariate is kind of a forcing function to bring them together. And so what [00:13:00] I mean by that is, with A/B testing, you know, the media team, the marketers, kind of get together. They create the brief, the creatives take the brief, they create this ad. We hand it back to the media team. We don't hear anything. We wait a long time. And then we're like, "Did that thing work?" Right? Yeah. And then we find out it either did or didn't. We're given optimizations to make based on whether it's performing or not.
[00:13:28] Right, right. And that doesn't really fly anymore. I think if you want to have a high-performing, you know, performance-minded brand marketing team, your creatives and your marketers have to be like firing on all cylinders together. They have to be talking to each other and I, I think that's what multivariate testing does.
[00:13:51] You can't run a multivariate test without some sort of hypothesis of what you wanna test. Yeah. Which means your creatives can't move forward without knowing what the strategy [00:14:00] is. Which means you really have to work together to figure out, like, what are the assets we're gonna test, and then work together to figure out, did that work or not, and how to move forward.
[00:14:09] And so I think that's, that's a really big point on the side of multivariate testing, for sure.
[00:14:15] Susan: I like that it creates a common language too. Like, one of the roles I've found myself in at multiple companies now... you know, sometimes you just know how to do something and you don't think it's weird, and someone's like, gosh, you're really good at blah, blah, blah.
[00:14:27] And you're like, I didn't know that was a skill. Apparently I've been told I do a good job of going back and forth between creatives and data people. So it's never been hard for me to work with creatives and kind of understand their challenges and be able to translate that for marketers, and vice versa. But I find that at so many companies I've worked at, these two groups just don't communicate well.
[00:14:52] So like you said, either they're waiting or they try and communicate and they just get pissed at each other because it's like all creatives hear is like [00:15:00] numbers, numbers, numbers, I don't know what that means. They are very visual people, usually. And then the creatives, you know, they're very passionate about what they create and they understand a lot about how, you know, humans interpret design, how to make things stand out, that marketers don't know, and they just speak two different languages and they frustrate each other
[00:15:18] so often. And the thing that I like about multivariate testing is I feel like it gives common ground. You're both talking about the same elements, and I think that's the hard part for marketers. They're like, "Well, this ad did well, so just do more of this." And the creative's like, "Do more of what? The color? The shape?" You know, it's not specific enough.
[00:15:37] So I feel like this really gives both sides a chance to speak each other's language, where it's like, here is the visual, here is the data. You're both seeing it isolated at the same time, so you're both looking at the same thing. And it's not this frustrating interpretation, because the data is what it is. And I feel like it breaks down that communication problem that just seems to exist with [00:16:00] every single marketing and creative team I've worked with. Makes me happy.
[00:16:05] I think the last one we had, and this ties back a little bit to what we said, but I think we should go deeper on it: A/B testing only measures a few variants of an ad.
[00:16:19] With multivariate, you're doing 30 versions, but the data is more consolidated. So instead of getting data on 30 different ads, those 30 ads might be made up of three different elements, or four different elements, or whatever it is. So you're actually getting better data because it's consolidated to just those elements.
[00:16:37] So instead of, like, okay, here's 30 ads and 20 of them did fine, not knowing anything, it's kind of like, here's 30 ads, here are the four elements, here's how each one of those did no matter what it was paired with. So it's usable data. And that also gets to the point where it's like, okay, you could actually run more ad versions at that point, because the goal is not to [00:17:00] find one variant that works. It's to feed each element into each ad, get the data from it, and get asset data instead of ad data. So instead of, you know, feeling like, okay, I can only run but so many at a time, and then I'll have a winner, and then I'll rotate more against the winner, it becomes less about that. And it becomes more about, we wanna scale quickly, and we can. If we have the money to spend, we can spend it on testing, because we know that our data's gonna be more consolidated, more efficient.
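The consolidation Susan describes, many ads collapsing into per-element numbers, can be sketched roughly like this. The ad rows and metrics are made up; each ad's impressions and clicks are credited to every element it contains, so each element's stats reflect its performance no matter what it was paired with:

```python
from collections import defaultdict

# Hypothetical test results: one row per ad variant, listing the elements it
# contained and its observed impressions/clicks.
results = [
    {"headline": "Free shipping", "image": "model_a.jpg", "impressions": 1000, "clicks": 42},
    {"headline": "Free shipping", "image": "model_b.jpg", "impressions": 1000, "clicks": 55},
    {"headline": "New arrivals",  "image": "model_a.jpg", "impressions": 1000, "clicks": 18},
    {"headline": "New arrivals",  "image": "model_b.jpg", "impressions": 1000, "clicks": 31},
]

# Consolidate ad-level numbers into element-level numbers: every ad that
# contains an element contributes to that element's totals.
totals = defaultdict(lambda: {"impressions": 0, "clicks": 0})
for row in results:
    for slot in ("headline", "image"):
        t = totals[(slot, row[slot])]
        t["impressions"] += row["impressions"]
        t["clicks"] += row["clicks"]

# CTR per element, regardless of what it was paired with.
ctr = {k: v["clicks"] / v["impressions"] for k, v in totals.items()}
best_image = max((k for k in ctr if k[0] == "image"), key=ctr.get)
print(best_image, round(ctr[best_image], 3))  # ('image', 'model_b.jpg') 0.043
```

Four ads collapse into four element-level rows here; with 30 ads built from a handful of elements, the same aggregation is what turns "20 of them did fine" into data a creative team can act on.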
[00:17:25] Jess: It's scalable for a team as well, no matter the size. And I think, you know, you're able to create humanly impossible numbers of creative versions.
[00:17:35] You're able to do humanly impossible data gathering. And I think that's why we've done A/B or split testing for so long: because it was literally all we could physically do.
[00:17:49] Well, this was fun.
[00:17:50] Susan: Yeah. Good times, man.
[00:17:52] Jess: In short, TL;DR: A/B testing, not enough. Multivariate testing. Try it.
[00:17:59] Susan: [00:18:00] There's the content summary from our content queen. That's great. Well, thank you everybody for joining us once again on Resting Ad Face.
[00:18:07] Jess: We'll be back. Bye.