
How your losing ads can help you design winners

Your low-performing ads contain clues that can help you level up future ad creative. You just have to know where to look.
Jess Cook

Unpopular opinion: your losing ad creative deserves a second look. It can help point you in the right direction if you know what to look for.

On the latest episode of Resting Ad Face, our VP of Performance Marketing, Susan Wenograd, and I share how to find actionable insights in your lowest-performing ads and use them to create even stronger winners.

Here are some highlights:

Most of the ads you test will underperform. And that’s OK.

Creative testing is a numbers game. The more ads you test, the more likely you are to find a winner. But, you also end up with a ton of ads that don’t surpass your typical benchmarks. 

That’s a-OK so long as you dive into what made them low-performers and use those learnings in future ad designs. Think of your testing budget as market research. You’re learning what works and what doesn’t, and making data-driven creative decisions to keep building winners.

Good engagement, high CTR, but low CVR? Set better expectations, then test again.

Certain ads are total duds and others deserve a second chance. How do you differentiate between the two? 

First, take a look at the engagement and click-through rates. If your audience isn’t even stopping to engage with an ad, odds are they’re not going to buy either. Time to move on from this creative.

If engagement and click-through are good, but conversion is low, you might want to take a look at your landing page experience. Does it match up with the ad experience? Is there something there (an unexpectedly high price, for example) that might cause someone to bounce instantly? Adding that information to the ad creative and testing again will help you qualify your buyers and increase CVR.
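If it helps to see that triage spelled out, here is a minimal sketch in Python. The metric names and benchmark values are hypothetical placeholders; swap in your own account averages.

```python
from dataclasses import dataclass

@dataclass
class AdMetrics:
    name: str
    engagement_rate: float  # e.g. 0.03 = 3%
    ctr: float              # click-through rate
    cvr: float              # conversion rate

# Hypothetical benchmarks; replace with your own historical averages.
ENGAGEMENT_BENCHMARK = 0.03
CTR_BENCHMARK = 0.01
CVR_BENCHMARK = 0.02

def triage(ad: AdMetrics) -> str:
    """Rough decision rule: retire true duds, retest high-CTR / low-CVR ads
    after fixing the landing-page mismatch or adding qualifying info to the ad."""
    if ad.engagement_rate < ENGAGEMENT_BENCHMARK and ad.ctr < CTR_BENCHMARK:
        return "move on: the audience isn't even stopping for this creative"
    if ad.ctr >= CTR_BENCHMARK and ad.cvr < CVR_BENCHMARK:
        return "retest: check the landing page and qualify buyers in the ad (price, etc.)"
    return "keep iterating: performing at or above benchmark"

print(triage(AdMetrics("example_ad", engagement_rate=0.05, ctr=0.014, cvr=0.004)))
```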

For more, subscribe to Resting Ad Face on YouTube or your favorite podcast platform.

Transcript

[00:00:05] Jess: Hey everybody. Welcome to the 10th episode of Resting Ad Face. I am Jess Cook, I'm VP of Brand and Content at Marpipe. 

[00:00:18] Susan: And I am Susan Wenograd. I am VP of Performance Marketing at Marpipe, so Jess and I are one. We work together every single day.

[00:00:27] Jess: That's right. And we always kind of rant and rave about, like, you know, ad creative: what's in, what's out. And today we're gonna talk about losers. Losers. 

[00:00:43] Susan: Loser. 

[00:00:45] Jess: And specifically losing ads and what you can learn from your losing ads to create winning ads.

[00:00:54] So there's a few, there's a few things that you can look at. So I'm [00:01:00] gonna, I'm gonna hand it off to you, Susan. What is like the top thing that you look at? 

[00:01:05] Susan: So, yeah, I mean, I think the interesting thing to note too is, like, everyone talks about winning creative, but most of what you run winds up being losing creative, which, like, that's not the sexy thing to say, but the fact is you churn through a lot of versions before you find ones that actually win.

[00:01:22] So I love this topic because, like, a lot of times I've told clients in the past, you need to treat your media money as market research too, to figure out and learn what works and what doesn't. Yes, like, you need to get sales, of course. That's a given. But just because it's a loser doesn't mean that you didn't learn anything from it.

[00:01:44] So make sure that you're pulling those learnings out, 'cause otherwise you are just gonna keep spending money and you're never gonna get ahead. So I think the first thing that, as a marketer, we tend to look at, and that's important from a strategic perspective, is: are there [00:02:00] things in common with the worst performers?

[00:02:03] So whether it's something obvious, like if you pull the worst performers and you say, oh look, they all have people in them, or, oh look, they all look kind of stock-imagery-ish versus the UGC stuff. I tend to be a 30,000-foot-viewpoint kind of person first, because otherwise you're just gonna get lost in all these details and then in the end you'll probably be like, I still don't feel like I learned anything. So I usually try and figure out: are there assets that are similar in nature, or, you know, that we tested knowing they were similar? A lot of times, you know, when you properly structure a test, you know what you're testing. So sometimes it'll be very obvious what worked and what didn't from that perspective, because your test may have been people versus not people.

[00:02:46] It's like, okay, not people did better, or whatever it is. So if you're testing in that way, going into it with a hypothesis, it's a lot easier to figure out what those are. But if you aren't doing that, a lot of times you can still figure it out if you just go [00:03:00] into Facebook ads and say, okay, let's just filter by our lowest performers for the last 90 days.

[00:03:05] Let's export them and just look at, you know, is it the same copy you see over and over? Is it, you know, one certain visual that, no matter where it ran, just doesn't seem to do well? So you can still do that somewhat manually to figure it out. But that's usually the first thing I'll look at: is there something that, no matter what we did with it, just didn't work?
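(For a rough sense of what that manual pass could look like, here is a minimal Python sketch. The file name, column names, and the bottom-quartile cutoff are all hypothetical placeholders, not a specific ad-platform export format.)

```python
import pandas as pd

# Hypothetical export of the last 90 days of ads; column names are placeholders.
ads = pd.read_csv("last_90_days_ads.csv")

# "Lowest performers" here = the bottom quartile by CTR; use whichever KPI matters to you.
worst = ads[ads["ctr"] <= ads["ctr"].quantile(0.25)]

# Look for attributes that keep showing up among the losers:
# the same copy over and over, or the same visual tag (people vs. no people, stock vs. UGC).
for attribute in ["headline", "image_tag"]:
    print(f"\nMost common {attribute} among the worst performers:")
    print(worst[attribute].value_counts().head())
```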

[00:03:28] Jess: We have a great example of this. We have a really hot shoe brand that uses Marpipe to test their ad creative. And something they saw was like all of their bottom performers in this one specific test talked about the fact that the shoe was "washable." That was how it was stated. 

[00:03:44] Susan: Mm-hmm. 

[00:03:45] Jess: And so, you know, if you look at those, it's like, well, okay, then we shouldn't talk about how it's washable.

[00:03:52] Let's move away from that language. But then they compared it to like their top performers and the top headline [00:04:00] was that it was "easy to clean."

[00:04:02] Susan: Ah, yeah. 

[00:04:02] Jess: Right. So it's just a really subtle difference in the language between a top performer and a bottom performer. So, you know, definitely compare and contrast your losing ads, but also look at them through the lens of, like, what did win.

[00:04:17] Susan: Mm-hmm. 

[00:04:19] Jess: Because there might just be, like, a small nuance that people really gravitate toward: the way that it was said, or, you know, the crop of the image.

[00:04:27] But yeah, definitely look at, you know, what won versus what didn't and try to find a comparative point there as well. 

[00:04:35] Susan: Yeah, and I think that's a big thing too, especially in B2B, because a lot of times there's so many different ways you can describe what something does, especially for problem-solving products where, you know, the problems they solve can be talked about...

[00:04:51] It's the same problem, but it can be talked about a bunch of different ways. So that's the other thing, to your point, that might need to be your test, where if it's [00:05:00] washable versus easy to clean, like what are all the ways they could possibly say that? And let's run all of those and see which one is the loser so that we know for sure that it's not that that feature isn't desirable, it's that we have to refer to it a certain way. 
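(As a quick illustration of that idea, here is a minimal Python sketch that crosses several hypothetical phrasings of one feature with a couple of fixed visuals, so a losing variant points at the wording rather than the feature itself. All of the values are made up.)

```python
from itertools import product

# Hypothetical ways of saying the same feature, plus two fixed visuals.
phrasings = ["Washable", "Easy to clean", "Machine washable", "Wipes clean in seconds"]
visuals = ["lifestyle_shot.jpg", "product_on_white.jpg"]

# Every phrasing paired with every visual; the visuals stay constant across phrasings,
# so differences in performance trace back to the language.
variants = [{"headline": h, "image": img} for h, img in product(phrasings, visuals)]

for variant in variants:
    print(variant)
```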

[00:05:12] Jess: That's such a great point. I love that. 

[00:05:13] Susan: I think that also happens a lot, like I said, in B2B: a lot of times in those situations you have product marketers versus the people that are running the media.

[00:05:24] So what the product people can sometimes be used to promoting are things that, like, you know, larger businesses, or, you know, retailers that do large-scale wholesale buying, care about. They're a little more focused on the features, and they tend to, you know, do the advertising later because they're kind of a middleman.

[00:05:43] Right? So that's the other thing too, is sometimes you'll run into that with, you know, product marketing: they know the features really well and they know the benefits of them, but not necessarily in a language that a consumer would immediately grasp. So that's kind of the other piece too, where, you know, the marketer can play a part. [00:06:00] 

[00:06:00] Jess: I think that's the nice thing too of, like, testing things in a similar category, right? Like, let's test all of the ways we can say this one thing, or let's test similar images that kind of fit within a theme together.

[00:06:14] Because then I can go back and see, like, was it that whole theme that doesn't work? Or was it just, like, this one version of the way that it was said, right? Like the shoe example. Because if you're testing things that are just, like, too, you know, wildly different from each other, now we're almost back to A/B testing and we can't really figure out why.

[00:06:36] You know, we might know that, like, okay, an apple beat out, like, a picture of a guy holding an apple, but we don't really know why. Like, it could be the setting behind the guy. Yeah. You know, there's just too many variables again. So, like, if you can really be smart about tagging and categorizing your assets, you can learn so much more than if you don't.

[00:06:56] Susan: Yeah, you save yourself a lot of time by grouping it into those [00:07:00] kinds of buckets. Yeah. And I think, like I just launched a test where we tested having press mentions in the body copy, but we decided to start testing it in the image copy. And it was sort of the same thing where we wanted to pull three that talked about completely different things because we didn't wanna just run one with different visuals and then if it doesn't do well, say, oh, well, testimonials don't work.

[00:07:26] Right. So we kept the visuals very consistent, but we changed up what they were talking about, and then also who the publications probably appeal to. So it's like, one was a fashion magazine, one was more intellectual. So also just kind of looking at: does where the press comes from matter, along with what they call out and what they say?

[00:07:45] So, same idea: kind of going deep on the one thing instead of trying to, you know, test four different things and draw conclusions from it. 

[00:07:52] Jess: Yeah, for sure. That's super smart. What did you find out? Anything of note? 

[00:07:56] Susan: I don't know. It just launched, so I don't know yet. 

[00:07:58] Jess: To be determined. 

[00:07:59] Susan: [00:08:00] Yeah, just launched yesterday, so it's gonna run for a week, so we'll see what happens.

[00:08:03] I'm, I'm eager to see what happens though. The, the creative turned out really good, so we'll see. 

[00:08:07] Jess: That's great. I think another thing to look at, so this would be like number two, right, is maybe the ad was a low performer against one KPI.

[00:08:20] Susan: Mm-hmm. 

[00:08:21] Jess: But if you look at it against another KPI, it did really, really well.

[00:08:24] And a lot of these KPIs are, like, inverses of each other, right? Yeah. Like you might run an ad that does really, really well for clicks but not for conversions, right? Yep. Or really does well for, like, engagement, but not for CPA. And a good campaign has a bit of both, right?

[00:08:43] Like you're trying to engage people in different ways. Yep. Some work better for different audiences, so, you know, you can see really easily that like, oh man, this one ad just didn't convert. But like, oh, hey, look at the click through rate. It's actually pretty decent. [00:09:00] Like it's above average for what we're used to.

[00:09:01] Right. So, like, maybe I need to use that creative just to get someone to a landing page to explain more about the brand or what we do, pull them in a bit more. So you can use that to your advantage and kind of figure out, okay, if this ad is not a winner, if it's a loser in the CPA category, but it's doing really well for clicks or engagement, like, how can I use that to my advantage?

[00:09:26] How can I exploit... no, that sounds bad. How could I capitalize on that? 

[00:09:32] Susan: There you go. Much more positive connotation. Yes. Yeah. And I think some of that too is there's, you know, two to three major things that you can control, and a lot of times when you have good engagement and high click through rate, but poor conversion, sometimes it's a matter of expectation setting on the front end.

[00:09:48] So maybe they see this product, it looks great, and they land and they're like, oh, it's $500. Wow. You know? If you can figure out where the disconnect is... Sometimes it's not the media itself, it's that something is missing that is not [00:10:00] clear to the user until they land, and it makes it a no-go for them, right?

[00:10:05] It automatically excludes them. So that's kind of the other thing too: if something is working really well with a different KPI, but the one you ultimately want isn't working, look for message congruency between the two and figure out how you can incorporate why they might be abandoning into the front end. And again, that might make your click-through rate go down and everything, but you obviously have something there that people are stopping and looking at and engaging with.

[00:10:32] So if you can just gatekeep it a little better to make it clear who it's for, you still have the winner that you've learned something from, but now you're learning something about how to manage the expectation of what they're gonna get once they click on it. 

[00:10:43] Jess: That's so smart. I love that. All right. We have a third and final thing to look at.

[00:10:51] And it's less, less of a thing to look at and more of like a next step. 

[00:10:55] Susan: Mm-hmm. 

[00:10:55] Jess: And that is: just because they lost once doesn't mean they're going to [00:11:00] lose some other time. Right? And so, save some of these losing elements and ads for another day. Try them when you have a sale. Try them when you have a new product.

[00:11:16] Right? And like, let's see if they can become a winner with a different scenario. 

[00:11:23] Susan: And sometimes that's also with your audience targeting too, where there have been a lot of times where products have launched thinking that they're for one group of people and then they're like, oh, like moms are like buying this for something completely not what it was intended for, but they have this huge following for this one thing.

[00:11:41] So it's like over time you might have additional audiences that you're testing things to and that might resonate better for them as well. So as your product evolves and grows and you start getting more users or different users than what you thought you were gonna get,

[00:11:56] you have that to fall back on and I think A, retry what [00:12:00] lost, but B, maybe you just need to go deeper and the concept itself isn't wrong. It's that you have to test a couple different versions of that idea because it's just in the way you're saying it or it's just in the way you're presenting it.

[00:12:13] Jess: How would you differentiate between like, oh, that's something I should test again and that's something I should just throw out? 

[00:12:21] Susan: I think if all the KPIs look bad, you know, kind of to our point, where it's like, you know, the ratio of people that stopped to look at it, if, like, no one watched past the initial three-second thing.

[00:12:34] Like, you know, if they're not even stopping to engage with it, it's kind of a no-go. Right? Because if you're not even getting over that barrier, they're not gonna buy. It's not that it necessarily has to have the best engagement or click-through, but there are some where you just launch and you're like, wow, this is way below average.

[00:12:51] You're not even connecting with the user. So in those, I'm kinda like, okay, this just isn't working at all. And usually it's gonna be the visual, 'cause that's what's gonna stop [00:13:00] them. So you're like, this concept is just not working. They're not even stopping to look at it.

[00:13:05] So that's usually just like, okay, that's a no-go, let's just backseat that for now, because they're not even reading it, they're not even watching it long enough to receive the copy, to think that that's what's wrong with it. So that's usually if it's just, like, low across the board, you know, something's just not converting. Normally what I'll do is then look at the softer metrics, like, you know, is it something where the three-second views were still very high on something like a video, but they just didn't click?

[00:13:33] So you can kind of do a little bit of, like, an autopsy: you know, here are the other things that they could do with this. Did they do any of them or not? And if they didn't do any of them, it's like, okay, this is just flat-out not connecting with these people. Let's just backseat that for now. And then if they have stuff that was working, then it's kind of like we said before, I usually treat that as, okay, this is a string we could pull a little bit and see

[00:13:55] if we're just not managing expectations well. Is it creative we just sort of [00:14:00] decided to launch and we really didn't take a look at the landing page, and now that we look at it, they don't really feel the same? You know, it's like there's some things that kind of give you indicators that there might be something there. But then

[00:14:08] sometimes you just run it and you're like, this is just not a thing for them. They want nothing to do with it. Those are the ones where I'm like, let's just move on. I mean, typically we have so many things we could test that it's like you don't need to dwell on the ones that just have no movement whatsoever.

[00:14:21] Jess: That's great. That's a really nice, like clean way to differentiate. Yeah. I also just love the idea of an ad autopsy. There's something about that. 

[00:14:29] All right, folks. So, TL;DR: love your losers. They're not so bad. You might be able to find some things in there if you're just willing to dig a little bit. 

[00:14:38] Susan: So learn from the losers. 

[00:14:40] Jess: That's right. That's okay. Everyone makes mistakes. Big Bird taught us that. And we can find a silver lining in any loser. Yes. Well, thanks for joining us. We'll be back next time with another episode. More ranting, more [00:15:00] raving. Every two weeks. We'll be here. 

[00:15:04] Susan: See you next time.

[00:15:06] Jess: Bye.

[00:15:07] Susan: Bye.
