Episode 335 | Lessons Learned from Unbounce’s 5 Split Tests Across 2.3M Exit Overlay Visitors

Show Notes

In this episode of Startups For The Rest Of Us, Rob and Mike talk through an article by Unbounce about split testing their exit overlays. The guys give you a full walk through and their opinions on the findings of the article.

Items mentioned in this episode:

Transcript

Rob: In this episode of Startups for the Rest of Us, Mike and I talk about lessons learned from Unbounce’s five split tests across 2.3 million exit overlay visitors.

Welcome to ‘Startups for the Rest of Us,’ the podcast that helps developers, designers, and entrepreneurs be awesome at launching software products whether you’ve built your first product or you’re just thinking about it. I’m Rob.

Mike: And I’m Mike.

Rob: And we’re here to share our experiences to help you avoid the same mistakes we’ve made. What’s the word this week, sir?

Mike: Well, we are in sunny Las Vegas as people are listening to this, and right now outside it is raining.

Rob: Is it supposed to rain on Tuesday?

Mike: No, I’m still in New England. We recorded a week ago early. Come on, man! Get with the program!

Rob: We’re time warping here. Yeah, so we record just a few days early, and I take off for Vegas in a couple days, and it’s supposed to be what? 70s and 80s? Pretty sunny there?

Mike: Yeah, I think I saw like 84 or 85, something like that last week when I looked.

Rob: It’d be nice. I’m looking forward to getting out and seeing the folks, you know, seeing the speakers, meeting the new speakers, and hanging out with people we haven’t seen for six to twelve months. It’s always exciting to kind of gear up for that.

Mike: Yeah, it’ll definitely be interesting this time around just trying to keep up and pace ourselves for the two conferences. I know in years past every single year without fail, my voice starts to fail at about the second or third day. So I’m going to have to be a little careful about that.

Rob: I agree. The good thing is we’re not emceeing the Starter Edition, so we should have less talking. Although, you’re doing a talk there, so you’re really going to need to keep your voice going.

Mike: Yeah, my talk is the last slot on the last day so four or five days of talking and hopefully my voice will still stay with it.

Rob: Indeed. On my end, not too much new this week. Wanted to look at a couple recent iTunes reviews, and we’re up to 517 worldwide iTunes reviews. This one- I love the subject- “Best podcast ever” from John Turner, and he says “best podcast for anyone looking to start selling software or a SaaS product.” And we have Ray223 who says, “The oldest and the best. Thanks Rob and Mike for all the insights. Keep up the great work.” We have another from [Ludwig?] from Denmark, and he says, “If you’re considering having a startup, listen in here.” Really appreciate these reviews. If you haven’t left us an iTunes review, it goes a long way towards helping us to reach new audiences, to grow the show, and frankly, just keeps us motivated and keeps us from crying ourselves to sleep at night. I know Mike, you often pull up the reviews and read them to feel good about yourself, right? Oh, is it just me?

Mike: I think it’s just you, yeah.

Rob: Cool, so let’s get into the topic this week. We’re going to be talking through an article on Unbounce.com’s blog. The title of the article is “Lessons Learned from 2,345,864 Exit Overlay Visitors.” What they did is they ran split tests and some observational tests, which we’ll talk about in a second. They ran five of them over the course of two years, across about 2.3-2.4 million visitors in that time. And they just outlined the tests they ran, the results you would expect, and what actually happened. I like how detailed this post is, and I like that it challenges some of the assumptions we all make- when I describe each split test you’ll think, “I know which one is going to win.” Sometimes that’s correct, and other times it’s not. So it’s kind of a fun walkthrough from people who are doing a lot of pretty sophisticated testing.

Mike: Do you think that that’s an exact number or just kind of a ballpark estimate of the 2,345,864?

Rob: I think they went into Google Analytics, you know, and just pulled up the number from the date of the first split test until today or whatever so I imagine it’s pretty exact.

Mike: So one of the things we wanted to mention before we kind of dive into this particular episode is the fact that there is a certain kind of minimum threshold you have to kind of be at in order for split testing like this to work. And I think Rob, you said the minimum you would even consider doing these kinds of split tests is when you get to about 30,000 uniques a month, right?

Rob: Yeah. You and I were talking before the show, and I ran some loose math, and let’s say these guys have 2.4 million unique visitors to their blog over the course of this 2 year thing, which is what they’re positing. So back of the napkin, 2.4 million is about 100,000 visitors a month, and at that rate, think about if you’re using exit overlay you’re around 1% to start with in terms of converting people to subscribers. So 1% of 100,000 is about 1,000 new subscribers a month. If you split test you could very likely get to 2% and that’s 1,000 to 2,000 and that’s a big difference. That moves the needle. As you start lowering the number of visitors you’re getting, let’s think about 50,000. That number’s going to move from about 500 to 1,000 new emails a month, and that’s still for me, still moves the needle. But as you start dropping down- 30,000? It’s like well, I’m going from 300 to 600. That’s cool. That’s about as low as I would go. If I have 10,000 and I go from 100 to 200 subscribers a month, I actually think at that point you should be focusing more time on generating traffic rather than split testing the traffic you have. So my mental math is somewhere in the 30-40,000 uniques a month if you’re going to start running experiments like this to have it actually move the needle.
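Rob's back-of-the-napkin math can be sketched in a few lines of code. The traffic tiers and the 1%-to-2% conversion jump are the illustrative figures from the discussion, not Unbounce's actual numbers:

```python
# Rough monthly payoff of split testing an email opt-in, using the
# illustrative numbers from the discussion (not Unbounce's actual data).

def monthly_subscribers(uniques_per_month: int, conversion_rate: float) -> int:
    """New email subscribers per month at a given opt-in conversion rate."""
    return round(uniques_per_month * conversion_rate)

for uniques in (100_000, 50_000, 30_000, 10_000):
    baseline = monthly_subscribers(uniques, 0.01)   # ~1% before testing
    optimized = monthly_subscribers(uniques, 0.02)  # ~2% after a winning test
    print(f"{uniques:>7,} uniques/mo: {baseline:,} -> {optimized:,} subscribers")
```

Below roughly 30,000 uniques, the absolute gain (100 extra subscribers a month at 10,000 uniques) is small enough that the time is arguably better spent generating traffic.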

Mike: So with that in mind, let’s start digging into these experiments. Let’s see what they found out.

Rob: Yeah. So they say back in 2015 they launched their first-ever exit overlay. It’s an exit intent popup that only appeared when you tried to leave the site, and the idea was they just wanted to collect more email addresses. The very first one said, “Conversion optimization tips you’ll actually use. Subscribe to our blog.” There’s no lead magnet, there’s no mini course, it’s just subscribe to the blog. And they were at 1.25% from the start. We should probably take a quick aside here: some people hate exit intent popups; some people love them for how well they convert. I think Unbounce takes a pretty good approach to it. They actually write custom code to ensure that you only see one, even if they have a bunch running in different categories, as you’ll hear later. They were very specific about not hosing the user experience, because that’s the thing that sucks with exit overlays- if they keep popping up while you’re trying to read or trying to go to another tab to do something. So this is a place where you can listen to these ideas, and you don’t have to do this with exit overlays if you personally don’t want to. This could just as well be a little Drip widget in your lower right, this could be a lead box, this could be an embedded form on your website- all the stuff they do could still be applied to that, but they just happen to do it using exit intent.

Mike: Yeah, that kind of applies across the board to a lot of different marketing efforts. If it’s something that you’re not particularly enthralled with when you come across it- if it’s something that you just say, “Aw, I can’t stand these things,” and you just keep closing them and they keep coming back- that really turns you off from trying them to begin with. But there’s also a difference between your experience on your website and the visitor who’s coming to your website. So you have to keep in mind what your end goal with these is and whether or not it even makes sense. Obviously you’re not the only one browsing your site, and at the same time you are not necessarily representative of the person browsing it. They may not care nearly as much as you do about it.

Rob: So we’ll kick into the first experiment here in a second. In the post they say there are a few testing conditions they want to lay out. Number one: all overlays were triggered on exit, which means they were launched only when abandoning visitors were detected. Number two: for these first three experiments they compared sequential periods to measure results rather than running true A-B tests. They do say this is less scientific, more observational, because you run one for thirty days, then the other for thirty days, and compare the two time periods. They say that when comparing these sequential periods, testing conditions were isolated by excluding new blog posts from showing any of the overlays. Conversion is defined as either completing the form, meaning they enter their email, or- later on they try something where it’s just a click, and we’ll get into that later. And lastly, it ran from January 2015 to November 2016, so almost two years. So that’s setting the stage. The first experiment they ran was to go from their generic signup, which I talked about earlier- it just said “Conversion optimization tips you’ll actually use” with a “Subscribe to the Unbounce blog” button- to actually giving away content. The hypothesis is: if we give away an e-book, it’s going to generate more opt-ins. Both Mike and I, and most listeners, would agree that you’re going to increase your conversion rate if you do that. So they gave away an e-book called “23 Principles for Visually Designing More Persuasive Landing Pages.” And what happened, Mike?

Mike: Well, it looks to me like the conversion rate doubled from 1.27% to about 2.65%. Now the interesting thing here- you kind of mentioned this upfront- is that this wasn’t a true split test. The variant ran for about half as long as the original. They ran the initial one for 170 days, and then the test offering this e-book ran for 96 days.

Rob: The variant or the challenger, if you will.

Mike: Yep. And the conversion went from 1.27% to 2.65% so almost half the time and twice the conversion rate which kind of translates to the fact that they got about the same number of conversions in half the time. That’s a phenomenal increase to be honest.
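A quick sanity check on that "same conversions in half the time" framing. The steady daily traffic figure here is a made-up assumption (the post doesn't report per-period traffic); only the durations and conversion rates come from the article:

```python
# Hypothetical steady traffic of ~3,000 overlay views per day (assumed figure);
# the durations (170 vs 96 days) and rates (1.27% vs 2.65%) are from the post.
daily_views = 3_000

control = 170 * daily_views * 0.0127  # 170 days at 1.27% conversion
variant = 96 * daily_views * 0.0265   # 96 days at 2.65% conversion

# The variant collects at least as many conversions in roughly half the
# time, which is what "twice the rate in half the duration" implies.
print(round(control), round(variant))
```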

Rob: It’s really cool. Yeah, this is good. I feel like this supports a lot of the stuff we’ve talked about over the years, and it’s what you would naturally think, so that’s good. Their observation on this is that offering tangible resources versus non-specific promises positively impacts conversion rates.

So experiment two gets a little more complicated. They went from their single field- meaning just getting an email address- and think about it: at this point they’re at a 2.65% conversion rate, they’re giving away an e-book, and they’re asking for an email. They said that data people always spoil the party, and someone internal to Unbounce wanted to get more information, not just the email. So they added three other form fields- instead of just email, they asked for a first name, do you work for an agency, and your company size- and they knew this was going to tank conversion rates. It’s exactly what I would have intuited as well. Every form field you add, going from one to two to three and up, is going to cause a decrease in conversion rates in almost all cases. And this once again confirms that. Take us through the results.

Mike: So the results basically went in the opposite direction. Just to point this out, as Rob said at the very beginning: these were all run against different people, so it’s not like somebody saw one and then ended up seeing another one. It’s really different segments of their audience that were seeing these, and again, the time periods they used were different. And it looks like they did not want to run this one for very long because the results were so bad- the conversion rate went from 2.65% down to 1.25%.

Rob: D’oh! Right back to where they started. They’re still giving away that resource but just adding the three fields just tanked them.

Mike: Yeah, so it’s pretty obvious- even just looking at it, you probably don’t want to fill out that form. If it’s an exit intent overlay and the visitor is already inclined to leave, making it more difficult for them to send you their information is just not going to work out in your favor.

Rob: I like the quote from the post author: “I knew this was for a good reason. Building our data warehouse is important. Still, a small part of me died that day.” And then it got worse. Because they added all those fields, they had to expand the size of the overlay- they made it bigger so it could hold the form fields- and then they realized they had made it so big that it was too large for many people’s browser windows. So the overlay only fired on two out of every three visits. Not only did they have a lower conversion rate, but they cut their potential audience by a third. They then redesigned the overlay to make it smaller and fit, but internally they decided that even though it was a lower conversion rate, they needed that data. So now their new baseline is back to 1.25%, which is kind of hard as a conversion rate optimizer- to make that much of a gain, to double, and then have to go back.

Mike: It depends on your situation, because this is one of those places where you really need to make a judgment call about whether you need that data or not, and whether you need it now. I think in this case, with the types of questions they’re asking, they could probably get away with not asking certain things. They went from asking for just an email to first name, email, do you work for an agency, and company size. And those individual things really let someone on a marketing team narrow their targeting and send highly specific content to those people. You’re not going to send something about freelancing to somebody who’s running an agency with thirty-five people. So it allows you to target the content you’re sending them and make sure it’s speaking their language and talking directly to their pain points, because they’re going to have different pains than a freelancer. If it’s important to have that information now, awesome. But I would also question whether you need to have it on that page. One of the things I’ve seen is that you can ask for a very limited amount of data, and then on a next page ask for more, because they’ve already bought into the premise at that point. Let’s say they give you an email address, and then it flips to another page and says, “Hey, thanks for that. Can you tell us a little more so we can be more specific about the things we send you?” At that point you can ask them, and if they don’t give it to you, you can follow up and try to get it later- there are different ways down the road to get it. But the question is: do you need that information right now?

Rob: Yeah, and that’s called progressive profiling, if people are curious. There’s a really good tool that actually has a tight integration with Drip that does exactly that. You can set up all these fields and, progressively over time- the first time you visit it’ll ask for your email, and the next time, since it knows it already has your email, instead of doing nothing it’ll ask for your first name or something else. It’s pretty sophisticated stuff. That tool’s called ConvertFlow, if you’re interested.

Alright, onto the third experiment. I like this one. It starts off with the quote, “It seemed like such a good idea at the time…” The third experiment is challenging the one overlay they have now- the single overlay giving away the e-book with the four form fields- versus ten different overlays. The ten different overlays are highly targeted, or hyper relevant, to each of their ten blog categories. You can imagine on a blog you might have categories, and one of them is split testing, one is copywriting, one is email, one is pay-per-click, one is social. So they created a separate overlay for each of those with a hyper relevant headline and pitch, all that stuff. They did this, they said it took maybe three hours, and then they have a table, and the results are kind of crazy. The email category came in at .45%. Remember, their baseline is currently at 1.25%. So email is at .45%, and their lead gen and content marketing categories are at .85%- those are the bottom-end ones. Then their copywriting one is at 3.22%, and their landing pages and mobile optimization categories are at 2.46%. So there’s a huge range now, but having that more granular data is much more interesting, because now that you know you have some big wins and some big losers, you can just focus on the losers rather than trying to optimize across the board. They only ran it for ten days. They said their combined conversion rate was 1.36%, so slightly higher than the 1.25%. They said it eventually crept up to 1.42% after an additional quarter million views, but it didn’t do anywhere near as much as they had hoped.
So to take a step back: if you look at their copywriting category, the headline was “How to Write Killer Landing Page Copy,” their conversion rate optimization category was “How to Get Started with A-B Testing,” and the next one is UX, where the headline was “The Seven Deadly Sins of Landing Page Usability.” I don’t know that these were all e-books, and they don’t say whether each of these was as effective as that 23 Principles e-book they originally gave away, but that was the idea here. That’s how they were trying to target each specific category.

Mike: What I find interesting about this is that they decided to just go back to the baseline and use the average of all of these, because it seems like there are certain cases where the targeted headlines work a lot better. For example, the copywriting, campaign strategy, and mobile optimization categories have a much higher conversion rate. But then if you look at the other ones, like the email category or the social category, the conversion rate on those is less than 1%. So it makes sense to at the very least go back to the baseline, but I kind of question why they didn’t keep the specific overlays in the categories that were overperforming the baseline by 2-3x.

Rob: Yeah, I agree. The average was a little weird, and I think they’re doing that just to give you a general idea. Because now that you have this data, what would you expect the next step to be? It’s to keep the ones that are at 2% and above and only try to optimize the low-performing categories, and that’s exactly what they did. They were going to test new offers across the five categories that had low conversion rates and enough traffic volume to make the test worthwhile. Because some of these, now that you’re getting more granular, might only get 1,000 views in a month, so it’s not really worth diving in. What they did then was resource versus resource- they were basically trying a different headline to give away an e-book.

Mike: One thing that comes to mind, now that you mention that, and it kind of leads into the next experiment they did: if they start changing that and running different types of offers in some categories versus others, then it changes a lot of the math and it becomes a lot harder to test that stuff.

Rob: Oh totally. Yep. And the result of this one- they don’t really go into much detail. They just say they saw a slight improvement in one of them, in a couple of them they saw a dropoff, but in lead gen and content marketing there was a dramatic uptick, and the results were statistically significant. So they’re making progress. They had five low-performing categories, and they basically knocked two of them out, so really they just have three low-performing ones left.
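The post says that lead gen/content marketing uptick was statistically significant. A common way to check that kind of claim for two conversion rates is a two-proportion z-test; here's a minimal sketch with made-up visitor and conversion counts, since the article doesn't publish the underlying numbers:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate different from A's?
    Returns (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 10,000 views each, 0.85% vs 1.40% conversion.
z, p = two_proportion_z(85, 10_000, 140, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # |z| > 1.96 means significant at p < 0.05
```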

And so for their fifth and final experiment as of this writing, they wanted to test lead gen overlays- which is what they’ve had the whole time, asking for email, first name, etc.- versus click-through overlays. They talk about having an exit intent popup that basically has a headline and then a button that says either “get the video” or “get the e-book.” They’re still giving something away, but the button takes you to a landing page. And that landing page has a lot more text and a lot more conversion-oriented stuff. So that’s what they tested.

Mike: So to contrast this experiment versus the other ones: the other ones were tweaking the headlines, they were asking for more or less information. And in this particular experiment what they mean by “click through” is they would make an offer to them, and the person did not have to fill anything out. They could simply click on a button that would take them to a landing page that would then ask them for information. So essentially they were providing a mechanism for them to say yes, I’m interested or not, and they could quickly answer that. And when the person clicks on the button it takes them to that landing page and then they can fill out that information so it becomes a two-step process instead of a one-step process.

Rob: Right. And this is actually something Clay Collins has talked about for years. I heard about it on the podcast he used to have called The Marketing Show, and then when he came on our show a couple years ago, well before the first [?] thing happened, he talked a lot about not popping something up and asking for information, but instead going with this juiced-up opt-in where you click a button and then the next step is a form. It tends to have better conversion rates and tends to make people feel more welcome, like you’re not just asking for something right up front.

So for this experiment, they point out that for this to be successful, the conversion rate on the overlays would need to increase enough to offset the dropoff they expect from adding that extra landing page step. You figure 1.25% of people might submit the one with the form fields in it, and maybe two or three times that many will click the button, but then you need to get a lot of those folks to actually fill out the form on the landing page. So, not surprisingly, engagement with the overlays- the ones with the buttons- increased amazingly. And engagement here is just clicks. The one with the form fields was at .79%, and the one with just the button click-through was at 3.62%, so that’s about 4 1/2 times better. But that was just getting click-throughs. In the end, in their CRO category, the overlay with the button where you click through to the landing page essentially netted them twice the leads. So they had a 100% improvement in the net leads they received by having someone do a two-step opt-in instead of just popping a form up in front of them. And in their lead gen category (they have this category on their blog) it was a little under a 50% increase (it was from 45-56). They only did it in two categories, and their next step would be applying the same format to all the other categories and measuring the results. So: two-step opt-in for the win.
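The funnel math behind "the click-through overlay netted twice the leads" can be sketched like this. The 0.79% and 3.62% overlay rates are from the article; the 50% landing-page completion rate is an assumed figure, since the post reports the net outcome rather than that intermediate step:

```python
# Net leads from a one-step form overlay vs a two-step click-through overlay.
# Overlay rates (0.79% form submits vs 3.62% button clicks) come from the
# article; the 50% landing-page completion rate is an assumed illustration.

def net_leads(views: int, overlay_rate: float, landing_rate: float = 1.0) -> int:
    """Leads after each funnel step; a one-step funnel passes landing_rate=1.0."""
    return round(views * overlay_rate * landing_rate)

views = 100_000
one_step = net_leads(views, 0.0079)        # form directly in the overlay
two_step = net_leads(views, 0.0362, 0.50)  # click through, then fill the form
print(one_step, two_step)  # the two-step funnel nets more leads overall
```

Under this assumption the two-step funnel more than doubles net leads, matching the roughly 2x result reported for the CRO category.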

Mike: Yeah, so if it’s confusing, to consolidate that: it’s a two-step process versus a one-step process, so depending on where you measure conversions, you have to take both steps into account- not just the percentages but also the net result. It’s not just about the people who click on that link and go to the next page; it’s about who clicks the link, goes to the next page, and submits their information, and what the net result of that is versus the original. In both cases here they had either a 100% or a 50% increase, both of which are pretty significant, especially considering the volume. And they took the time to say, okay, this makes sense for us to expand and implement across the rest of our categories.

Rob: And so that’s where they leave off. They’re summarizing tests that ended three months ago, and I’m sure they’re running their next tests as we speak. The nice part about this is I like their concrete examples. So much of what you read about split testing can be vague or rule-of-thumb, and these- your mileage may vary, but a lot of the stuff they intuit really comes to pass, and I think it’s always helpful to get ideas for your own split tests. You may not have time to create ten different overlays, and unless you have 100,000 uniques a month to your blog, I wouldn’t create ten overlays. But especially some of those earlier tests were pretty big wins for not a ton of effort- for writing a new headline and giving away a resource you already have on your hard drive.

Mike: You know, QuickBooks is going to love you for saying “intuit” all the time.

Rob: Yeah, I know. For using their intuition, yeah. I’m going to come up in their Google Alerts a bunch.

Mike: Yep, their SEO guys are going to start calling you. Interesting side note here: one of the things Unbounce talks about is that they also have a list of what they’re going to be looking at next. The first one is a timer test- what happens when they test their on-exit trigger against a 15-second time delay. Then there’s a referral test: what happens when they show different overlays to users from different traffic sources. And the third one they’re looking at is new versus returning visitors- do returning blog visitors convert better than first-time blog visitors? I think those are all going to be interesting tests. If you’re interested in taking a look, go over and check out Unbounce’s website and subscribe to their blog. I’m sure they will post the results when they get them, and we’ll be sure to check it out.

Rob: While you’re at it, go to the Leadpages blog and subscribe too. They turn out good content like this as well. Got to plug the company, man.

Mike: I get it. Well, I think that about wraps us up. If you have a question for us, you can call it into our voicemail number at 1-888-801-9690 or email it to us at questions@startupsfortherestofus.com. Our theme music is an excerpt from ‘We’re Outta Control’ by MoOt, used under Creative Commons. Subscribe to us in iTunes by searching for “startups” and visit startupsfortherestofus.com for a full transcript of each episode.

Thanks for listening. We’ll see you next time.
