EPISODE 51 | Guest: Debbie DeWitt, marketing communications manager for Visix
Measuring success is critical for modern communicators, and experimenting with A/B testing gives you data that you can analyze for actionable results to improve your workflows and audience response.
Implementing a well-executed and consistent A/B testing strategy can help you improve your digital signage messages, layouts, designs, imagery, headlines, and more. But there are a few things to consider before just jumping in. You want to make sure you’re testing the right things in the right way. And always keep the focus on the main goal of any communications, which is engagement.
- Learn about the different digital signage elements you can test
- Understand the basic methodology for successful A/B testing
- Get tips on how to focus the scope of your split tests
- Explore the ROI benefits of continuous improvement
- Hear how to parse the data you gather
Subscribe to this podcast: Podbean | Spotify | Apple Podcasts | YouTube | RSS
Get more tips in our Masterclass Guide 2: Digital Signage Communications Planning Guide
Transcript
Derek DeWitt: You might be familiar with the concept of A/B testing, usually written as capital A slash capital B testing. Maybe you use it for email campaigns, webpages, social media, boosted posts, and ads. But how can we use A/B testing methodology to make our digital signs do better? To talk about that I have Debbie DeWitt, marketing communications manager for Visix. Hello, Debbie DeWitt.
Debbie DeWitt: Hi, Derek. Thanks for having me.
Derek DeWitt: Thanks for coming on, and thanks to everybody out there for listening.
Okay. So first off, what is A/B testing?
Debbie DeWitt: A/B testing is basically just showing two versions of something to the same audience under the same conditions and comparing which gets better results.
Derek DeWitt: Okay. Why should people do this?
Debbie DeWitt: Well, I’ll start by saying, if you don’t care about measuring success, then don’t do it. You don’t need to A/B test. I mean, that being said, everyone should care about measuring success. If you’re not looking at what works and what doesn’t work on your digital signs, then you’re just throwing out messages and hoping that they’re doing what you want them to.
Derek DeWitt: And you have no way of knowing if they’re doing what you want them to do. You know, something I’ve said a lot is we care about what we measure. But recently I came across a better phrasing of it, which is that whatever gets measured, gets managed.
Debbie DeWitt: Ooh, nice! That’s very good. Yeah, I’d say if you’ve put all the money and effort into a digital signage system, you should want to know if it’s working. So A/B testing can help with that.
Derek DeWitt: Okay. So, let’s break this down into different parts. So, you’re showing two different versions of something. Say something about that. What are… how are they different?
Debbie DeWitt: You know, your digital signs, what goes up on them is made up of lots of pieces and parts. So first you have to decide, what do I want to A/B test?
It could be anything from just a layout background, or it could be the layout itself; the way things are presented on screen, like whether your weather is on the right or the left (whether weather!), things like that.
It could be two different designs for an event schedule. Or something as simple as the image or font color in a message.
Derek DeWitt: Yeah. I was going to say, I would imagine that a lot of times it’s going to be about the content itself or the specific wording of a message. Is this wording more effective than that wording? Seeing which one is more effective, which one has more engagement.
Debbie DeWitt: Yeah, I think messages are going to be the primary place where you do this. And I’d say the number one cardinal rule is only A/B test one thing at a time.
If you are looking to figure out, you know, which layout works best and which of these messages works best, or I have four messages that I want to A/B test, only do one at a time.
Derek DeWitt: Right. Because if you make one change on version two and another on version three and also on version four, there’s no way to tell what worked and what didn’t work.
Debbie DeWitt: Right. Exactly.
Derek DeWitt: I mean, is the goal here to eventually come up with this kind of perfect platonic ideal of the most awesome message in history?
Debbie DeWitt: No. No, never. It’s [the] impossible dream. No, communications, like everything, is about continuous improvement, about continuing to make it better. It’s not about best, it’s about better.
It’s constantly tweaking, just like you do your website or your email templates or your ads that you’ve published places. You always want to see, you know, can we improve the response to these?
Derek DeWitt: Right. I think that there’s a correlative to this in troubleshooting. I know that you used to work in the AV industry, and I’ve heard you talk many times about, like, doing sound for a stage show and the microphone’s not working. You’ve got to go back and test it in a methodical, reasonable way. You can’t just start randomly doing 15 different things hoping it’ll work. You need to find out where the problem is.
Debbie DeWitt: Yeah, exactly. I mean, you start at one end and you try each thing to see if that’s the problem. I mean, if you have digital signs and, say, a screen isn’t working, you know, you don’t shut off the screen, reboot your player, I don’t know, reassign an admin account to it, change the parameters, change the IP address. If you do all of that at once, you have no idea what the problem was. So if it repeats itself, you can’t duplicate the solution.
So, you want to do one thing at a time in your messages or your layout or whatever you’re testing, so that you can tell, hey, that made the difference. Because the difference can go either way, keep in mind.
Also, if you did multiple A/B tests at once, say you changed the layout background and the on-screen arrangement at the same time, maybe people liked the background better but didn’t like the new arrangement as much. Well, they’ve canceled each other out. So, you’d actually think, oh, this didn’t do anything, when really one element was improved and one was not.
Derek DeWitt: Right. So, it’s not just for things that have already gone wrong, but it’s also a way to methodically and intelligently optimize what you’re trying to do. Right? Because if you go changing a bunch of things at once, like, boom, and then the whole system… Hey, it’s on fire! What did we do?
Debbie DeWitt: Exactly.
Derek DeWitt: And, of course, you also have to show it to the same audience in order to be able to judge its effectiveness. Audience 1 might react to message A one way, audience 2 might react a different way, audience 3 a different way. So again, you’re adding too many variables to the mix. You need to show both the A and B versions to the same people…. At the same time of day or what? Like, how closely should the A and B tests mirror each other?
Debbie DeWitt: Well, I would say in your example, there is one occasion where you’d go to different audiences, which is I have the same message and I want to see which of my audiences like it better or respond more to it. So, you could have one message sent to different audiences or one message published at different times of day. And that’s actually sort of an audience A/B test, not the message itself. It’s saying which of these audiences or which screen, you know, gets more engagement.
But in terms of, if you really want to test this design against this design, then yeah, you need the same audience to look at it and make that decision. And you can’t know unless you have a call to action or some way to measure it, which we’ll talk about a little bit further, but I would just say, you have to consider your audience.
For example, if you’re testing something that’s supposed to appeal to your third shift workers, you know, only show it to them. Like you said, don’t also show it to first shift workers. And if you do, certainly don’t take that feedback into account, whatever the measure or the feedback is because you’re testing it on third shift workers.
So, stay in your lane, you know. Set up the parameters, and you’ve got to know your goal, what does this mean? I’d say that kind of leads into, also, what does it mean to do it under the same conditions? You know, it builds on the same thing. You need to show them on the same screen, same time of day, you know, same days. So don’t schedule…
Derek DeWitt: Unless that’s what you’re A/B testing. So again, you don’t have to A/B test messages only. You could A/B test times of day. You could A/B test how long the message stays up. You could A/B test the audience.
Debbie DeWitt: Yeah. That’s absolutely true.
Derek DeWitt: The day of the week.
Debbie DeWitt: Yeah. You’re absolutely right. And that all goes into laying down, what am I trying to test? If you’re trying to see which of your designs works better, then… You know, if you’re scheduling one for Monday morning and one for Thursday afternoon, especially if it’s a public place or somewhere like a campus with a migrating audience that varies throughout the day, you’re not going to get the same results. You’re not showing it to the same people under the same conditions. So, if you want to do that, schedule one for Monday morning and the other for the next Monday morning.
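To make that scheduling discipline concrete, here’s a minimal Python sketch of booking the two variants into matched slots: same screen, same daypart, one week apart. The schedule_message() call is a hypothetical stand-in, not a real Visix or CMS API:

```python
from datetime import date, timedelta

# A minimal sketch: schedule variant A and variant B in matched slots --
# same screen, same daypart, same weekday -- one week apart.
# schedule_message() is a hypothetical stand-in for whatever scheduling
# call your content management system actually provides.
def schedule_ab_pair(schedule_message, screen, daypart, first_monday: date):
    schedule_message(variant="A", screen=screen, daypart=daypart,
                     day=first_monday)
    schedule_message(variant="B", screen=screen, daypart=daypart,
                     day=first_monday + timedelta(weeks=1))

# Example with a stub that just prints each scheduled slot:
schedule_ab_pair(
    lambda **slot: print(slot),
    screen="Lobby North",
    daypart="mornings",
    first_monday=date(2024, 3, 4),
)
```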
Derek DeWitt: What about, what if you have like a lobby, let’s say for example, or a break room, you’ve got two screens side by side or right across from each other, but in the same area. What about putting version A on one screen, version B on the other screen, and then having some kind of feedback mechanism, so people can say, Actually I like that one better?
Debbie DeWitt: Yeah. I mean, that’s the ideal. The ideal is, you know, we can never truly do apples to apples because, you know, at least as far as we know, we live in linear time. So, the fact is….
Derek DeWitt: Well, there are some scientists who would beg to differ.
Debbie DeWitt: Right. But the fact is, if you do one [on] one Monday and one the next Monday, conditions have changed. But ideally it’s, hey, I want to show them both, you know, in the same area, and the audience has the option.
Imagine you’ve got two screens in a waiting room and they’re both like full screen so there’s nothing else influencing your message, and you’ve got two playlists and it comes up in red on one, and then a little bit later, it comes up in blue on the other. You have a call to action in there, like a friendly URL, QR tag, whatever it is, take a picture of the screen. And one of them gets a lot more than the other. That’s a good result. But that’s not always going to be possible.
Most places use one screen, or two screens that aren’t showing the same thing, because duplicating content is just kind of wasted real estate. So, just duplicate the situation as much as you can.
Derek DeWitt: Right. And I think when we’re getting into truly effectively testing, you need to have a specific goal for this test. Don’t test six things at once, test a thing. So, it might be which background is better, but it also might be something else. Right?
Debbie DeWitt: Yeah, you have to define – I said to see which does better – you’ve got to figure out what “does better” means. And I apologize, I’m going circular here, because that’s really the first thing you do, even though it was the last thing I said.
But you know, as in all communications, the goal comes first. So first, like we’ve talked about, you have to say, do I want to test which audience is best for this? Is it out in the lobby, or is it in the break room? Do I want to test if Mondays or Thursdays are better for this? Mornings or afternoons? And when we say better for this, we mean, what gets more engagement? What gets a better response? And if it’s, do I want to see if this layout or message does better than the other one, it’s really about… you’ve got to have that whole hypothesis before you even start this.
I mean, maybe you have two designs, but before you even start designing, you could have this in mind. I want to try a green against blue. And your goal could be, you know, it’s really engagement. So it could be something like, I want more people to sign up or which message gets more people to sign up or show up to an event. But again, it has to be measurable.
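A simple way to force yourself to write that hypothesis down is a small test-plan record. This Python sketch is purely illustrative; the class and field names are invented, and the sample values come from the green-versus-blue example above:

```python
from dataclasses import dataclass

# Illustrative only: pin down the one variable, the measurable metric,
# and the matched conditions before the test runs.
@dataclass
class ABTestPlan:
    variable: str    # the ONE thing that differs between A and B
    variant_a: str
    variant_b: str
    metric: str      # a measurable outcome, not a gut feeling
    audience: str    # who sees it; keep this identical for A and B
    conditions: str  # screen, time of day, day of week

plan = ABTestPlan(
    variable="message background color",
    variant_a="green background",
    variant_b="blue background",
    metric="event sign-ups via the on-screen URL",
    audience="break room, third shift",
    conditions="same screen, Monday mornings, two consecutive weeks",
)
print(plan)
```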
Derek DeWitt: That really comes down to that call to action. You’ve got to have that call to action. And that has to be finely honed.
Debbie DeWitt: Yeah. We’ve got other episodes about crafting your call to action, measuring ROI, so I’m not going to go into details here. But it’s really about, as I said, it’s got to be a measurable goal. If it’s, hey, more people attend an event, you need to show that the reason that happened or didn’t happen (possibly) is because of what you did on your digital sign. You know, if more people just show up, you have no idea why. Maybe it was more word of mouth. Maybe they saw it on the web.
Derek DeWitt: Maybe you also stuck into the newsletter for the first time.
Debbie DeWitt: Exactly. But if they have to register for that event, using a friendly URL, a link in your digital signage message, well…
Derek DeWitt: That they can only get via the digital signage.
Debbie DeWitt: Exactly. And in an A/B test, you can also A/B test your call to action. So, say you’re like, I don’t know, our screens are all mounted low enough that people can use their phones to scan QR tags. Let’s do one with a friendly URL, and let’s do another one with a QR tag that uses a different URL to register for this event. At the end of the test, you can go, hey, 30 people registered using the friendly URL, 20 people did it using the URL we used in the QR code. It seems the URL onscreen got more engagement.
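If you want to go one step past raw counts, a quick significance check tells you whether 30 versus 20 is more than noise. Here’s a sketch using only Python’s standard library. The registration counts come from the example above, but the exposure counts (how many people saw each version) are invented for illustration:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare two conversion rates; returns (z, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approx.
    return z, p_value

# The 30 vs. 20 registrations come from the episode; the exposure counts
# of 400 viewers per variant are made up for illustration.
z, p = two_proportion_z_test(conv_a=30, n_a=400, conv_b=20, n_b=400)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With those invented exposure numbers, 30 versus 20 works out to p ≈ 0.14, which isn’t significant at the usual 5% level; that’s exactly why it’s worth checking before declaring a winner.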
Derek DeWitt: Okay. So how can I mess this up? What are some mistakes I can make in my A/B test?
Debbie DeWitt: Um, okay. So, this will be like a little wrap up. So not defining your goal or making the goal something that’s not measurable.
Derek DeWitt: And I think also specific to the digital signage. Like we said, if you’re doing a dedicated landing page for, I don’t know, an event sign-up, but you’re advertising that in emails, using leaflets, the digital signage and so on…. Leaflets! Well, some places are… she’s laughing at me, but…
Debbie DeWitt: Leaflets!?
Derek DeWitt: Some places are still using leaflets and little brochures.
Debbie DeWitt: If your town crier is also announcing the event.
Derek DeWitt: Right, if the town crier is out there going, hear ye, hear ye, www.ourcompany.org/, then, you know….
Debbie DeWitt: The 17th century.
Derek DeWitt: 17th century, yeah. But if you’re using the same landing page across all those different channels, you have literally no way to know how people got there.
Debbie DeWitt: Right. All that tests is, does the call to action work? You don’t know which format of it worked better.
Derek DeWitt: And, again, change one thing at a time, right?
Debbie DeWitt: That’s the next mistake you can make. You don’t have to test every single little change independently though. Let’s not get crazy.
Like if you go, okay, I’ve got one that’s got white Arial bold font, now I’m going to try one with blue Times New Roman italic font. You don’t need to go, okay, I’m going to change it to Times New Roman, now test; okay, now I’m going to take the bold off, then test again; okay, now I’m going to go to italic, now test again…. Like, don’t do that.
Derek DeWitt: Don’t get too granular, right?
Debbie DeWitt: Right. It doesn’t need to be that granular. But you know, obviously, think of it in terms of the design changed, I’m going to change that.
Now you may have something like, I’m going to put the image on the left of the message in one, and on the right in the other; you know, go ahead and test after that. It’s really about… I mean, designers are smart enough to know what constitutes a substantial difference, so just test there.
Derek DeWitt: I think that’s an excellent point. Talk to your designers and your content creators. But you also don’t want to throw up, hey, we’re going to test three different messages and a new layout and a new event schedule, all at once.
Debbie DeWitt: Exactly, exactly. That’s, you know, the other thing we said: don’t mix your audiences or the conditions. Try and get them as close as you can. Try and duplicate the circumstances in which the two competing designs are seen as much as possible.
The next mistake people make is failing to consider external factors. This is something we haven’t talked about before. So, say you’re a hotel and you run message A when you’re at half capacity and message B when you’re fully booked; that’s not a fair test. You have a bigger audience.
Derek DeWitt: So like, for example, I know a lot of people are working from home these days. Like maybe, you know that more people work from home on Wednesdays than on Thursdays. So, don’t run a test on Wednesday and then the B version on Thursday because it’s a whole different set up, totally different audiences.
Debbie DeWitt: Yeah. It’d be different people or some of the same people, but some of the other people aren’t there. So, you literally don’t have as many people to respond.
Derek DeWitt: And again, this stuff needs to be measurable, right? Like you need good, hard, solid data.
Debbie DeWitt: Yeah. I’d say another mistake that people make, and I think we do this, I do it, I think a lot of people do it in general, it’s human nature, but don’t let your gut feelings override the actual results.
You know, if you’ve gathered statistics, pay attention to those. It’s very easy, especially when you’re an expert, when you know your company, your audience, your communications, to have gut feelings about what should work. And you probably even have a favorite. Well, you might get statistics or results back that don’t agree with what you thought they’d be. You still need to pay attention to them. Don’t just let your gut say, well, this is wrong, I’m sure that people will like this one better.
Derek DeWitt: Because I like it!
Debbie DeWitt: Yeah. And the fact is, like I said, that’s why there are a hundred listicles out there about surprising statistics about blank, because we’re surprised by them, because we didn’t expect it.
Derek DeWitt: Yeah, that’s true.
Debbie DeWitt: You know? So, pay attention to those results with one caveat. If you are absolutely sure, I mean, for example, you get 113 responses on something when only 30 people should have responded, you’re going to go, there’s something…
Derek DeWitt: Whoa, wait a minute, we only have 52 people in the company.
Debbie DeWitt: Right. Then there’s something wrong in your experiment. You know, that’s the other thing.
If you put two messages out, and say we go back to that example of the friendly URL versus the QR code, and one gets just a ton of responses and one gets none, like zero, what you might want to do is go back and double-check: did that QR code work? The fact is, if you show a friendly URL, you might only need it on screen for, say, eight seconds, but with a QR code, you need to give people time to scan it, so you might need to have it up for 14 seconds.
So, you might actually find, when you look at your data, oh, we need to tweak the experiment. But I think that you just can’t let your own prejudices or favoritism override real statistics.
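That kind of double-checking can be semi-automated with a quick sanity pass over the results before you trust them. In this Python sketch the function and field names are illustrative; the 113-responses-from-a-52-person-company example comes straight from the episode:

```python
# Flag the two failure modes from the episode: more responses than
# possible viewers (broken tracking), and a variant with zero responses
# (often a broken QR code or link, or too little time on screen to act).
def sanity_check(responses_a, responses_b, audience_size):
    warnings = []
    for name, count in (("A", responses_a), ("B", responses_b)):
        if count > audience_size:
            warnings.append(f"Variant {name}: {count} responses from an "
                            f"audience of {audience_size} -- check tracking.")
        if count == 0:
            warnings.append(f"Variant {name}: zero responses -- verify the "
                            f"QR code or URL works and the dwell time is "
                            f"long enough to scan or read.")
    return warnings

print(sanity_check(responses_a=113, responses_b=0, audience_size=52))
```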
Derek DeWitt: Right. Just because you like cats doesn’t mean that putting up a picture of a cat is going to be the most effective way to get your point across.
Debbie DeWitt: I just want to say, putting a picture of a cat is the most effective way to get your point across.
Derek DeWitt: Even people that don’t like cats react to cat pictures. Ooh, I hate them!
Debbie DeWitt: I know. Exactly. And the last thing I’d say, in terms of ways you can mess this up, is by doing it once.
And I don’t just mean, hey, I tested these two messages and I got my results, what do you mean I have to do it again? I mean not continuing to experiment, in general, with your content, with your layouts, with your messaging, with your verbiage, with your imagery, with your colors, all of it.
Just because one design beats out another doesn’t mean it’s the best it can be. I mean, you can always continuously improve your designs, your layouts, your backgrounds, your imagery. You know, basically, like we say about everything in every single sector of business or communications, you need to keep improving and experimenting.
Derek DeWitt: Always be improving. Not simply because, well, that’s what all the books say to do, but because you want to give your audience better and better content, that’s more and more focused towards their needs. The ultimate communications system tells people about something before they even know they wanted to know it.
Debbie DeWitt: Yeah. It’s all about engagement and the audience experience.
Derek DeWitt: All right, pretty interesting stuff. I’d like to thank Debbie for talking to me today. Thank you, Debbie.
Debbie DeWitt: Thanks, Derek.
Derek DeWitt: Version B: Thanks, Debbie.
Debbie DeWitt: Thank you, Derek!
Derek DeWitt: Oh, I think I liked the second one better. And of course, we’d like to thank all of you for listening.