EP 49: How to Optimize Conversions Before Go-Live
Kelly: So welcome back to Thrive, your agency resource. Today, we're going to delve into optimizing conversions, but with a bit of a different spin. My guest is Nitzan Shaer, Co-founder and CEO of WEVO, a digital marketing firm that enables website optimization pre-live, a term they've coined internally. Welcome to the show, Nitzan. I'm really excited to have you today.
Nitzan: Kelly, great to be on the show. Thanks for having us.
Kelly: So let's kind of just go right with the problem. We love to set up a problem. What's the problem with launching a website without a really thorough understanding of pre-optimization and also how has the technology changed to kind of address those issues?
Nitzan: So Kelly, marketers and agencies spend quite a bit of time these days optimizing websites. The conversion rate, the rate at which visitors arriving at the website turn into customers, is an important KPI for them. Until now, most of that work has been done after a website goes live. There is AB testing. There is usertesting.com. There are many tools out there right now that optimize a website after you have gone live, and there are a few challenges with that, which agencies and digital marketers run into on a regular basis.
First and foremost, there is a lot of effort that's put into designing, coming up with new ideas of what to fix, and figuring out what's actually going wrong with the website. Then there's waiting for statistical significance: you're launching a website, running these two tests in parallel, and waiting for enough people to convert on each one to know which one is the winner.
This process, all in all, can take weeks, sometimes months, for a single AB test. And what we've seen from research is that many of these AB tests don't even yield an improvement in conversion. You can have up to seven out of eight tests that don't yield more than a quarter percent or half a percent of improvement in conversion. So it's a very time-consuming, labor-intensive kind of problem out there today.
Kelly: Yeah, and you're talking about projects that are sometimes in the hundreds of thousands, even millions of dollars, where there's a lot riding on it, so getting it right is critical.
Nitzan: Absolutely. We've had customers approach us and tell us some horror stories.
Kelly: I’m sure you have a lot in your back pocket.
Nitzan: They'd been working really hard, sometimes for a period of months, to launch a new website for a new product, a new campaign, or just a rebranding effort. And after all of that work is put in, after all the stakeholders have been interviewed, after the designers and the copywriters have done their best, after intense fighting about which word is going to go where, with great pomp, the website is launched, and what they see is a decrease in conversion after all of that effort.
And then there's egg on the face of the CMO, of the digital marketer, sometimes of the agency that backed the site. And a lot of finger-pointing starts: no, the agency didn't listen to what we said; you didn't listen to what we said. And then a very long process begins of trying to optimize from there on.
Kelly: Right. And you mentioned AB testing a little while ago. When we talked a little bit earlier today, there's sort of this dirty little secret in the industry about AB testing. Can you sort of uncover that secret?
Nitzan: Yeah. So again, we speak to a lot of digital marketers and agencies on a regular basis here, and what we've heard, usually after the first or second beer as the stars come out, are the stories people don't share when you meet at a conference and talk about how successful we all are.
The stories are often about the very long process that goes into AB testing: the fact that they don't do half as much AB testing as they would like, and the amount of time that elapses until they get an answer back from each of these tests. If it's Google or Facebook, they can do AB testing in split seconds, and big retailers the same, but for most companies out there that don't have millions of visitors hitting the same exact web page, this can be a real problem.
For any statistically significant test, depending on the test, the volume, and the difference you're aiming for, you can potentially need tens of thousands of visitors, and that isn't within the reach of every company out there. By the way, we work with some very large companies; even if a company has millions of visitors in aggregate, they don't have millions of visitors visiting each one of…
Kelly: Particular page. Yeah. And you also mentioned usertesting.com, which is a tool that we absolutely used at my agency probably four, five, maybe six years ago. Talking about statistical significance, if you're an agency, a web development agency, or a digital marketing firm that touches website development and landing page development, what's really the issue with relying on that small subset of data?
Nitzan: Usertesting.com is a wonderful tool, as long as you use it for what it was intended for. And what usertesting.com is intended for is similar to surveys or focus groups. It's really about getting qualitative information in a very structured environment, knowing that the answers you're getting are what one, five, or ten people may think about your product while they are on the spot being interviewed by you.
There's no statistical significance to that. You can hear three out of the ten people you interviewed at usertesting.com say, "I hate this button, why are you asking me to click now?" And it has absolutely no meaning, because it's three people, not thirty. It happens to be thirty percent of your audience, but it's three people.
That thirty percent is completely misleading because it's not statistically significant. I had many people tell me that they implemented results from user testing or findings from usertesting.com and it took them in the completely wrong direction. It's very dangerous to trust those results if you're trying to get statistical significance and actionable results out of that.
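The "tens of thousands of visitors" point above can be made concrete with the standard sample-size formula for a two-proportion z-test. This is a generic illustration, not WEVO's method; the 2.0% and 2.5% conversion rates are hypothetical numbers chosen for the example:

```python
import math
from statistics import NormalDist

def ab_sample_size(p1, p2, alpha=0.05, power=0.80):
    """Visitors needed per variant to detect a lift from rate p1 to p2,
    using the two-sided two-proportion z-test (normal approximation)."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # ~1.96 for a 5% significance level
    z_beta = z(power)            # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Detecting a lift from a 2.0% to a 2.5% conversion rate:
print(ab_sample_size(0.02, 0.025))  # about 13,800 visitors per variant
```

With roughly 13,800 visitors needed in each arm, a page drawing a few hundred visitors a day would take months to reach significance, which is exactly the bottleneck described above.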
Kelly: Right. So for agencies, we have a mandate from our clients to increase conversion. Obviously, every single client comes to us with different KPIs or a different sense of what success is in terms of measurement, but we do have a mandate to increase conversion, whether that's informational, requesting information, or e-commerce. Whatever it is, we want more people to get in touch with the client.
You say that most agencies, when they go through this entire website design and development process, are really praying at the end of the day, when they hit that button to go live and make the DNS switch. So how does WEVO actually solve that issue?
Nitzan: You touched on an important word there, which is pray. I've had a number of agency friends confide that after all is said and done, after we've invested all this time designing a new website and coding it and cajoling the customer to believe that what we say is going to happen, just before we hit that go button, we say a little prayer, each one to their own god, that this is going to work well. The truth is they don't know, and they can't. They are humans. There isn't a tool out there currently that tells them whether this is going to do better or not.
And that was the context for creating WEVO. We got together a few years ago; I was coming out of Skype, which I joined early on, and we were seeing this pattern again and again: the challenges we just described, the praying before you launch, not understanding what the root causes are. And with a number of people we put this concept together, and we all said, there must be a better way.
There must be a way to test these things before you go live, what we call pre-live testing. What we do, at a high level, is fairly simple: we bring an audience to the page itself, and this could be a design of a page or a coded page; we're just fine with the design of the page even before you code it. And we ask them a series of questions that enables us to build a model of the page, a digital model. This is very similar to what Pandora or Spotify do with a song.
The first thing they do with a song after it's released is create a digital footprint of that song, the Music Genome Project at Pandora, and they know if it is fast or slow, what genre and sub-genre, and so on. We do the same thing for a website. We know if the website is clear or not clear, if it's appealing or not appealing, and to what degree. We know to what degree it's credible, whether people believe the website will actually deliver on its promise, to what extent it's relevant to the target audience, and to what extent the experience is driving them to take action.
Kelly: And these are all just through a survey or a questionnaire?
Nitzan: Yeah, this is done predominantly through a questionnaire that we put out to the specific target audience. We have access to about 30 million people around the world, 15 million of those in the United States. We have this questionnaire, this experience, that we send out to them. They fill out the answers to these questions, and then we sprinkle in the secret sauce, and the secret sauce is calibrating those answers to what happens in the real world.
Because just asking people isn't enough. People are terrible at predicting what they're going to do. If you ask somebody in January how many times per week they are going to go to the gym next year, you may hear, "I'm gonna go to the gym five times a week." And we kind of know that's not really going to happen, because we have enough experience and we've heard enough people say that, maybe even about ourselves.
So what WEVO does to calibrate those answers is ingest into our machine learning algorithm a large number of historic AB tests, and these historic AB tests basically calibrate our system. Going back to my example: if somebody says they're going to go to the gym five times a week, we know they're really going to go three times a week.
And thus we know to apply that reduction to the rate. So we don't ask our audience whether they are going to buy this product, whether they're going to convert or not; they don't know that answer. You can't ask them that question. What we do is ask them a whole lot of other questions that, taken together, we know how to correlate to actual conversion.
Kelly: And give me a sense of what those questions could look like.
Nitzan: So there are questions related to clarity, questions related to the appeal of the product, and questions related to relevancy to them specifically. There are some quantitative questions; we ask questions on a scale of one to seven. There are some association questions, so we're pulling out both emotional and rational decision-making. You're probably familiar with Daniel Kahneman's Thinking, Fast and Slow, about the two systems of thinking.
So we're trying to pull that out as well, all of the elements that come together to drive action in the future. That's the model we built; it's a behavioral model. There are two other really important elements we pull out of this. The first is a geographic analysis of the page: we can highlight which areas on the page are holding visitors back from converting, so we literally highlight those areas on the page for the digital marketer and say, this is an area you want to improve.
Kelly: So is that similar to like heat mapping technology?
Nitzan: Yeah, it takes from heat mapping technology, but heat mapping has a big challenge: it shows you where people clicked on the page. Is it good or bad that they clicked there? You don't know. What we do is something very different. It looks like a heat map, but it's generated in a very different way, and it actually shows the areas that are hindering conversion versus the areas that are accelerating it. That helps you take actionable insights from it.
And the third thing that we do, beyond the driver analysis and the geographic analysis, is give you a clear map of the gap between the expectations people had coming to the website and how well the website actually fulfills those expectations. So if they're looking to purchase a credit card, they want to know that the rates are low and the rewards are high. They want to see that this is a reliable institution and that they're going to get good service if they call up. And we show you to what extent the website actually meets those expectations.
All these things come together to enable WEVO to predict for you, in a simulation with a fairly high degree of accuracy that we've refined over time, which of multiple designs is actually going to do better. So when you do launch, you know with very high probability that this one is going to do better than what is out there right now, or best among your ideas. It also changes the conversation with your customers as an agency: instead of saying, here are three ideas, you choose which one is better, you can say, here are three ideas, this is the one predicted to do best and why, now choose which one you want to go with.
Kelly: And that's kind of interesting, because then on some level the blame game you were talking about earlier could be defused: we ran this process pre-live, and this design has the highest probability of being the most successful, but it's still your choice. You could use the one that's most probably successful, or the middle of the road, or the least, and if your CEO still prefers the look and feel and the aesthetic of the one predicted to be the least successful, that's your decision.
So is that part of it? You're enabling the agencies and the clients to make those decisions, they're still empowered to make them, but now they have this predictive, probability-based information behind them?
Nitzan: Kelly, that's exactly right. We're not here to replace those relationships or those decisions. All we're doing is providing a tool for the decision makers to make a more intelligent decision, one based on more data, on testing before you go live. If the CEO or the executive team or the CMO still says, for these strategic reasons I want this option, even though it's predicted to do less, because we're going to change customer opinions, or because this is what our investors want, or this is where the market is going, whatever the reason may be, that's legitimate.
And it's their decision to make, but at least they have the understanding of which one is predicted to do better and why. And maybe they just want to tweak something in that option that will make it better. With that optionality, I believe five years from now we will look back at 2019 and say we were in the dark ages, that we launched things without knowing if they were going to be successful. Imagine if I needed to build a bridge and said, let me build three bridges and test them to see which ones survive the next hurricane. It's insane.
Kelly: Well, that's what drew me to WEVO, the fact that it just makes so much sense. It's very logical, and you're right, I don't really understand how we've been doing things the way we've been doing them, but it's great that you saw that gap. So as we start to conclude: why are we even having this conversation? And by that I mean, how has web design changed, to the point that best practices are just not enough anymore?
Nitzan: I think you're highlighting an excellent point there, Kelly. In the first generation of internet websites, just having a site was good enough. In the next generation, there was a slew of best practices, and if you followed those best practices, you were probably good enough. The agency would sweep in and say, you're missing points one, five, and seventeen; if we fix those things, it will be better.
I think now, as we enter 2019, and probably for the past couple of years in more advanced industries, best practices don't cut it anymore. It's not good enough. We're entering an era of personalization, an era where we have to understand the needs and expectations of our customers in a better way. Just winging it, just eyeballing it, doesn't cut it anymore. And that's why we're seeing the phenomenon that many AB tests don't result in better conversion rates: eyeballing and guessing what will work better just doesn't work anymore.
Kelly: Yeah. You could have, I won't say poor design or poor copy, but ineffective design and ineffective copy tested against other ineffective design and copy, and you're just choosing, or allowing the customers to choose, the lesser of two evils. Right?
Nitzan: Right, that's exactly right. And this isn't to say we have bad designers or bad copywriters.
Kelly: No I didn’t mean that.
Nitzan: They are excellent copywriters and excellent designers, but they now need more sophisticated tools. It's like needing a measuring stick that's more refined than it used to be: understanding the needs of their customers and how a specific sub-sub-segment, at this age group, this income level, this education level, and so on, is going to respond to their pages. That's something that's very hard to uncover, and marketers and agencies are looking for tools these days that will help them find the answer.
Kelly: Yeah, well this has been a really interesting spin on the conversion optimization conversation and I'm a total geek about it so I really love it and I just want to thank you so much again, Nitzan for coming on the show with me today.
Nitzan: Thank you for taking the time. I really enjoyed the conversation.
Kelly: I did as well.
Nitzan: Thank you.