The Biases In Online Marketing That Drive Crazy Decisions

The online marketing world is brimming with assumptions that aren't backed by data. Columnist Brian Massey offers strategies for overcoming these biases.

I was pleased to come across an article on Business Insider: “20 cognitive biases that screw up your decisions.” Now that’s a title a Conversion Scientist could love.

The biases described in the article rule the Web. They are embraced, coddled and encouraged by marketers and business owners, like cute little puppies.

Make no mistake: They will grow up to be dogs. They'll stain your carpets, chew up your furniture, frighten the children and run away at the first chance.

Of course, we all adapt to our pets. By "adapt," I mean we employ "choice-supportive bias," which the article describes as feeling positive about a choice you have made, even if that choice has flaws.

That cute puppy will become a carpet-staining, full-grown dog. And you'll still love it.

Biases Look Like Expertise

Your Web team means well. Your designers, developers and consultants believe they are offering good advice based on experience and brilliance. However, without data to back them up, they are just bias zombies looking for the next person to misguide.

No team is immune.

Here are some of the most egregious biases we see in the online marketing world and strategies for removing them from daily decisions.

First And Last Information

We tend to be influenced by the first bit of information we receive and put more weight on the last. These biases can affect where we look for optimization improvements.

A study described by Dan Ariely in his book, "Predictably Irrational," illustrates how even a random anchor can influence us.

He asked his students to write down the last two digits of their Social Security number. Then he had them participate in an auction. Those students who wrote higher numbers bid higher on the auction items, even though the digits had nothing to do with the items' value.

When we set goals for split tests, we secretly anchor ourselves. If we expect a 30 percent increase from one of our treatments, we may dismiss an increase of 15 percent as a failure and try something different instead of building on that 15 percent increase with a similar test.
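
To see what that anchor costs, run the arithmetic. Here is a quick back-of-the-envelope sketch in Python (the percentages are illustrative, not from a real test):

```python
# Anchoring math: two "disappointing" 15% wins beat one 30% goal.
baseline = 1.00

anchored_goal = baseline * 1.30      # the 30% lift we anchored on
two_small_wins = baseline * 1.15**2  # a 15% win, then building on it

print(f"Anchored goal:  {anchored_goal:.2f}x baseline")   # 1.30x
print(f"Two 15% wins:   {two_small_wins:.2f}x baseline")  # 1.32x
```

Two iterations on the "failed" treatment would outperform the very goal that made it look like a failure.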

The Recency Bias was discussed by Daniel Kahneman in his book, "Thinking, Fast and Slow." It describes how we put extra weight on the most recent thing we experience in any sequence. When looking at analytics data, we tend to forget the results of previous analyses.

To counter this in our daily decisions, we rate each idea that could increase revenue by the amount of evidence supporting it. This forces us to look at all of the data, not just the most recent query we've run.
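
What might that rating look like in practice? Here is a minimal sketch (the data sources and scoring scheme are my own hypothetical example, not a standard method):

```python
# A hypothetical sketch: rate each test idea by the breadth of its evidence,
# not by how recently we saw the supporting data.

SOURCES = {"analytics", "heatmaps", "session_recordings", "surveys", "past_tests"}

def evidence_score(supporting: set) -> int:
    """Count how many independent data sources back this idea."""
    return len(supporting & SOURCES)

ideas = {
    "shorten the checkout form": {"analytics", "session_recordings", "past_tests"},
    "add an exit-intent popover": {"surveys"},
}

# Test the best-supported ideas first.
for name, sources in sorted(ideas.items(), key=lambda kv: -evidence_score(kv[1])):
    print(f"{evidence_score(sources)} sources of evidence: {name}")
```

The idea with the broadest evidence gets tested first, no matter which report we happened to run last.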

Perception Is Not Reality

Our perceptions are shaped by a number of biases, and their ability to change reality for us is breathtaking.

For example, best practices often come from reading the literature of our industry. There is an abundance of articles written on what works and what doesn’t on the Web.

You could find several articles that tell you exit-intent popovers increase conversion rates. These come from respected sources, so you might come to the conclusion that exit-intent popovers are a great best practice.

The bias here is the Survivorship Bias. The industry tends to write about winning tests, not losers.

So exit-intent popovers may reduce conversion rates more often than they raise them, but we don't take that into account, since all of the articles we read are positive.

One of our tests showed that exit-intent popovers were not a good idea on one catalog ecommerce site. The owner was reluctant to believe our data until we found additional corroboration.

He had not only developed a survivorship bias but also had anchored himself, as he expected a significant increase in sales. I no longer recommend exit-intent popovers as vigorously, which may be an expression of my own recency bias.

We can be selective in our perceptions, as well. This is particularly common in research.

Many clients come to us saying, “Don’t worry about our mobile traffic. It doesn’t convert.” And if you dive into your analytics, you’ll find ample data to support that — low mobile conversion rates, high mobile bounce rates, low average order values on small screens.
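
And the numbers will back you up, if the segment report is all you run. Here is a hypothetical sketch of that dive (the CSV export and its column names are assumptions for illustration):

```python
import pandas as pd

# Hypothetical session-level export from your analytics tool.
# Assumed columns: device, converted (0/1), revenue.
df = pd.read_csv("sessions.csv")

print(df.groupby("device").agg(
    sessions=("converted", "size"),
    conversion_rate=("converted", "mean"),
    avg_order_value=("revenue", "mean"),
))
```

A report like this will dutifully confirm that mobile "doesn't convert." That confirmation is exactly the trap.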

Since mobile is a whole other animal, it’s easy to agree and focus on desktop traffic.

Of course, this is a big mistake, given the trends in the market. When we look at recordings of mobile visitors, we often see painfully difficult mobile experiences that can be corrected without a responsive website redesign.

Selective perception causes us to see what we want to see in the data. Confirmation bias causes us to consider only the data that reinforce our existing beliefs.

In all cases, the cure is additional data. Never rely on one focus group. Never let analytics alone dictate your decisions.

Work other sources into your decision-making process: click tracking, session recordings and split tests.
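
On the split-test side, the discipline is to act only when the data clears a bar. A minimal sketch, using only Python's standard library (the traffic numbers are made up):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: is the difference between two conversion rates real?"""
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Made-up numbers: control converts 500 of 10,000; treatment 580 of 10,000.
z, p = two_proportion_ztest(500, 10_000, 580, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # z = 2.50, p = 0.012
```

That is a 16 percent lift with solid evidence behind it, and it is worth building on, whatever we anchored on beforehand.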

Redesigns Are Big Collections Of Bias

We have rarely told a client that they need to redesign their site to improve conversion rates.

There are two reasons for this. The first is that we don’t learn anything from a redesign. The second is that a redesign is a big ball of bias-driven assumptions, which stacks the deck against us.

The biases we see in the process include the Pro-Innovation Bias, Stereotyping, Overconfidence and Blind-Spot Bias.

Good user experience designers are diligent research-based experts who gain a deep understanding of the audience they serve. However, they are often led astray by the novel, the unique or the innovative.

They overvalue an implementation because it is cool. This is the Pro-Innovation Bias.

Stereotyping may be the most virulent and damaging bias of the lot. Humans are designed to summarize and bucket-ize. However, we’ve been surprised many times when visitors don’t behave the way they are supposed to.

We’ve seen older, supposedly less tech-savvy visitors convert well on mobile devices.

What kind of a house does a person making $100,000 per year live in? A low-paid copywriter may imagine that they live in a large house. The highly paid creative director may assume they would have to rent.

Two stereotypes based on one piece of data. Which is right? Redesigns are based on limited data, and stereotyping is inevitable.

Overconfidence can prevent us from seeing our blind spots, or vice versa. Most designers are not as good at finding supporting data as my team is. They may rely on tools such as surveys and focus groups.

These self-reported forms of data ignore the fact that we lie when asked why we did something or what we would do. This can create a blind spot.

Likewise, we may miss some juicy insights by undervaluing focus groups and surveys. This is our blind spot.

We face the Ostrich Effect when we meet a prospect who is already deep into planning a redesign. They don't want to hear that they may already be on the wrong road. I wouldn't either.

Culture Is The Cure

To prevent these errors in the ranks, I recommend developing a cultural imperative for data-driven decisions. There was a time when collecting data was more expensive than just launching something new and seeing what happened.

This is no longer true. Tools are cheap. Data is plentiful. Let your team experience the confidence of making decisions with it.

We use culture to squeeze out bias. Our company motto is: “Don’t think. Know.”

We freely tell prospects that if we come up with 50 good ideas, fully half of them will not increase their revenue. We make no promises, and we don’t test until we’ve got supporting data in hand.

In short, a bias is a poorly supported assumption colored by our cognitive powers. A best practice is a data-supported discipline applicable to only one website audience. Your best practice is someone else’s bias.



How can you use data to help you overcome these withering biases?




About the author

Brian Massey
Contributor
Brian Massey is the Conversion Scientist at Conversion Sciences and author of "Your Customer Creation Equation: Unexpected Website Formulas of The Conversion Scientist." Conversion Sciences specializes in A/B testing of websites. Follow Brian on Twitter @bmassey.
