Dear Marketer: Every Average Lies, You Must Go Deeper

Written by amitpsharma | Published 2020/07/03
Tech Story Tags: marketing | performance-marketers | conversion-optimization | growth-marketing | digital-marketing | data-analysis | analytics

TLDR Averages are an easy way to aggregate information so that others can digest it easily, but on their own they offer little to no insight, which makes it necessary to dissect the numbers and go deeper. Dissect by segment: look at the data per segment. Dissect by distribution: see what makes up the average. Dissect the extremes: rule out outliers and statistically insignificant data, such as pages with high bounce rates but barely any visits, then investigate what could be wrong with the remaining pages by comparing them against your top performers.

“It’s a basic truth of the human condition that everybody lies. The only variable is about what…”
“Truth begins in lies…”
These quotes are from Dr. Gregory House, the iconic character played by Hugh Laurie in House M.D. These words could become one of the commandments for a marketer if we simply replace “Everybody Lies” with “Every Average Lies”.
The commandment would read: Every Average Lies, Dig Deeper.
Marketing dashboards and website analytics reports are full of averages. But in real life no one is “average” and no user experience is “average”. Averages are popular because they are an easy way to aggregate information so that others can digest it easily. But on their own they offer little to no insight, which gives rise to the need to dissect the numbers and go deeper.
Say John Doe comes to you and tells you, “Our average time on site is up by 30 seconds! It’s 2 minutes 30 seconds this month.”
Whenever you look at a metric, ask yourself, “What will I do differently based on this information?” So now, looking at the average visit duration, do you know any better what to do next? Any brilliant insights? You didn’t think so, right?
What should you do instead?
#1 Dissect by Segment — You need to look at the data per segment. 
Let’s say you’re looking into average revenue per visit, which is an important KPI for e-commerce stores. Say it’s $100 for John Doe’s store.
This number becomes far more insightful if you look at it per traffic source; you get more context.
Now you realize that the average revenue per visit (ARV) for organic visitors is $150 while organic accounts for 15% of traffic, whereas the ARV for referral traffic is $75 while referrals account for 30% of traffic.
This certainly helps you make a conscious call to boost your organic efforts while revisiting your referral platforms for optimization.
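As an illustration, here is a minimal pandas sketch of that segmentation, assuming a visit-level export with a traffic source and a revenue column; the column names and all figures are invented to mirror John’s numbers:

```python
import pandas as pd

# Hypothetical visit-level export; column names and figures are made up
# to mirror John's numbers ($100 blended ARV across 20 visits).
visits = pd.DataFrame({
    "source":  ["organic"] * 3 + ["referral"] * 6 + ["direct"] * 11,
    "revenue": [150, 200, 100,                     # organic
                50, 100, 75, 0, 125, 100,          # referral
                100, 90, 110, 100, 95, 105, 80, 120, 100, 100, 100],  # direct
})

# Revenue and share of traffic per segment instead of one blended number.
by_source = visits.groupby("source").agg(
    visits=("revenue", "size"),
    arv=("revenue", "mean"),
)
by_source["traffic_share"] = by_source["visits"] / len(visits)

print(by_source)                                 # organic: ARV 150, 15% share
print("Blended ARV:", visits["revenue"].mean())  # 100.0, hides the story
```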
#2 Dissect by Distributions — You need to look at distributions
Distributions show what makes up the average and let you look at the numbers in a much more manageable way.
All web analytics suites show average page views per visit. Let’s say John now comes up to you and says the average page views per visit is 2.89.
Now, this leads to a question: is “the more the merrier” true in this case? Are more pageviews better? Probably?
Well, what about looking at it from the perspective of conversions? How do you do that? You segment the data, keeping Page Depth as the dimension and Unique Visitors and Conversion Rate as the metrics.
Visitors who viewed only 1, 5, and 7 pages had conversion rates of 0.1%, 1.02%, and 12.5% respectively. Now you can see for yourself that conversion rate climbs with page depth, meaning that increasing pageviews per visit would probably be useful.
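If you want to compute that distribution yourself from raw visit data, a rough sketch could look like this; all column names and numbers below are invented for illustration:

```python
import pandas as pd

# Hypothetical visit-level data: pages viewed per visit, and whether it converted.
df = pd.DataFrame({
    "page_depth": [1] * 5 + [2] * 4 + [5] * 3 + [7] * 2,
    "converted":  [0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 1, 1, 1],
})

# Conversion rate per page-depth bucket instead of one blended average.
depth_dist = df.groupby("page_depth").agg(
    visitors=("converted", "size"),
    conv_rate=("converted", "mean"),
)
print(depth_dist)  # conversion rate climbs with page depth in this toy data
```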
Distributions are also insightful in the case of “totals”. Instead of just looking at total transactions or leads, you could look at the number distributed by “visits to transaction/lead”.
You may learn that for most e-commerce categories, most people are ready to buy during their first visit. For luxury items and real estate, the numbers would look quite different, with much less weight on first visits.
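As a sketch of the same idea applied to “visits to transaction”, again with made-up data, you look at the share of transactions by visit count rather than the single total:

```python
import pandas as pd

# Hypothetical: one entry per transaction, recording on which visit it happened.
visits_to_tx = pd.Series([1, 1, 1, 1, 1, 1, 2, 2, 3, 5],
                         name="visits_to_transaction")

# Share of transactions by number of visits before purchase.
print(visits_to_tx.value_counts(normalize=True).sort_index())
# In this toy data, 60% of purchases happen on the first visit; for luxury
# goods or real estate you'd expect the weight to shift to later visits.
```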
#3 Dissect the Extremes — You need to rule out outliers & statistically insignificant stuff
By definition, averages tend to be pulled toward outliers. You need to factor this into your analysis.
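A tiny made-up illustration of that pull: one 30-minute session drags the mean far above what a typical visitor experiences, while the median barely moves:

```python
from statistics import mean, median

# Nine ordinary session durations plus one 30-minute outlier (in seconds).
durations = [40, 55, 60, 62, 70, 75, 80, 90, 95, 1800]

print(mean(durations))    # 242.7, inflated by the single outlier
print(median(durations))  # 72.5, much closer to the typical visit
```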
Let’s say you want to identify your pages with the highest bounce rate. That is: go to Site Content -> All Pages and sort by Bounce rate.
You would find the data that gets displayed to be pretty much useless. Why? Because all the pages listed here have very few visits and high bounce rates.
The data for these pages is not statistically significant, so we need to use filters to exclude all low-traffic pages.
Click on ‘Advanced’ and set the minimum number of unique pageviews to a threshold that suits your needs, say 500 or 1,000, and go ahead with the analysis.
Then you can check what could actually be wrong with these pages, comparing the highest-bouncing product pages to the lowest-bouncing ones to form a hypothesis.
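That filter-then-sort step is also easy to reproduce outside the GA interface. Here is a sketch assuming a page-level export with unique pageview and bounce rate columns; the names, paths, and threshold are all illustrative:

```python
import pandas as pd

# Hypothetical page-level export; column names and figures are assumptions.
pages = pd.DataFrame({
    "page":             ["/promo-old", "/draft", "/blog/a", "/product/x", "/home"],
    "unique_pageviews": [3, 12, 850, 2400, 5100],
    "bounce_rate":      [1.00, 0.92, 0.74, 0.38, 0.55],
})

MIN_PAGEVIEWS = 500  # pick a threshold that suits your traffic levels

# Drop statistically insignificant pages first, then sort by bounce rate.
significant = pages[pages["unique_pageviews"] >= MIN_PAGEVIEWS]
print(significant.sort_values("bounce_rate", ascending=False))
```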
Things to further keep in mind while analyzing:
1. Use total numbers next to ratios: Let’s say John’s looking for the best-converting traffic sources on his website. Looking at the numbers, we find that Referrals are the best, with a higher conversion rate than Organic.
But when we add more context by including the number of visitors and transactions, we find that the “top 5” converting sources account for less than 10% of total transactions.
Ergo, having total numbers (transactions) next to the ratios (conversion rates) gives us more context; see the sketch after this list.
2. Choose the right metrics for analysis: The key is to identify metrics that provide actionable insights. You need to be able to look at a metric, ask “So what?”, and have an answer.
“Our top ‘exit’ pages are blog posts.” So what? John, we need to do a better job directing readers elsewhere after they finish a post.
“Conversion rates for our top Instagram campaigns are way up”. So what? John, we should increase our budget for Instagram campaigns.
“Our average time on site is up by 30 seconds!” So what? Em… err… no idea, John.
Ergo, if you have an answer to John’s “So what?”, you’re looking at the right metric; otherwise, you’re not.
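To make point #1 above concrete, here is a sketch with invented numbers: sorted by conversion rate alone, tiny sources float to the top, while a transaction-share column exposes how little they actually contribute:

```python
import pandas as pd

# Hypothetical per-source summary; all figures are made up.
sources = pd.DataFrame({
    "source":       ["niche-blog", "forum-x", "organic", "referral", "paid"],
    "visits":       [40, 90, 52000, 31000, 18000],
    "transactions": [4, 7, 1100, 900, 390],
})
sources["conv_rate"] = sources["transactions"] / sources["visits"]
sources["tx_share"]  = sources["transactions"] / sources["transactions"].sum()

# Sorted by the ratio alone, the low-volume sources look like winners...
print(sources.sort_values("conv_rate", ascending=False))
# ...but tx_share shows they contribute under 1% of total transactions.
```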
Long story short: humans are prone to selection bias, confirmation bias, false-positive bias, being swayed by outliers, and so on.
The list is long and full of terrors.
To account for these biases, data should be looked at granularly and deeply.
In addition to the dissections discussed above, one can dissect by age, gender, site searchers, location, network, browser, device, and so on.
The list is long and full of colors.
Each color of thoughtful dissection leads to a beautiful masterpiece of an analysis. :)

Written by amitpsharma | Conversion Optimizer | Marketer | Co-founder at Ekamoira Tech | Ex-Merkle
Published by HackerNoon on 2020/07/03