The Secret Sauce Of Marketing Analytics: The 5W1H Framework
Today, I am going to reveal my secret sauce for marketing analytics. This simple method is so powerful that anyone can start applying it today to any sort of descriptive analysis, and it will start producing results from day one.
What’s better, this method does not require any sort of prior knowledge or skill set such as statistics or database querying: anyone from a new grad to the seasoned veteran can use it to get better results.
The catch? It’s hardly a secret; everyone knows how to do it. In fact, you probably learned it back in primary or secondary school. The method I am talking about is the 5W1H framework.
The 5W1H Framework
This framework is used most often in journalism, but applies to any information gathering practice. It consists of six questions that together form a complete story of an event:
- Who is it about?
- What happened?
- When did it occur?
- Where did it take place?
- Why did it happen?
- How did it occur?
A similar framework can be applied to a marketing event:
- Who (which audience/segment) was affected?
- What (KPI or metric) was affected?
- When (week/day/hour) did the event occur or start?
- Where (campaign/creative/email/webpage/etc.) was the change seen?
- Why did it happen; what was the cause?
- How can the results be applied on a broader scale to form best practices?
The single crucial point in using this framework is to drill down as deeply as possible for each question, particularly the first four.
Rushed Investigations Are Risky
In too many of the marketing analyses I have seen, the first four questions (who, what, when and where) are glossed over, while all the pressure is on answering “why” as quickly as possible.
The problem? Neglecting those first four questions can lead to wildly incorrect conclusions.
Think of it this way. Answering the question, “why did the performance drop occur?” with “landing page changes” is akin to answering the question, “why did the rocket launch fail?” with “mechanical defects.” It explains nothing, offers no learnings, and does nothing to prevent the same mistake from happening again.
Even worse, skimping on the investigation may lead not only to insufficient conclusions, but also to incorrect ones.
In today’s fast-paced world of digital marketing, there are often multiple changes being made at once across dozens of accounts and properties, with tests running in parallel. Without sufficient investigation, the effect of one change can easily be mistaken for the effect of another. This can lead to long-term losses, as bad practices are mistakenly adopted and good practices skipped over.
Below, I will detail how to use the 5W1H framework to ensure that your conclusions are as accurate as they can be.
Steps For Using The Framework
In preparation for the drill-down, I recommend that you start with a report, database, or some information source that has as comprehensive a dataset as possible. If you are using Excel and all the relevant data does not fit at once, then you may need to download multiple reports as the investigation progresses.
For most marketing analyses, I recommend investigating in the following order: When, Where, What, Who. Only after completing these four should you move on to Why and How.
When

If you are already investigating an event, you already have some sense of when it occurred. But dig deeper: the question may start out as a week-over-week change in performance, but drill in to find the specific day the change occurred. If the data is available, you may even try to pin down the approximate time of day.
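To make the “which day” drill-down concrete, here is a minimal sketch in Python. It scans a daily KPI series for the split point with the largest gap between the before and after averages; the day names and CTR figures are invented for illustration.

```python
# Hypothetical sketch: locate the day a daily KPI series shifted by
# picking the split point that maximizes the gap between the mean
# before and the mean after. All data below is illustrative.

def find_change_day(days, values):
    """Return the day at which the before/after mean gap is largest."""
    best_day, best_gap = None, 0.0
    for i in range(1, len(values)):
        before = sum(values[:i]) / i
        after = sum(values[i:]) / (len(values) - i)
        gap = abs(after - before)
        if gap > best_gap:
            best_day, best_gap = days[i], gap
    return best_day

days = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
ctr = [2.1, 2.0, 2.2, 1.4, 1.3, 1.5, 1.4]  # CTR (%) per day

print(find_change_day(days, ctr))  # the drop starts on Thu
```

A real series would be noisier than this, but even a crude split like this one narrows a week-over-week question down to a single day worth examining.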
Once you determine the “change point,” decide on two time ranges to compare against: one for post-change, and one for pre-change (baseline). Getting this step right is critical to the rest of the investigation.
Set both time ranges equal in length, and as long as possible, in order to increase the amount of data being analyzed. They should also be as close together in time as possible, but if there are known time-series patterns such as day-of-week effects, account for them (for example, compare Mon/Tue of last week vs. Mon/Tue of this week, rather than Sat/Sun vs. Mon/Tue).
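As an illustrative sketch of this step, the following Python snippet builds equal-length pre/post windows around a known change date, shifting the baseline back by a whole number of weeks so both windows start on the same weekday. The dates and window length are hypothetical.

```python
# Hypothetical sketch: build equal-length pre/post comparison windows
# around a known change date, aligned on weekday so day-of-week
# patterns line up. Dates below are made up for illustration.
from datetime import date, timedelta

def comparison_windows(change_day, length_days=7):
    """Post window starts on the change date; the pre window covers the
    same weekdays a whole number of weeks earlier, so the two ranges
    never overlap and always start on the same day of the week."""
    weeks_back = -(-length_days // 7)  # round up to whole weeks
    post_start = change_day
    post_end = change_day + timedelta(days=length_days - 1)
    pre_start = change_day - timedelta(weeks=weeks_back)
    pre_end = pre_start + timedelta(days=length_days - 1)
    return (pre_start, pre_end), (post_start, post_end)

pre, post = comparison_windows(date(2024, 3, 14), length_days=7)
print(pre, post)  # both windows start on the same weekday, one week apart
```

Rounding the offset up to whole weeks is what keeps the weekday alignment intact even when the window is not an exact multiple of seven days.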
Where

Which specific campaigns, creatives, keywords, emails, site pages, etc., were affected? This is where starting out with a granular dataset comes in handy. Drill down until you can drill down no more: that is, until no single finer-grained element accounts for a large portion of the observed change. Longer time ranges help here, because with too little data, variance (noise) can quickly overwhelm the actual impact.

Keep in mind that the objective is to clarify the scope of the change, not to force it narrower. Do not push a drill-down the data does not justify; a change occurring across a number of campaigns at once is revealing in itself.
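As a sketch of this drill-down (the campaign names and click counts are invented), the following ranks each campaign by its contribution to the overall change between the pre and post windows, which quickly shows whether the shift is concentrated in one place or spread broadly:

```python
# Hypothetical sketch: rank campaigns by their contribution to the
# total change in a metric between the pre and post windows.
# Assumes both dicts share the same keys; all numbers are made up.

pre = {"brand": 1200, "generic": 800, "retargeting": 500}   # clicks, pre window
post = {"brand": 1150, "generic": 300, "retargeting": 480}  # clicks, post window

total_delta = sum(post.values()) - sum(pre.values())
contributions = sorted(
    ((name, post[name] - pre[name]) for name in pre),
    key=lambda item: abs(item[1]),
    reverse=True,
)
for name, delta in contributions:
    share = delta / total_delta if total_delta else 0.0
    print(f"{name}: {delta:+d} clicks ({share:.0%} of total change)")
```

In this made-up example one campaign accounts for nearly all of the drop, which would justify drilling further into it; if the deltas were spread evenly, that breadth would itself be the finding.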
What

“Which KPI or metric was affected?” Does this seem like a simple question? Not once you drill down. Using the example of a CPC-priced paid campaign, we can break ROI down as follows:
ROI = Revenue / Cost – 1
    = (Orders * AOV) / (Clicks * CPC) – 1
    = (Impressions * CTR * Conversion Rate * AOV) / (Impressions * CTR * CPC) – 1
The key here is to never be satisfied with looking at change in a single, high-level metric. Almost all KPIs can be broken down into finer metrics, and depending on which component metric caused the change in the higher-level metric, the investigation may go down a completely different path.
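As a quick sanity check of this kind of decomposition, here is a Python sketch that computes ROI both from the top-level revenue and cost figures and from the component metrics, confirming the two agree. All figures are illustrative.

```python
# Illustrative sketch: compute ROI from its component metrics for a
# CPC-priced paid campaign and verify the decomposition matches the
# top-level definition. All numbers below are made up.

impressions = 100_000
ctr = 0.02         # click-through rate
cpc = 0.50         # cost per click
conv_rate = 0.03   # conversion rate (orders per click)
aov = 40.0         # average order value

clicks = impressions * ctr
orders = clicks * conv_rate
revenue = orders * aov
cost = clicks * cpc

roi_top = revenue / cost - 1
roi_decomposed = (clicks * conv_rate * aov) / (impressions * ctr * cpc) - 1

print(round(roi_top, 2), round(roi_decomposed, 2))  # both equal 1.4
```

The practical point is that a flat ROI can hide offsetting moves in its components (say, CTR up while conversion rate is down), so each component is worth inspecting separately.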
Who

Now that you have figured out the When, Where, and What, you will start to see characteristics of the affected audience: which segment they belong to and what part of their purchase journey was impacted.

This is the part of the analysis in which results are put in the context of user behavior, so start thinking about what may have caused the observed change. For channels based on audience targeting, such as email or retargeting, Who may overlap with Where.
Why

This is where it all comes together. If you have come this far, then you have discovered when the change occurred, its scope, which metrics were impacted, and how the audience was affected.

From here, all you need to do is tap into your inner Sherlock Holmes and ponder what could have produced the change exactly as you observed it. Experience counts in this step: the more of these patterns and situations you have encountered, the faster you can connect the dots.

If the cause of the change does not seem to reside within your area of management, you may need to reach out to colleagues to see if they are aware of anything that could have caused it.
How

Lastly, and most importantly, take the result discovered above and generalize it to form best practices. How can we apply this success on a broader scale? How can we prevent this situation from happening again? If these questions cannot be answered, the investigation has been of limited value.

That said, this does not mean one should feel pressured to answer How when the investigation of Why was rushed or incomplete. Forcing a wrong answer to Why will produce a wrong recommendation for How, which is oftentimes worse than no recommendation at all.
Digital marketing is all about testing and iterating, and sometimes it feels as though we cannot wait to get results and move on to the next initiative. However, obtaining speed at the price of accuracy defeats the purpose. Oftentimes, it is well worth the effort to investigate thoroughly and obtain quality learnings that compound over time.
For Those Asking Questions
Focus not on getting answers, but on getting learnings. Do not demand immediate answers, but rather expect a reasonably detailed answer within a reasonable time frame. If action needs to be taken quickly, ask for a preliminary answer within a short time frame, and detailed results later on.
For Those Trying To Answer Questions
Do not feel pressured to be able to answer all questions immediately, but rather focus on answering each question well, and deriving solid actionable recommendations. Understand the potential magnitude of impact of your answers, and proactively request more time/resources/data when justified.