This is the last post on 2022 readiness in the Series C Growthonomics series. So far, we have discussed how, as the head of data/analytics, you can get more value from your data by creating an analytics agenda that helps you identify the 10 to 15 problems worth working on, and how to operationalize that agenda in your organization.
Hopefully, by this time, you have figured out the top projects for your organization and started the process of operationalizing them. Going back to the food delivery app company we have been using as an example, let’s say the head of finance walks up to the Analyst and asks, “We have spent half a million on streaming; I want to understand the LTV for them.” The Analyst is excited to hear this, as one of the analytics agenda projects is to improve LTV by 10%, which would yield $20M in incremental revenue.
So, the Analyst goes and looks at acquisition and various retention curves and comes back with the answers. The head of finance puts these numbers in an Excel sheet, shifts and sorts the data across 10 different sheets, and then says the retention curve being used might not be the best one, since the pricing model changed last year. So they try different numbers. This continues for a few weeks, and then the Analyst learns that the company is filing for an IPO later in the year. With this new context, the Analyst starts looking at the numbers again; the only difference is that now, instead of looking at LTV overall, they look at LTV for every acquisition channel. After a few weeks of back and forth, the head of finance shows the numbers to the head of product, who does not agree with the methodology: “We added an option to pause the subscription. There’s a problem, and your retention model no longer holds.” So they come in and start tweaking the model, and another month passes. Sure enough, by that time, the marketing person comes in and says, “Hey, we went viral on TikTok during such and such a period, and the retention numbers we are assuming are not reflective of what’s realistic.”
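To make the back-and-forth above concrete, here is a minimal sketch of the kind of calculation the Analyst is iterating on: estimating LTV per acquisition channel from a retention curve. The channel names, margin, retention rates, and the simple sum-of-retained-margin model are all hypothetical assumptions for illustration, not the company’s actual methodology.

```python
# Minimal sketch: LTV per acquisition channel from a retention curve.
# All channel names, margins, and retention rates are hypothetical.

def ltv_from_retention(monthly_margin, retention_curve):
    """Expected margin per acquired user: sum of margin weighted by
    the fraction of the cohort still active in each month.

    retention_curve[i] = fraction of the cohort still active in month i.
    """
    return sum(monthly_margin * retained for retained in retention_curve)

# Hypothetical 6-month retention curves per acquisition channel.
channels = {
    "paid_social": [1.00, 0.55, 0.40, 0.32, 0.27, 0.24],
    "organic":     [1.00, 0.70, 0.58, 0.50, 0.45, 0.42],
}

monthly_margin = 8.0  # assumed margin per active subscriber per month

for channel, curve in channels.items():
    ltv = ltv_from_retention(monthly_margin, curve)
    print(f"{channel}: LTV ≈ ${ltv:.2f}")
    # paid_social: LTV ≈ $22.24, organic: LTV ≈ $29.20
```

Notice how every dispute in the story maps to an input here: a pricing change or a pause-subscription feature reshapes the retention curve, and a viral TikTok period makes one cohort’s curve unrepresentative. That is exactly why the inputs need to be agreed upon up front.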
If you pull the Analyst aside and ask how things are going on LTV, he would most likely say, “I don’t know; we are changing direction every day,” which is just not enjoyable. There is no process to it, which reminds me of a quote by W. Edwards Deming:
“If you can’t describe what you are doing as a process, you don’t know what you’re doing.” – W. Edwards Deming
And if we don’t have a process for it, this situation will continue, potentially until mid-2022 or maybe the end of 2022, and still the LTV model will not be finalized.
To avoid this scenario, I want to propose one analytics process and explain what it would look like if that process existed. One such methodology is the BADIR framework, which has various sub-steps within each stage. This process is also discussed in detail in my book “Behind Every Good Decision”: chapter 4 covers the entire BADIR framework along with many methodologies and their usage.
BADIR is an acronym for five terms:
B – Business question
A – Analysis plan
D – Data collection
I – deriving Insights
R – actionable Recommendation
But first, let’s see how this process came to life.
I came from the world of academics; my first role in the real business world was as a ‘senior marketing analyst’ at Adobe. The real business world was a shocker for me because, in academia, you have simulated data and your own timelines. But in the business world, they want answers yesterday, the timelines are tight, and the data is questionable.
It felt like a rat race. I did not know much about marketing back then. I noticed that during the course of an analytics project, I would get pulled in different directions by different stakeholders within the same organization. It was pretty confusing for me to understand why various stakeholders of the same organization had conflicting needs and desires.
Though I failed initially in analytics, those failures helped me learn how to make analytics successful. I slowly started sensing that the disconnect between the agendas of various departments caused most analytics projects to break down. Eventually, it struck me that a step-by-step framework could save the day. By the time I left PayPal, I already had this analytics framework in mind; I had worked with it for seven years and found success most of the time.
Let’s go back to the Analyst working on the LTV project to understand how this framework would have made life easier for him.
He would start by asking questions about the need for this analysis, so the whole story about the IPO would have come out early. He would have asked a series of questions: Who are the stakeholders? (They would have been roped in earlier.) What actions do you want to take? Do you need this at a user, cohort, or channel level? All of this information would have surfaced early. It’s like working backward: understanding how the model will be used and what decisions will be made based on the insights. The intention here is to drive that $20M.
Then, if they spend some time on the analysis plan, they will talk about the methodology, the hypotheses, and the drivers of LTV. They ask: is this channel even a driver of LTV? Should organic users, who come in and sign up by themselves, be treated the same as acquired users, or differently? All of these essential hypotheses are formed before embarking on the project. They use these hypotheses to do proper planning and gather input from all the stakeholders, and at this point they see that finance and product are not seeing eye to eye, and the same goes for marketing. This is the right time and place to align everyone on a common view of the problem and the methodology.
At this point, they’ll also figure out whether they need this model at a customer level. If so, they might need machine learning, and if yes, what are the constraints? Whatever the answers are will define their methodology. Next, data collection proceeds according to the analysis plan, depending on the agreed-upon methodology and data definitions. The next step is deriving insights through a step-by-step approach that follows the methodology and bakes in the business context. Finally, they make an actionable recommendation.
This wouldn’t lead to chaos later because the Analyst spent time defining the problem, aligning the stakeholders, and creating a plan with them. Following a structured approach helped them be better informed and aligned till the end of the process. This is why a process like BADIR is crucial for successful analytics.
[…] your analysts, scientists, and data engineers have an analytics/data science process, meaning a standard method by which they solve any problem? Refining the problem statement, […]