We are surrounded by data. What can we do with it? For what? How? What can we expect, and what not? What are the common errors? Does size matter? Open source or commercial tools? Here you should find some tips for charting your own journey through the data realm. Bon voyage!
I still remember a nearly hilarious situation I faced in my first job as a data analyst. The CEO came to me and asked for tons of data, very important for a strategic decision. Wow! Panic! Just graduated from college. Just landed in this job. No clue about the business. No clue about the data structure. No clue about KPI names. No clue about anything. After some minutes of panic, I took a deep breath and tried to deliver what I had been asked for. I promise I did the best I could: I gathered data from different departments (no data warehouse, no single source of truth) and different people, in very different formats, I used some advanced and fancy stuff in Excel, and, after 10 hours of intense work, I delivered a kind-of-report. I really had no idea what I was doing. I had no idea what data I was delivering. Some days later I went back to my CEO and asked him how useful my data had been. His answer: “Which data? Ah, that report. Well, we did not use it. We took decision XXX based on some market research the CMO found in a blog.” I suppose I should say thanks.
Sorry. Your data setup is probably not right. You should consider reading this.
After years of experience, you’ve probably heard these stories many times. The Marketing Manager (a random Manager example) requires some data. Let’s depict some standard scenarios. The requested data corresponds to…
1. … clicks, sessions, bounces, etc. This one should be easy. The Web Analytics Manager performs this task easily (it’s part of their basic skill set), probably by applying some complex advanced segments to the data (easy does not necessarily mean simple). Nowadays, web analytics implementations tend to be very complex, mainly because they need to cover a lot of business cases. Simple, right? Well, now imagine that, for some unfortunate reason, the Web Analytics Manager is on vacation. Panic! The request then goes to, let’s say, a Campaign Manager. Of course, they have access to the web analytics tool, and so they try to retrieve the requested data. A bit of panic appears, as the data does not seem coherent (of course not: they are not applying those complex advanced segments they should be applying). They then search for documentation on the topic and… surprise! There is no such documentation. Finally they deliver some numbers, but everyone knows those numbers might not be totally reliable. In the end, as requests get more complex, the process to retrieve the data gets more complex as well. If that process is not clear to all stakeholders, the result is a lack of trust in the delivered data, leading to a lack of trust in the data strategy (if such a thing exists in the company).
Here I already find my first three motivations:
- Motivation 1: there is a lack of proper documentation. Knowledge transfer is virtually nonexistent in many e-commerce companies, especially for data-related topics.
- Motivation 2: business complexity translates directly into data complexity. Not every stakeholder understands this implication.
- Motivation 3: a wrong data strategy leads to a lack of trust and, even worse, to wrong decisions.
2. … revenues, sales, etc. This one gets a bit trickier. The Marketing Manager pings somebody in BI, or in Finance. Typically the request is incomplete or poorly written: time frames missing, before/after refunds unspecified, etc. Even such simple requests normally take 2–3 iterations, leading, again, to a lack of trust in the provided data. In some cases a variation of this scenario takes place: reports and data are built by manually joining data retrieved from different data sources, as the full data map is not clear to everyone.
Again, two more motivations appear:
- Motivation 4: it’s very hard to write clear requests.
- Motivation 5: outside our comfort area, finding data can be a challenge. Even when we have a data warehouse, or a nice-and-expensive-but-totally-useless BI tool.
- Motivation 6: it’s easier to request than to retrieve, and it’s easier to retrieve than to process.
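The before/after-refunds ambiguity above can be made concrete with a tiny sketch (the table and column names are invented for illustration): the very same orders yield two different “revenue” figures, which is exactly why an imprecise request triggers those 2–3 iterations.

```python
# Illustrative only: three orders, some with refunds.
# "Revenue last month" is ambiguous until the requester says
# whether refunds should be deducted.
orders = [
    {"order_id": 1, "amount": 120.0, "refunded": 0.0},
    {"order_id": 2, "amount": 80.0,  "refunded": 80.0},   # full refund
    {"order_id": 3, "amount": 50.0,  "refunded": 10.0},   # partial refund
]

# Revenue before refunds (gross).
gross_revenue = sum(o["amount"] for o in orders)

# Revenue after refunds (net).
net_revenue = sum(o["amount"] - o["refunded"] for o in orders)

print(gross_revenue)  # 250.0
print(net_revenue)    # 160.0
```

Both numbers are “correct”; they simply answer different questions. A clear request names the definition up front.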
3. … data that has already been requested some time (many times?) before. This one is quite disappointing. There is nothing more frustrating, in either direction, than performing a recurrent request, or being asked for the same thing time after time. Assume for a second that such data is indeed available. Why is recurrently needed data not easily accessible? Even worse, what if we have (as I mentioned in the previous paragraph) a very nice BI tool? Why are some users reluctant to use self-service data platforms? Now assume the data isn’t available. Tough times are about to come: it’s time to reach out to IT to start gathering this data. It is normally very hard for a BI/Data department to write clear specifications for IT, for several reasons: lack of knowledge of the platform, lack of database architecture knowledge, etc.
With this, two further motivations appear:
- Motivation 7: having a BI tool does not ensure self-service. Having a self-service platform does not ensure data availability.
- Motivation 8: communication between BI and IT can be a struggle.
4. … data or analysis that we don’t know whether it can be accomplished, or data whose intended use upon delivery is unclear. The first challenge when receiving a request, or when performing it, is to determine whether it can be done (assuming it makes sense at all). Many analysts work directly with the data, without designing a plan for the analysis or request. That is, both requesters and analysts work without an analysis framework, even when it is clear that the analysis will take some time to finish, probably due to its complexity. A different case appears when the request comes from the CEO. We have to admit that it is very hard to say no to our CEO. However, the CEO does not know everything and is not always right. Even the CEO’s requests need to be challenged, understood, and accepted.
With this, we find my two final motivations:
- Motivation 9: working with an analysis framework is a must-have.
- Motivation 10: determine whether a request (for data or for an analysis) makes sense. Find a way to challenge every single request.
The Decalogue: the 10 Foundations of Bed & Breakfast Analytics
With my thoughts on the desk, and the motivations I derived from them, I’m ready to state my Decalogue.
1. Burn the silos! Managing data requires breadth across verticals and depth within each one. Skill silos are no longer suitable.
2. Complexity matters! Understand how business complexity affects data complexity.
3. Better alone than… No data is better than wrong data.
4. Write, write, and write. Documentation is a must-have. Learn how to document and learn how to request.
5. Go beyond your comfort area. You should consider expanding your comfort area. Better yet, consider not having a comfort area at all.
6. Bring order to chaos. Narrow your analysis: understand the need, design a framework, and only then retrieve data.
7. Communication is the key. Your CEO does not care about regression models, decision trees, or how fast your database engine is. He wants a way to keep a sustainable and profitable business.
8. Choose wisely. The right tool for the right set-up. Self-service is not always the best solution.
9. Don’t rush! Data is a path with some mandatory steps. Cheating leads to frustration and lack of trust.
10. So, how are you doing? Integrate data. Move away from data silos. Design KPIs, reports, and dashboards based on integrated data.
So, what’s next?
With all this, I want to share with you how I measure, why I measure, and how I analyze, in the hope that you will join me in walking the learning curve of the data world. In a Bed & Breakfast hotel you share your experiences with many others, and you get a clean and cheap base for sightseeing. That is exactly what I intend to do here: every two or three weeks I will share my thoughts, tips, tools, and techniques. Everything I know will be shared. I’m willing to do so!
Hope you find this interesting, and welcome onboard!