A Little Bit of Data Can Be a Time-Consuming Thing
I had an experience over the past week that, in hindsight, I really should have been able to avoid. The situation was basically this: several different people had made comments in passing about how we were probably “overcommunicating” to our database. “Overcommunication” being the tactful way to say “spamming.” In this case, I can actually trace the perception back to at least two different highly anecdotal events, which then spawned comments that led to assumptions, and so on.
Now, I am all for diligent database management, especially when it comes to how often and with what content we communicate with our contacts. My general sense was that we could be doing better, but we were far from reaching a crisis point (I lived through a situation at another company where we did reach that crisis point, and there were plenty of telltale signs leading up to that). “I can pull some quick data on that to at least get some basic facts circulated,” I innocently thought. And, that’s what I did.
I knew going in that, while the data was one thing, the definition of “good” vs. “bad” was likely all over the map, so I wasn’t likely to change many people’s opinions as to the situation by simply sharing the data. So, I shot an e-mail out to a group of interested parties and told them I had the data, and I’d be happy to share it, if they shared with me their opinions as to what an acceptable maximum of communications per week and per month would be.
As I suspected, I got a wide range of responses, and most of the responses had some form of qualifier — well-founded qualifiers regarding the type of communication, actually. So far, so good.
I then shared the data that I had spent 15 minutes compiling in a way that made for easy analysis, still knowing that there was no clear good/bad definition, and no clear hypothesis being tested or action being planned that this analysis would influence. The data did show a few things unequivocally — really just highlighting that the concerns were somewhat well-founded and that discussions should continue amongst the people who already tacitly owned the situation. But it also spawned requests for additional data that were more curiosity-driven than actionability-driven. Several people asked that the data be pulled with their particular qualifiers addressed. Most of these people were in no position to actually take any action based on the results. And, unfortunately, as can happen with reporting and analysis systems — applying the qualifiers would have turned the analysis into a highly manual, multiple man-hours exercise, whereas the high-level, basic pull was a 15-minute task.
On the one hand, I could ding our data storage system. By golly, Tenet No. 1 of good BI/DW design is to design for flexibility, right? In this case, the system limitations are actually a boon — they give me an out for simply saying, “No,” rather than the much more involved discussion that begins, “Why?”
It’s a punt, I realize. And not an out I would take if it meant throwing anyone in IT under a bus.
My point is that “interesting” can be a siren song that drowns out the pragmatism of “actionability.”