The Analyst Skills Gap: It's NOT Lack of Stats and Econometrics
I wrote the draft of this post back in August, but I never published it. With the upcoming #ACCELERATE event in San Francisco, and with what I hope is a Super Accelerate presentation by Michael Healy that will cover this topic (see his most recent blog post), it seemed like a good time to dust off the content and publish this. If it gives Michael fodder for a stronger takedown in his presentation, all the better! I’m looking forward to having my perspective challenged (and changed)!
A recent Wall Street Journal article titled “Business Schools Plan Leap Into Data” covered the recognition by business schools that they are sending their students out into the world ill-equipped to handle the data side of their roles:
Data analytics was once considered the purview of math, science and information-technology specialists. Now barraged with data from the Web and other sources, companies want employees who can both sift through the information and help solve business problems or strategize.
That article spawned a somewhat cranky line of thought. It’s been a standard part of presentations and training I’ve given for years that there is a gap in our business schools when it comes to teaching students how to actually use data. And, the article includes a quote from an administrator at the Fordham business school: “Historically, students go into marketing because they ‘don’t do numbers.’” That’s an accurate observation. But, what is “doing numbers?” In the world of digital analytics, it’s a broad swath of activities:
- Consulting on the establishment of clear objectives and success measures (…and then developing appropriate dashboards and reports)
- Providing regular performance measurement (okay, this should be fully automated through integrated dashboards…but that’s easier said than done)
- Testing hypotheses that drive decisions and action using a range of analysis techniques
- Building predictive models to enable testing of different potential courses of action to maximize business results
- Managing on-going testing and optimization of campaigns and channels to maximize business results
- Selecting/implementing/maintaining/governing data collection platforms and processes (web analytics, social analytics, customer data, etc.)
- Assisting with the interpretation/explanation of “the data” — supporting well-intended marketers who have found “something interesting” that needs to be vetted
This list is neither comprehensive nor a set of discrete, non-overlapping activities. But, hopefully, it illustrates the point:
The “practice of data analytics” is an almost impossibly broad topic to be covered in a single college course.
Two things about the WSJ article bothered me:
- The total conflation of “statistics” with “understanding the numbers”
- The lack of any recognition of how important it is to actually be planning the collection of the data — it doesn’t just automatically show up in a data warehouse
On the first issue, there is an on-going discussion about the extent to which statistics and predictive modeling should be a core capability and a constantly applied tool in the analyst’s toolset. Michael Healy argued compellingly in a blog post earlier this year that statistics, econometrics, and linear algebra are must-have skills for the web analyst. As he put it:
If the most advanced procedure you are regularly using is the CORREL function in Excel, that isn’t enough.
I’ve…never used the CORREL function in Excel. It’s certainly possible that I’m a total, non-value-add reporting squirrel; obviously, I’m not going to recognize myself as such if that’s the case. I’ve worked with (and managed) various analysts who have heavy statistics and modeling skills, and I relied on those analysts when conditions warranted. Generally, that was when we were sifting through a slew of customer data (profile and behavioral) looking for patterns that would inform the business. But this work accounted for a very small percentage of all the work that analysts did.
I’m a performance measurement guy because, time and again, I come across companies and brands that are falling down on that front. They wait until after a new campaign has launched to start thinking about measurement. They expect someone to deliver an ROI formula after the fact that will demonstrate the value they delivered. They don’t have processes in place to monitor the right measures to trigger alarms if their efforts aren’t delivering the intended results.
Without the basics of performance measurement (clear objectives, KPIs, and regular reporting) there cannot be effective testing and optimization. In my experience, companies with a well-functioning, on-going testing and optimization program are the exception rather than the rule. And companies that lack the fundamentals of performance measurement yet try to jump directly to testing and optimization find themselves bogged down when they realize they’re not entirely clear what it is they’re optimizing toward.
Diving into statistics, econometrics, and predictive modeling in the absence of the fundamentals is a dangerous place to be. I get it: part of performance measurement and basic analysis is understanding that just because a number went “up” doesn’t mean the change was anything more than noise in the system. Understanding that correlation is not causation is important. It’s an easy concept to overlook, but it doesn’t require a deep knowledge of statistics to sound an appropriately cautionary note on that front. Nine times out of ten, it simply requires critical thinking.
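To make the “up doesn’t mean anything” point concrete, here’s a minimal sketch (all numbers invented for illustration) that simulates a metric whose true value never changes at all. Even so, a naive week-over-week comparison shows an “increase” about half the time, purely from noise:

```python
# Sketch: a perfectly flat metric, viewed through random noise, still
# "goes up" in roughly half of all week-over-week comparisons.
# The true_value and noise_sd figures below are hypothetical.
import random

random.seed(42)

true_value = 10_000   # hypothetical weekly conversions (never changes)
noise_sd = 500        # hypothetical week-to-week noise

trials = 1_000
increases = 0
for _ in range(trials):
    last_week = random.gauss(true_value, noise_sd)
    this_week = random.gauss(true_value, noise_sd)
    if this_week > last_week:
        increases += 1

print(f"Metric 'went up' in {increases / trials:.0%} of comparisons")
# With no real change in the underlying metric, the "up" weeks are pure noise.
```

Nothing here requires econometrics; it’s the kind of sanity check that critical thinking alone should prompt before anyone celebrates an uptick.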
None of this is to say that these advanced skills aren’t important. They absolutely have their place. And the demand for people with these skills will continue to grow. But, implying that this is the sort of skill that business schools need to be imparting to their students is misguided. Marketers are failing to add value at a much more basic level, and that’s where business schools need to start.