General

Am I Ever Behind on Posting…

August was a little crazy for me:

  • I changed jobs — left Nationwide to become Director, Measurement and Analytics at Resource Interactive — which is 1000% the “right” move, but made for a hectic/stressful month
  • Back-to-school time, which was more than just getting our kids ready — my wife ran our two sons’ elementary school’s entire supply sale…and my “I’ll show you a few tricks in Excel to help you stay organized” offer morphed into a full-blown custom ERP system built in MS Access; August was the month when all the supplies arrived (think almost 10,000 no. 2 pencils…) and had to be divvied up; I did no divvying, but there were a number of late-breaking report requests; at last count, the database had over 20 tables (it’s almost a fully normalized database), over 40 queries, 12 forms, and 20+ reports; AND…it’s now been extended to also handle the production of the school’s student directory; gotta love MS Access!
  • Company, company, company — two visits from friends in Texas, two visits from my parents, a visit from my in-laws, and my mother-in-law moved in for six weeks to convalesce from surgery…all in a 3-week period in August

I’ve got one more good customer data management post in me that needs to get written, at which point I expect to be shifting over to more web analytics-y, social media measurement-y posts going forward.

And…as I played around with Drupal for a couple of projects over the past couple of months, I realized that the theme that I settled on after weeks of experimentation on this blog…is one that was built for WordPress to mimic one of the Drupal default themes! How embarrassing!

Please be patient! My life will settle back down soon (I hope). In the meantime, if you’re going to be in Columbus in the middle of September, consider stopping by this month’s Web Analytics Wednesday on September 16th!

General, Presentation

PowerPoint / Presentations / Data Visualization

I wrote a post last week about PowerPoint and how easy it is to use it carelessly — to just open it up and start dumping in a bunch of thoughts and then rearranging the slides. That post wound up being, largely, a big, fat nod to Garr Reynolds / Presentation Zen. Since then, I’ve been getting hit right and left with schtuff that’s had me thinking more broadly about effective communication of information in a business environment:

Put all of those together, and I’ve got a mental convergence of PowerPoint usage, presenting effectively (which goes well beyond “the deck”), and data visualization. These are all components of “effective communication” — the story, the content, how the content is displayed, how the content is talked through. In one of Reynolds’s sets of sample slides, you can clearly see the convergence of data visualization and PowerPoint. And, even he admits that this is a tricky thing to post…because it removes overall context for the content and it removes the presenter. Clearly, there are lots of resources out there that lay out fundamental best practices for effectively communicating in a presentation-style format. Three interrelated challenges, though:

  • The importance of learning these fundamentals is wildly undervalued — it sounds like Abela’s book tries to quantify this value through tangible examples…but it’s a niche book that, I suspect, will not get widely read by the people who would most benefit from reading it
  • “I need to put together a presentation for <tomorrow>/<Friday>/<next week>” — we’re living under enormous time pressure, and it’s incredibly easy to get caught up in “delivering a substantive deliverable” rather than “effectively communicating the information.” When I think about the number of presentations that I’ve developed and delivered over the past 15 years, the percentage that were truly effective, compelling, and engaging is abysmally small. And that’s a waste.
  • Culture/expectations — every company has its own culture and norms. For many companies, the norms regarding presentations are that they are linear, slide-heavy, logically compiled, and mechanically delivered affairs. For recurring meetings, there is often the “template we use every month” whereby the structure is pre-defined, and each subsequent presentation is an update to the skeleton from the prior meeting. Walk into one of those meetings and deliver a truly rich, meaningful presentation…and you’re liable to be shuttled off for a mandatory drug test, followed by a dressing down about “lack of proper preparation” because the slides were not sufficiently text/fact/content-heavy. <sigh>

What’s interesting to me is that I have spent a lot of time and energy boning up on my data visualization skills over the past few years. And, even if it takes me an extra 5-10 minutes in Excel, I never send out something that doesn’t have data viz best practices applied to some extent. As you would expect, applying those best practices is getting easier and faster with repetition and practice. So, can I do the same for presentations? And, again, that’s presentations-the-whole-enchilada, rather than presentations-the-PowerPoint-deck. Can I balance that with cultural norms — gently pushing the envelope rather than making a radical break? Can you? Should you?

Reporting

Performance Measurement — Starting in the Middle

Like a lot of American companies, Nationwide (Nationwide: Car Insurance as well as the various other Nationwide businesses) goes into semi-shutdown mode between Christmas and New Year’s. I like racking up some serious consecutive days off as much as the next guy…but it’s also awfully enjoyable to head into work for at least a few days during that period. This year, I’m a new employee, so I don’t have a lot of vacation built up, anyway, and, even though the company would let me go into deficit on the vacation front, I just don’t roll that way. As it is, with one day of vacation, I’m getting back-to-back four-day weekends, and the six days I’ve been in the office when most people are out…have been really productive!

I’m a month-and-a-half into my new job, which means I’m really starting to get my sea legs as to what’s what. And, that means I’m well aware of the tornado of activity that is going to hit when the masses return to work on January 5th. So, in addition to mailbox cleanup, training catch-up, focused effort on some core projects, and the like, I’ve been working on nailing down the objectives for my area for 2009. In the end, this gets to performance measurement on several levels: of me, of the members of my team, of my manager and his organization, and so on. And that’s where “start in the middle” comes into play.

There are balanced scorecard (and other BPM) theoreticians who argue that the only way to set up a good set of performance measures is to start at the absolute highest levels of the organization — the C-suite — and then drill down deeper and deeper from there with ever-more granular objectives and measures until you get down to each individual employee. Maybe this can work, but I’ve never seen that approach make it more than two steps out of the ivory tower from whence it was proclaimed.

On the other extreme, I have seen organizations start with the individual performer, or at the team level, and start with what they measure on a regular basis. The risk there — and I’ve definitely run into this — is that performance measures can wind up driven by what’s easy to measure and largely divorced from any real connection to measuring meaningful objectives for the organization.

Nationwide has a performance measurement structure that, I’m sure, is not all that unique among large companies. But, it’s effective, in that it combines both of the above approaches to get to something meaningful and useful. In my case:

  • There is an element of the performance measurement that is tied to corporate values — values are (or should be) universal in the company and important to the company’s consistent behavior and decision-making, so that’s a good element to drive from the corporate level
  • Departmental objectives — nailing down high-level objectives for the department, which then get “drilled down” as appropriate and weighted appropriately at the group and individual level; these objectives are almost exclusively outcome-based (see my take on outputs vs. outcomes)
  • Team/individual objectives — a good chunk of these are drilldowns from the departmental objectives. But, they also reflect the tactics of how those objectives will be met and, in my mind, can include output measures in addition to outcome measures. 

What I’ve been working on is the team objectives. I have a good sense of the main departmental objectives that I’m helping to drive, so that’s good — that’s “the middle” referenced in the title of this post.

The document I’m working in has six columns:

  • Objectives — the handful of key objectives for my team; I’m at four right now, but I suspect there will be a fifth (and this doesn’t count the values-oriented corporate objective or some of the departmental objectives that I will need to support, but which aren’t core to my daily work)
  • Measures — there is a one-to-many relationship of objectives to measures, and these are simply what I will measure that ties to the objective; the multiple measures are geared towards addressing different facets of the objective (e.g., quality, scope, budget, etc.)
  • Weight — all objectives are not created equal; in my case, for 2009, I’ve got one objective that dominates, a couple of objectives that are fairly important but not dominant, and an objective that is a lower priority, yet is still a valid and necessary objective
  • Targets — these are three columns where, for each measure, we define the range of values for: 1) Does Not Meet Expectations, 2) Achieves Expectations, and 3) Exceeds Expectations

It’s tempting to try to fill in all the columns for each objective at once. That’s a mistake. The better approach is to complete each column for every objective before moving on to the next column.
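To make the structure concrete, here is a minimal sketch of how such a document might be modeled and rolled up into a single weighted score. All objective names, weights, and ratings below are hypothetical illustrations, not Nationwide’s actual objectives, and the three-level rating scale is just one plausible way to score the “Does Not Meet / Achieves / Exceeds” targets:

```python
# Hypothetical sketch of an objectives/measures/weights/targets document.
# All names and numbers are illustrative, not actual objectives.

RATINGS = {"Does Not Meet": 1, "Achieves": 2, "Exceeds": 3}

# One-to-many: each objective carries a weight and several measures.
objectives = [
    {
        "objective": "Improve reporting turnaround",
        "weight": 0.5,  # the dominant objective
        "measures": [
            {"name": "Avg. days to deliver", "rating": "Exceeds"},
            {"name": "Stakeholder satisfaction", "rating": "Achieves"},
        ],
    },
    {
        "objective": "Standardize key metrics",
        "weight": 0.3,  # important, but not dominant
        "measures": [{"name": "Metrics documented", "rating": "Achieves"}],
    },
    {
        "objective": "Team training",
        "weight": 0.2,  # lower priority, still valid and necessary
        "measures": [{"name": "Courses completed", "rating": "Does Not Meet"}],
    },
]

def weighted_score(objs):
    """Average each objective's measure ratings, then weight and sum."""
    total = 0.0
    for o in objs:
        avg = sum(RATINGS[m["rating"]] for m in o["measures"]) / len(o["measures"])
        total += o["weight"] * avg
    return total

print(round(weighted_score(objectives), 2))  # → 2.05
```

Shifting the weights per person, as described below, is then just a matter of changing the `weight` values while keeping the same objectives.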

This is also freakishly similar to the process we semi-organically developed when I was at National Instruments working on key metrics for individual groups. Performance measurement maturity-wise, Nationwide is ahead of National Instruments (but it is a much larger and much older company, so that is to be expected), in that these metrics are tied to compensation, and there are systems in place to consistently apply the same basic framework across the enterprise.

This exercise kills more than one bird with a single slingshot load:

  • Performance measurement for myself and members of my team — the weights assigned are for the entire team; when it comes to individuals (myself included), it’s largely a matter of shifting the weights around; everyone on my team will have all of these objectives, but, in some cases, their role is really to just provide limited support for an objective that someone else is really owning and driving, so the weight of each objective will vary dramatically from person to person
  • Roles and responsibilities for team members — this is tightly related to the above, but is slightly different, in that the performance measurement and objectives are geared towards, “What do you need to achieve,” and it’s useful to think through “…and how are we going to do that?”
  • Alignment with partner groups — my team works closely with IT, as well as with a number of different business areas. This concise set of objectives is a great alignment tool, since achieving most of my objectives requires collaboration with other groups; we need to check that their objectives are in line with ours. If they’re not, it’s better to have the discussion now rather than halfway through the coming year when “inexplicable” friction has developed between the teams because they don’t share priorities
  • Identifying the good and the bad — if done correctly (and, frankly, my team’s are AWESOME), then we’ll be able to check up on our progress fairly regularly throughout the year. At the end of 2009, it’s almost a given that we will have “Did not achieve” for some of our measures. By homing in on where we missed, we’ll be able to focus on why that was and how we can correct it going forward.

It’s a great exercise, and, of all the work I did during this lull period, it’s probably the piece that will have the most lasting impact into 2009.

I’ll let you know how things shake out!

Analysis

"The Axiom of Research" and "The Axiom of Action"

I attended a one-day seminar today on “The Role of Statistical Concepts and Methods in Research” taught by Dr. Tom Bishop of The Ohio State University. Dr. Bishop heads up a pretty cool collaboration between Nationwide (all areas of the company, including Nationwide: Car Insurance) and OSU, and this seminar was one of the minor artifacts of that collaboration.

Dr. Bishop had me on page 5 of the seminar materials when he introduced “The Fundamental Axioms of Research,” which he stated are twofold:

  • The Axiom of Variation — all research data used for inference and decision making are subject to uncertainty and variation
  • The Axiom of Action — in research, theory is developed, experiments are conducted, and data are collected and analyzed to generate knowledge to form a rational basis for action

The rest of the seminar was a healthy mix of theory and application, with all of the “theory” being tied directly to how it should be applied correctly. Dr. Bishop is a statistician by training with years of industry experience, so it was pretty cool to hear him emphasize again and again and again that you can get a lot of value from the data without running all sorts of complex, Greek-letter-rich, statistical analyses. The key is to apply a high degree of rigor in understanding and articulating the problem and the approach.

Lots of good stuff!

Presentation

Harvey Balls: A Good Way to Ramp Back Up

As you may have noticed, my blogging here over the past couple of months has been pretty sparse. That was largely because I was winding down one job, taking a “break” between jobs, and then ramping up in my new job. The first was mentally exhausting, the second was physically exhausting, and the third was back to mentally exhausting. Plus, I’ve gone from a company with 35 employees to a company with 38,000, and I wanted to get my feet wet and see if there were guidelines of any sort regarding this sort of blogging by employees. What I’ve found out is that they’re really embracing social media as it should be embraced, so I’m excited that I’ll get to start blogging on data management (specifically address management). But…that’s not this post!

I’ve been sitting in on some vendor selection meetings in my new role, and, today, I was part of a “summary and recommendations” presentation review that had a really nice use of “Harvey Balls” on the key summary slide. Harvey balls? They’re the quarter-filled circles you might be familiar with from Consumer Reports ratings tables.

Wikipedia has a brief, yet interesting, history on the subject, including some practical tips for actually putting them to use.

I was struck by how effective they were at providing summary information. In today’s case, they were used to iconically represent the roll-up of ratings for groups of vendor requirements. It worked. Obviously, there is a loss of granularity here, but, in the situation today, we were looking at eight different groups of requirements across three different alternatives, so we were still looking at 24 summary data points on a single, clean slide (there was also some subtle background shading that prioritized the groups of requirements, which almost worked…but we decided that sorting the groups from highest to lowest would probably do the trick as well).

There’s a part of me that sees a circle and thinks “pie chart” and cringes. But these really aren’t pie charts. Well…actually…they sorta’ are…but I’m still okay with them. Now, if someone started trying to use 100 different Harvey Balls where the granularity of each “wedge” was deeper than just a quadrant, I’d have a problem with it.
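For anyone who wants to drop quadrant-granularity Harvey balls into their own summaries, Unicode already includes quarter-filled circle characters, so a short script can generate a roll-up grid. The 0-to-4 rating scale and the vendor/requirement data here are made-up examples, not the actual vendor evaluation from the meeting:

```python
# Quadrant-granularity Harvey balls via Unicode circle characters.
# Rating scale and data below are illustrative examples.

HARVEY = "○◔◑◕●"  # 0, 1, 2, 3, 4 filled quadrants

def harvey(score, max_score=4):
    """Map a 0..max_score rating to the nearest quarter-filled circle."""
    quadrants = round(4 * score / max_score)
    return HARVEY[quadrants]

# Hypothetical roll-up: requirement groups (rows) x vendors (columns)
ratings = {
    "Reporting":   [4, 2, 3],
    "Integration": [3, 3, 1],
    "Support":     [2, 4, 2],
}

print("Group         A  B  C")
for group, scores in ratings.items():
    print(f"{group:<12}", "  ".join(harvey(s) for s in scores))
```

Because the mapping rounds to the nearest quadrant, this deliberately enforces the loss of granularity that makes the summary readable; a finer-grained scale would need a different rendering approach entirely.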

Overall, it’s one more tool for my (and your) data visualization toolkit!