Four Books That Will Change the Way You Communicate

I don’t think I will ever forget the first time that I made a presentation at work. It was just over a decade ago, I was just a few months into my employment at a company where I would work for the next eight years, and I was on the hook to present a new process to a room of 20 engineers. I diligently prepared my transparencies (I’m old enough to have used an overhead projector, but not old enough to refer to the medium they supported as “foils”). I rehearsed the material again and again.

And I bombed.

The material was dry as it was, but it wasn’t, by any means, unmanageable content. I just didn’t do a good job of managing it!

Fast forward 10 years, and I found myself giving a presentation to a room of 50-60 people, and the material was set up to be just as naturally engaging — presenting on an approach to measurement and analytics to…a bunch of marketers.

The presentation went much better, judging both from the engagement level of the audience and from the discussions it has prompted in the weeks since. I’m no Steve Jobs, but I’ve paid attention to what seems to work and what doesn’t (both in my presentations and in others’), read some articles here and there, and, I realized, read a few books along the way that have really helped.

So, with that — four books that all have a heavy component of “how the brain works” and that, collectively, have taught me a lot about how to present information, be it a dashboard, a report, or a presentation.

Gladwell and Gilbert

The first two books are ones that I read within a few months of each other. To this day, I recall specific anecdotes with no idea which book they came from. Blink: The Power of Thinking Without Thinking made the rounds when it first came out as “another great book by Malcolm Gladwell” (following The Tipping Point: How Little Things Can Make a Big Difference). The fundamental premise of Blink has to do with our “adaptive unconscious” — our intuition and ability to “know” things without fully needing to process them. As he dives into example after example, Gladwell touches on various aspects of how the brain works.

Daniel Gilbert’s Stumbling on Happiness takes a more directly psychological angle, but it covers some of the same territory. One of Gilbert’s main points is that the human brain does not remember things like we think it does — pointing out that a vividly remembered, down-to-the-color-of-the-shirt-you-were-wearing memory is not really an as-recorded memory at all. Rather, the brain remembers a few specific details and then makes up / fills in the rest when the memory gets called up. It’s so good at filling in these blanks that it fools itself into not being able to tell fact from interpolation!

Both of these books made an impact on me because they pointed out that how we take in, process, and store information doesn’t work at all like we intuitively think it does. And both set up the next two books by shaking the foundations of my assumptions about how we, as humans, think.

Straight-Up Business Reading

Chip and Dan Heath’s Made to Stick: Why Some Ideas Survive and Others Die is a practical manual for communicating information that you want your audience to pay attention to and retain. They boil the components down to a six-letter acronym — S.U.C.C.E.S. — and go into each component in detail.

The elements are Simple, Unexpected, Concrete, Credible, Emotional, and Stories, and they provide a nice framework for critiquing how we communicate any idea. I recognize that I regularly struggle with Simple, Concrete, and Stories as elements in my blog posts. But every element is one that can be injected with some discipline and the time to do so.

I nailed all three of these elements a number of years ago when I found myself on an internal lecture circuit trying to drum up large donors for my company’s annual United Way campaign. I was heavily invested in conveying a strong message, and I wound up using my grandfather’s battle with Alzheimer’s as a way to pull the audience in and ask them to find something they were passionate about and support it. I also wove in various quirky takes on how $10/week would really add up — think of the sort of thing you hear again and again from your local NPR station during fundraising drives. In the case of that campaign, we blew our numbers out of the water, with a 500% increase in the number of people who gave at the “leadership level” that year. Now, a lot of things had to come together to make that happen. But, to this day, I’m sure that my well-crafted, well-rehearsed, and sincere speech, made to at least a dozen different groups of employees, played a non-trivial role — as did the fact that I was a fairly low-level employee making the case, asking people who were making a lot more money than I was to give at least as much as I was.

And that was years before I read Made to Stick. But, the book helped me reflect on any number of presentations — ones that worked and ones that didn’t.

And, Finally, Wisdom from a Neuroscientist

The last book in this tetralogy is one that I just finished reading — Brain Rules: 12 Principles for Surviving and Thriving at Work, Home, and School, by John Medina. I stumbled across the book as a recommendation from Garr Reynolds of Presentation Zen, so I wasn’t surprised that it had some very practical tips, as well as the “why?” behind them, for communicating effectively. Medina’s premise is that there’s a ton of stuff we don’t yet understand about the brain. BUT, there are also a lot of things we do know about the brain, and many of those lay out pretty clearly that the way we work in business and the way our education system is set up both run counter to how the brain naturally functions.

These “things we do know” are broken down into 12 “rules”:

  • Exercise (good for the brain)
  • Survival (why and how the brain evolved…and implications)
  • Wiring (how the brain works at a highly micro level)
  • Attention (there’s NO SUCH THING as multitasking…and other goodies)
  • Short-term memory (what makes it there and how)
  • Long-term memory (what makes it there, how, and how long it takes to get there)
  • Sleep (good for the brain)
  • Stress (some kinds are good, some kinds are bad)
  • Sensory integration (the more senses involved, the better the memory)
  • Vision (the #1 sense)
  • Gender (men are from Mars…)
  • Exploration (age doesn’t really degrade our ability to learn)

Medina ends each chapter (one rule per chapter) with “Ideas” — implications for the real world based on the information presented.

The book goes into very technical detail about how, when, and where electrical charges zip around in our skulls to accomplish different tasks. While that information is not directly applicable, each time he goes there it’s as a setup to more directly useful information. Throughout the book, Medina provides practical thoughts for how to communicate more effectively — helping people pay attention (getting the information you are communicating into working memory) and retain the information over both the short and the long term. Two of my absolute favorite nuggets from the book were:

  • p. 130 (in the chapter on long-term memory) — Medina has the reader do a little memory exercise with the following characters: “3 $ 8 ? A % 9.” The fact he drops after the exercise is interesting: “The human brain can hold about seven pieces of information for less than 30 seconds! If something does not happen in that short stretch of time, the information becomes lost.” This is about getting information on its way from working memory to long-term memory, and how repetition, thinking about the information, and talking about the information all help it on its way. As a communicator (be it through a presentation or through a dashboard of data), this seems like powerful stuff — how often have we all seen someone cut loose with slide after slide of mind-numbing information? The human brain simply cannot take all of that in and retain it without some help!
  • p. 239 (in the chapter on vision) — Medina has a section titled “Toss your PowerPoint presentations.” I groaned. While I get highly annoyed by the rampant misuse of PowerPoint, I’m not a Tufte acolyte to the point that I see the tool itself as evil. In the second paragraph, though, Medina clarifies by providing a two-step prescription: 1) burn your current presentations, and 2) make new ones. Medina’s beef with PowerPoint is that the default slide template is text-based with a six-level hierarchy. This entire chapter is about how a picture really is worth 1,000 words, and Medina pleads with the reader to cut wayyy back on the text in his/her presentations (he has a fascinating explanation of how, when we read, we’re really interpreting each letter as a small picture…and that’s actually not a good thing for retention of information).

There are oodles of other good nuggets in the book, but these are the two snippets that really resonated with me.

Better to Be Steve Jobs than Bill Gates

I do believe that some people have better communication instincts than others. I’ll never be Steve Jobs when it comes to holding an auditorium in the palm of my hand. But, between reading these books and thinking through my own evolution as a communicator (this blog notwithstanding…but I’ve always said that I write this blog to keep my e-mails shorter and to try out ideas that occur to me during the day — sorry folks…both of you…but this blog is mostly for me!), I’m convinced that effective communication is a trainable skill.

I’ve also noticed that, the more I have to communicate, and the more I work to do so effectively, the easier it seems to be getting. In another 20 years, I might just have it nailed!

The Spectrum of Data Sources for Marketers Is Wide (& Overwhelming)

I’ve been using an anecdote of late that Malcolm Gladwell supposedly related at a SAS user conference earlier this year: over the last 30 years, the challenge we face when it comes to using data to drive actions has fundamentally shifted from a challenge of “getting the right data” to “looking at an overwhelming array of data in the right way.” To illustrate, he compared Watergate to Enron — in the former case, the challenge for Woodward and Bernstein was uncovering a relatively small bit of information that, once revealed, led to immediate insight and swift action. In the latter case, the data to show that Enron had built a house of cards was publicly available, but there was so much data that actually figuring out how to extract the underlying chicanery without knowing exactly where to look for it was next to impossible.

With that in mind, I started thinking about all of the sources of data that marketers now have available to drive their decisions. The challenge is that almost all of the data sources out there are good tools. While they all claim competitive advantage and differentiation from other options, I believe in free markets enough to think that truly bad tools don’t survive (do a Google search for “SPSS Netgenesis” and the first link returned is a 404 page — the prosecution rests!).

To avoid getting caught up in the shiny baubles of any given tool, it seems worth organizing the range of available data some way — putting every source into a discrete bucket. It turns out that that’s a pretty tricky thing to do, but one approach is to place each data source available to us somewhere on a broad spectrum. At one end of the spectrum is data from secondary research — data that someone else has gone out and gathered about an industry, a set of consumers, a trend, or something else. At the other end is the data we collect on our customers in the course of conducting some sort of transaction with them — when someone buys a widget from our web site, we know their name, how they paid, what they bought, and when they bought it!

For poops and giggles, why not try to fill in that spectrum? Starting from the secondary research end, here we go…!

Secondary Research (and Journalism…even Journalism 2.0)

This category has an unlistable number of examples, from analyst firms like Forrester Research and Gartner Group, to trade associations like the AMA or The ARF, to straight-up journalists and trade publications, and even to bloggers. Specialty news aggregators like alltop.com fall into this category as well (even if, technically, they would fit better into a “tertiary research” category, I’m going to just leave them here!).

I stumbled across iconoculture last week as one interesting company that falls in this category…although things immediately start to get a little messy, because they’ve got some level of primary research, as well as some tracking/listening aspects, in their offering.

Listening/Collecting

Moving along our spectrum of data sources, we get to an area that is positively exploding. These are tools that are almost always built on top of a robust database, because what they do is try to gather and organize what people — consumers — are doing/saying online. As a data source, these are still inherently “secondary” — they’re “what’s happening” and “what’s out there.” But, as our world becomes increasingly digital, this is a powerful source of information.

One group of tools here consists of sites like compete.com, Alexa, and even Google’s various “insights” tools: Google Trends, Google Trends for Websites, and Google Insights for Search. These tools tend to be not so much consumer-focused as site-focused, but they’re getting their data by collecting what consumers are doing. And they are darn handy.

“Online listening platforms” are a newer beast, and there seems to be a new player in the space every day; the Q1 2009 Forrester Wave report by Suresh Vittal already seems like it is at least five years old. An incomplete list of companies/tools offering such platforms includes (in no particular order…except Nielsen is first because they’re the source of the registration-free PDF of the Forrester Wave report I just mentioned):

And the list goes on and on and on… (see Marshall Sponder’s post: 26 Tools for Social Media Monitoring). Each of these tools differentiates itself from its competition in some way, but none has truly emerged as a sustained frontrunner.

Web Analytics

I put web analytics next on the spectrum, but recognize that these tools have an internal spectrum all their own. At the “listening/collecting” end, web analytics tools simply “watch” activity on your web site — how many people went where and what they did when they got there. Moving toward the “1:1 transactions” end, web analytics tools collect data on specifically identifiable visitors to your site and provide that user-level specificity for analysis and action.

Google Analytics pretty much resides at the “watching” end of this list, as does Yahoo! Web Analytics (formerly IndexTools). But, then again, they’re free, and there’s a lot of power in effectively watching activity on your site, so that’s not a knock against them. The other major players — Omniture SiteCatalyst, Webtrends, Coremetrics, and the like — have more robust capabilities and can cover the full range of this mini-spectrum. They are all becoming increasingly open and easier to integrate with other systems, be that back-end CRM or marketing automation systems, or the listening/collecting tools described in the prior section.

The list above covered “traditional web analytics,” but that field is expanding. A/B and multivariate testing tools fall into this category, as they “watch” with a very specific set of options for optimizing a specific aspect of the site. Optimost, Omniture Test&Target, and Google Website Optimizer all fall into this subcategory.

And, entire companies have popped up to fill specific niches with which traditional web analytics tools have struggled. My favorite example there is Clearsaleing, which uses technology very similar to all of the web analytics tools to capture data, but whose tools are built specifically to provide a meaningful view into campaign performance across multiple touchpoints and multiple channels. The niche their tool fills is improved “attribution management” — there’s even been a Forrester Wave devoted entirely to tools that try to do that (registration required to download the report from Clearsaleing’s site).

Primary Research

At this point on the spectrum, we’re talking about tools and techniques for collecting very specific data from consumers — going in with a set of questions that you are trying to get answered. Focus groups, phone surveys, and usability testing all fall in this area, as well as a plethora of online survey tools. Specifically, there are online survey tools designed to work with your web site — Foresee Results and iPerceptions 4Q are two that are solid for different reasons, but the list of tools in that space outnumbers even the list of online listening platforms.

The challenge with primary research is that you have to make the user aware that you are collecting information for the purpose of research and analysis. That drops a fly in the data ointment, because it is very easy to bias the data by not constructing the questions and the environment correctly. Even with a poorly designed survey, you will collect data that looks powerful — the problem is that it may be misleading!

Transaction Data

Beyond even primary research is the terminus of the spectrum — it’s customer data that you collect every day as a byproduct of running your business and interacting with customers. Whenever a customer interacts with your call center or makes a purchase on your web site, they are generating data as an artifact. When you send an e-mail to your database, you’ve generated data as to whom you sent the message…and many e-mail tools also track who opened and clicked through on the e-mail. This data can be very useful, but, to be useful, it needs to be captured, cleansed, and stored in a way that sets it up for useful analysis. There’s an entire industry built around customer data management, and most of what the tools and processes in that industry focus on is transaction data.
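The five buckets above can be sketched as a simple ordered structure. This is a toy illustration only: the category names and example tools are pulled from this post, and the `coverage_gaps` helper is my own hypothetical construct for asking "where does my portfolio have holes?", not any vendor's API.

```python
# Toy model of the "spectrum of data sources" framework described above,
# ordered from fully secondary (someone else's research) to fully primary
# (data generated by your own customer transactions).
SPECTRUM = [
    ("Secondary research", ["Forrester Research", "Gartner Group", "alltop.com"]),
    ("Listening/collecting", ["Google Trends", "compete.com", "Alexa"]),
    ("Web analytics", ["Google Analytics", "Webtrends", "Coremetrics"]),
    ("Primary research", ["Foresee Results", "iPerceptions 4Q"]),
    ("Transaction data", ["call center logs", "e-commerce purchases", "e-mail response data"]),
]

def coverage_gaps(tools_in_use):
    """Return the spectrum categories where none of `tools_in_use` appears."""
    in_use = set(tools_in_use)
    return [category for category, examples in SPECTRUM
            if not in_use.intersection(examples)]

# A portfolio covering only two points on the spectrum:
gaps = coverage_gaps(["Google Analytics", "Google Trends"])
print(gaps)  # categories with no coverage in this portfolio
```

Running a portfolio of tools through a check like this makes the "are you getting data from across the entire spectrum?" question concrete: any category that comes back in the gaps list is a point on the spectrum where you are flying blind.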

What’s Missing?

As much as I would like to wrap up this post by congratulating myself on providing an all-encompassing framework…I can’t. While there are a lot of specific tools/niches that I haven’t listed here that I could fit somewhere on the spectrum of tools as I’ve described it, there are also sources of valuable data that don’t fit in this framework. One type that jumps out to me is marketing mix-type data and tools (think Analytic Partners, ThinkVine, or MarketShare Partners). I’m sure there are many other types. Nevertheless, it seems like a worthwhile framework to have when it comes to building up a portfolio of data sources. Are you getting data from across the entire spectrum (there are free or near-free tools at every point on the spectrum)? Are you getting redundant data?

What do you think? Is it possible to organize “all data sources for marketers” in a meaningful way? Is there value in doing so?