Top 3 Mistakes People Make When Visualizing Data

It's not what you see, but what it is

Henry David Thoreau isn’t known for visualizing big data. But he could have been. The 19th-century American transcendentalist, author, poet, philosopher, and naturalist understood what it takes to uncover a data story.

He understood what most dorky data scientists, savvy IT strategists, and almighty analysts don't get today, when he said:

Capacity for clarity impacts the accuracy of seeing.

We see – What we bring, Who we are and What we want, Not what is

Bias creep distorts vision. Obscures reality. Muddies analysis. Confuses outcomes.

We’re so concerned with integrating data that we mistakenly integrate our biases right into our processes. Not good. Not effective. The integrity of the stories we tell will suffer.

Our biased agendas, unless disentangled and detached from our processes, will pollute and prejudice perspective – which in turn will corrupt messaging, impede audience understanding, and prevent key buy-in.

We see – Who we are, Not what is

Agendas corrupt procedure. Subjective listening distorts findings. Any story not truly heard is false. Detached, active listening brings genuine stories to life.

Biases corrupt integrity. We cannot deliver on single-minded promises of putting “customer focus first” when we cannot see beyond our predisposed assumptions.

Keeping data and perspective pure would be challenging enough if the data sets we work with were simple and straightforward. But when bias creep invades highly fluctuating data, real issues remain hidden rather than coming clearly to light.

We see – What we want to see, Not what is

At any point in the data visualization process, our story impacts the story. The story is either the client’s or ours. We allow the story to emerge. Or we impose upon it.

Even when we apply the best Design Thinking questions (What is? What if? What wows? What works?) to the data visualization process, we have to guard against predisposed thinking and against leading, creating, and coloring responses.

Here are 12 questions we can ask ourselves to ensure we uncover user stories rather than our own.

  1. How is the current client state or condition expressing itself?
  2. Are the expressed issues real? How do you know that?
  3. What agendas are being expressed across those you’re interviewing?
  4. When questioning "what is," what assumptions are keeping you from uncovering more deeply?
  5. When asking "what if," are you allowing genuine thinking through, or directing the thinking?
  6. When inquiring "what wows," is your point of view neutral, or are you headed in a predisposed direction?
  7. In your asking, are you capturing opinion or experience?
  8. When probing "what works," do you have an answer already in mind, or is your asking open enough to represent the intended audience's experience?
  9. When asking, are you listening to actual responses or leading toward where you think the real answers should lie?
  10. Does the information you’re gathering reflect the audience’s thinking about the experience or the actual experience?
  11. Whose story is emerging? Yours or theirs?
  12. What is your intent behind your method?

It’s no secret that preconceptions distort reality. These three principles can help us get out of the way and uncover the story the data is telling.

  1. Gathering real requirements requires honest neutrality.
  2. Identifying foundational objectives demands objectivity.
  3. Open questioning allows for authentic answering.

A holistic approach to visualizing data in meaningful ways means getting out of the way and letting the story speak to us.
