I’m just finishing Chip and Dan Heath’s new book, Switch, which I bought right away because I thought Made to Stick was one of the most actionable books I’ve ever seen about effective communications. A few years after reading Made to Stick, I still recommend it far and wide.
Made to Stick is one of those books whose entire arc grabs you and informs your thinking from that day forward. I’m not sure Switch hits that bar, but it has a lot of good stuff in it, and I found at least one gem that may be the most important reminder I’ve gotten in a while about how to make sense of the world. It’s called fundamental attribution error, which describes our tendency to consistently reach incorrect conclusions about what the data and information we receive actually mean.
A story to explain the concept: schoolkids who were struggling academically were divided into two groups for a few hours of once-a-week instruction over a six-week period. One group received normal tutoring and traditional instruction in the subject areas in which they were struggling; the other group was exposed to a curriculum that treated the mind as a pliable “muscle” that could be strengthened through hard work and study, and that taught that intelligence wasn’t fixed. Group 1 was taught stuff; group 2 was taught that they could be taught stuff.
Put another way: group 2 was taught that the data they’d received about their academic performance (that they didn’t do well in school) did not mean that they were irrevocably poor students. And guess what? Test scores from this second group – after just a few hours of these sessions spread out over a little more than a month – soon beat the pants off those of group 1.
I think of fundamental attribution error as the story we build around the data we are given. In the simple, obvious case: I push the elevator button a bunch of times, and the elevator comes. The attribution error is thinking that the more I push the button, the more quickly the elevator comes.
Elevator-pushing is a blunt example. But life is full of subtler, trickier situations. Say I’m meeting someone for the first time and she seems distracted and disengaged. If I’m nervous about the conversation, my attitudes about the information I’m receiving (she’s not looking me in the eye, she seems distracted) inform the story I tell myself about her actions (which is by definition not the same thing as her actions). This story can be objective and neutral (“I’m noticing that she’s distracted.”) or it can morph into a stream of fear-induced thoughts: “she doesn’t like me” or “she’s not interested in what I’m saying.” More often than not, where I end up is somewhere between these two extremes.
Chip and Dan Heath remind us in Switch that humans are typically awful at distinguishing real from imagined patterns; that we over-attribute actions to people’s personalities and attitudes rather than to situations; and that the biases we bring into situations play an overwhelming role in how we process the information we receive.
Our outlook – about ourselves, our own self-image, how we think people act and process information – is the superstructure upon which we hang the tidbits and facts that accumulate throughout our days. It allows us to make sense of the complexity of the world around us, but it can also reinforce patterns that are entirely of our own making. This is why acting fearless, positive, open, and present is just about the most powerful strategy for having positive interactions and learning appropriately from what you hear from others.
I’m beginning to believe that HOW we make sense of information – which more often than not is about us, our attitudes and biases and fears – is the most fundamental determinant of our experiences of and success in the world. Put another way, recognizing where we consistently reach wrong conclusions is the first, giant step towards breaking out of these patterns.
The data – how people act and react, what they think of us, what they hope to achieve themselves – is usually quite benign, and the ratio of signal (real information) to noise (our story about the data) is often pretty low.
Separating out what we see from what we tell ourselves allows us a glimpse of real, honest truths.