
How to Win by Thinking Slower

Tuesdays With Coleman

It is fundamental human nature that when we see a correlation, we assume causation – in other words, because A and B happened together, A must have caused B. In fact, we often invent causation when no concrete evidence exists. And this is a very dangerous exercise.

A perfect example of this in the radio industry is the interpretation of ratings data.

Let’s say the 3 PM – 7 PM numbers of the station you program took a precipitous dive last month. It’s like it came out of nowhere. You’ve got knots in your stomach. Your general manager is coming down the hall, and you know she’s seen the numbers. There is a logical side of your thinking that tells you this is an anomaly. It is this part of your brain that knows there are myriad reasons why the numbers dipped. Could have been a change in the panel. A panelist’s family member may have fallen ill, taking him away from all that time in the car. Perhaps someone on the panel found a new podcast she liked. Or someone just got a connected car and has created some new Spotify playlists.

The possibilities, of course, are endless.

But in that emotional state, having just learned that the numbers took a nosedive, logic is overwhelmed by emotion and you invent a false cause.

“The music flow was wrong.” “The jocks talked too much.” “The competitor did direct mail.”

What you’ve done in this instance is so common that there are names for the two ways your brain processes information: System 1 and System 2.

In his book, “Thinking, Fast and Slow”, Nobel Prize-winning psychologist Daniel Kahneman explains in great detail how System 1 is our in-the-moment thinking. It’s the thought that comes to us “automatically and quickly, with little or no effort and no sense of voluntary control.”


System 2 is more complex and takes more effort, concentration and time.

When you think of the implications, you begin to realize the pitfalls and dangers of System 1. Because it strikes first, if we’re not consciously aware of it, our brains never give System 2 a chance to weigh in.

In Chapter 10, “The Law of Small Numbers”, Kahneman offers a scenario: a study of new diagnoses of kidney cancer in the 3,141 counties of the United States that reveals a remarkable pattern. He says, “The counties in which the incidence of kidney cancer is lowest are mostly rural, sparsely populated, and located in traditionally Republican states in the Midwest, the South and the West.” He then asks, “What do you make of this?”

System 1 makes quick and easy conclusions.

You may infer, for example, that the low cancer rates in the rural population are due to clean living. No air pollution. No water pollution. Access to fresh food.

Now what if the scenario were reversed? What if you were told the highest incidence of kidney cancer is in counties that are mostly rural, sparsely populated, and located in traditionally Republican states in the Midwest, the South and the West?

System 1 has a rationalization for that too.

You might say the higher cancer rates are due to the poverty of the rural lifestyle, less access to medical care, too much alcohol and too much tobacco.

System 1 sees a correlation and creates the causation. The problem is, that should be System 2’s job.


In this example, when your brain went to “rural” or “Republican” to find the rationale, it likely missed the most important part: “sparsely populated”.

As Kahneman points out, large samples are more precise than small samples.

Small samples yield extreme, wobbly, unreliable results more often than large samples do.
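
If you want to see the law of small numbers in action, here is a minimal Python sketch. The county sizes and the cancer rate are invented for illustration, not taken from Kahneman’s data. Every simulated county shares exactly the same underlying rate, yet the sparsely populated counties dominate both the lowest and the highest observed rates.

```python
import numpy as np

rng = np.random.default_rng(1)
TRUE_RATE = 1e-4  # the same hypothetical underlying cancer rate everywhere

# Half the counties are sparsely populated, half are densely populated.
populations = np.concatenate([np.full(500, 1_000), np.full(500, 1_000_000)])
labels = np.array(["small"] * 500 + ["large"] * 500)

# Observed rate per county = simulated cases / population.
cases = rng.binomial(populations, TRUE_RATE)
rates = cases / populations

order = np.argsort(rates)
print("counties with the lowest observed rates :", labels[order[:10]])
print("counties with the highest observed rates:", labels[order[-10:]])
# Both extremes are dominated by "small" counties, even though every county
# has exactly the same true rate -- the law of small numbers at work.
```

Run it a few times with different seeds and the pattern holds: the extremes belong to the small samples, not to any real difference between the counties.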

You already know this. Yet you likely still looked to find causation, a reason, an explanation, in the kidney cancer correlation example.

Why?

System 1.

Sometimes the power and influence of System 1 become so believable that they create a widely accepted false narrative. This happened with misperceptions of the “hot hand” in basketball. You’ve seen players get “hot” during a game. Fans, coaches and players all buy into the “hot hand” as fact. If a player sinks a few baskets in a row, they’ve now got a “hot hand”. When a player gets hot in the old NBA Jam video game, the basket catches on fire. Turns out there’s research on this. An analysis of thousands of sequences of shots concluded there is no such thing as a “hot hand”. While some players are more accurate, “the sequence of successes and missed shots satisfies all tests of randomness.”
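
To get a feel for why the researchers reached that conclusion, here is a minimal Python sketch. The shooter and the 50% make rate are hypothetical, and this illustrates the idea rather than the study’s actual test: even when every shot is an independent coin flip, streaks show up constantly, and accuracy right after a streak is no better than the baseline.

```python
import random

random.seed(7)
MAKE_PROB = 0.5  # a hypothetical shooter who makes 50% of shots, independently
shots = [random.random() < MAKE_PROB for _ in range(1_000)]

# Accuracy immediately after three straight makes -- the moment fans
# would say the shooter has a "hot hand".
after_streak = [shots[i] for i in range(3, len(shots)) if all(shots[i - 3:i])]

print(f"overall accuracy       : {sum(shots) / len(shots):.3f}")
print(f"accuracy after 3 makes : {sum(after_streak) / len(after_streak):.3f}")
print(f"shots taken right after a streak of 3+ makes: {len(after_streak)}")
# A pure coin-flip shooter still produces plenty of streaks, and shooting
# after a streak stays near the baseline -- streaks alone don't prove heat.
```

Streaks are exactly what randomness looks like; System 1 just refuses to believe it.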

The sample size of this research was substantial, easily large enough for reliability, yet we still believe in the hot hand.

System 1.

You can see how System 1 can lead to false assumptions about causation on ratings days. There are small sample issues. There are countless variables. Yet we still say, “This must have been why it happened.”

Thinking, Fast and Slow—or System 1 and System 2—may be the best argument for perceptual research for radio stations.

Ratings can tell you whether your radio station is sick or healthy. Even so, with the exception of rare, unique events (like the way hurricane coverage affects news station ratings or a local team’s Super Bowl run affects sports station ratings), it’s generally best practice to look at a trend long enough to incorporate a large sample size and account for variables.

Ratings cannot tell you why your station is sick or healthy. And they cannot tell you the prescription for getting better or staying on top.

Consider making a perceptual study part of your strategic plan; it provides a substantial sample size, a high level of accuracy and a reliable glimpse of market tastes and perceptions.

In a world often dominated by System 1 thinking, wouldn’t it be nice to have some System 2 strategy?