Before we get into this topic, imagine you are asked to listen to a simple, short story and then asked to answer 4 questions on it. Consider, first, the story:
A little girl named Mary goes to the beach with her mother and brother. They drive there in a red car. At the beach they swim, eat some ice cream, play in the sand, and have sandwiches for lunch.
Now the questions (try to answer them without looking back at the story. Honor system. No cheating.):
1. What color was the car?
2. Did they have fish and chips for lunch?
3. Did they listen to music in the car?
4. Did they drink lemonade with lunch?
This little quiz was administered to British schoolchildren between the ages of 5 and 9 by academic researchers. Nearly all of the children answered the first two questions correctly (red and no, respectively), as it was merely a matter of retaining those details of the story. However, an overwhelming majority of the children who participated did not get the last two questions right.
Seventy-six percent (76%) of the children quizzed answered the last two questions in the same way many business leaders and politicians approach issues and problems: they bluffed through them.
The last two questions are unanswerable. The story does not discuss whether or not music was listened to on the car ride to the beach or what kind of beverage the family drank during lunch. To answer these questions with anything other than "I don't know" or "not enough information was presented to answer this question" shows a reluctance to admit that we don't know something.

In their latest book, "Think Like A Freak," authors Steven D. Levitt and Stephen J. Dubner (the men behind the popular books Freakonomics and SuperFreakonomics) propose that the 3 hardest words for most people to say are "I don't know." Think about it: how many times have you heard a politician, political commentator, or analyst freely admit that they do not know something about an issue they are clearly not an expert in?
Even when a person is not an expert on economics or foreign policy or environmental science, they will bluff their way through questions on jobs, minimum wage, Syria, Iraq, the Malaysia Airlines flight that was shot down over Ukraine, immigration, or global warming, because they do not want to admit they do not know the intricacies of the issue or what a real solution might look like. They want to sound smart.
People end up relying on talking points that are either recycled or something they heard from someone else, without looking at the big picture.
The first step to overcoming a problem is admitting we have one, and the problem most people have is that they cannot say three simple words: "I don't know."
Why? The simple answer is because we are human and that is how we are wired. We don't want to admit not knowing something because we are afraid of being shamed either in public or by our peers. We stick with people who hold similar beliefs or just fall in line and run with the herd because it is just easier to do so, even when that means people are quick to embrace the status quo, slow to change their minds, and delegate actual thinking to someone else.
In many ways it can be considered a survival mechanism that has been hardwired into us. Just think about a friend who has made their political or philosophical opinions known in the past, but then they start dating someone and it seems like their opinions shift to various degrees in a matter of months. Many people likely know someone like that. It is just how we adapt.
In politics, people rely more on beliefs than they do facts because beliefs are harder to dispute than facts. The problem is people end up trying to convince others that they know more about something than they actually do.
Consider a question like, how do we stop mass shootings?
People can look at crime statistics and say that the federal ban on assault weapons reduced crime, therefore we need to reduce the availability and access to guns. Or, people can look at the reduction in crime in states that allow concealed carry and say this is evidence that we need to expand concealed carry rights and eliminate "gun free zones."
The only problem is that the people who make these arguments (in both cases) are only looking at one variable to confirm a belief they already have about guns in America. Frequent guest contributor and author Michael Austin has written extensively on confirmation bias -- something that occurs because we do not want to admit we don't know something or could be wrong out of whatever shame we think might come of it.

In their latest book, Levitt and Dubner argue that we can never truly "know" what causes or solves an issue as big as gun violence because of all the variables that are involved, many of which are never considered by lawmakers and political talking heads. It is even harder, therefore, to predict how things will play out in the future because we only truly "know" the information we have now.
Over the years, researchers have shown that even top experts in their respective fields are about as reliable as a monkey throwing darts at a dartboard when making predictions -- a bit of an exaggeration, perhaps, but the point stands.
But people like to pretend they know more than they really do, and even experts in a given field will confidently assert dogmatic beliefs about what is going to happen in the future because it makes them sound smart. Sounding smart is better than admitting that they don't know what is going to happen in the next year, 5 years, or 10 years.
Just consider how many political analysts and so-called "experts" inaccurately predicted the results of the 2012 presidential election, even with all the data they used to support their prediction.
For politicians, the consequences of having to admit that they don't know something about a particular issue could be significant, so they often bluff their way through most topics using regurgitated talking points. People often choose their responses to a situation with their own reputation or interests in mind -- it is a common human trait.
People often don't know themselves well enough to be able to accurately assess what their skills and abilities are, which can be where the problem begins. Levitt and Dubner examine a classic poll asking people to rate their own driving abilities.
Most of the time, an overwhelming majority of people (typically around 80 percent) will rate their performance on the road as better than the average driver. This leaves approximately 20 percent of respondents who are willing to take an honest look at their own driving abilities.

If we don't know ourselves, how can we pretend to know anything about the world around us?
Even if someone is the world's leading expert in a given domain, research has shown that this does not mean that person is more likely to excel in another domain, according to Levitt and Dubner. People can be great at something, but that does not mean they will be great at everything.
Levitt and Dubner write that this fact is most frequently ignored by the very people who make it their job or devote a considerable amount of their time engaging in ultracrepidarianism -- "the habit of giving opinions and advice on matters outside of one's knowledge or competence."
I like to call it the political equivalency of armchair quarterbacking.
For a politician, it is in their own self-interest to bluff their way through an issue or topic. Admitting ignorance on an issue could significantly damage one's chances at re-election, and lawmakers are often more concerned about getting re-elected than actually doing their jobs. The problem is, faking it can be extremely dangerous, especially if you are in charge of dictating policy.
"Just as a warm and moist environment is conducive to the spread of deadly bacteria," Levitt and Dubner write," the worlds of politics and business especially -- with their long time frames, complex outcomes, and murky cause and effect -- are conducive to the spread of half-cooked guesses posing as fact."
Going to war under false intelligence can cost the lives of thousands of American soldiers for reasons not completely understood by the public. Faking it on hydraulic fracturing (fracking) could have a serious impact on an entire industry, the environment, or public health. Taking a wild guess on economic policies can be a catalyst for future economic crises that would harm most people.
The people who make these guesses often get away with it, too, even if their guesses have a major negative impact on society. By the time everything plays out and people realize that these policymakers actually had no idea what they were talking about, the bluffers are gone or people have long since forgotten about the issue and are focused on whatever the latest scandal is.
Training oneself to think of the broader, bigger picture is easy to do, but it is something not many people are willing to do -- especially politicians and those who claim to be the most politically active. Still, unless we are willing to change the way we approach public discourse in the United States and are willing to admit when we don't know something, nothing is going to change. The policies that are adopted will either have an adverse effect or no effect at all.
After all, if we are not willing to admit what we don't know, it is impossible to learn what we do need to know.