Book Summary: The Intelligence Trap
This is a summary of some of the main points from the book The Intelligence Trap by David Robson (W. W. Norton & Company, 2019).
The main theme of the book is that even smart people make mistakes in logic. In fact, there are many kinds of logical errors that occur *more often* in well-educated people. Robson presents a number of useful ideas from psychology to support and explain his thesis.
Today, we blame conspiracy theories on ignorance. But in a way, the problem of “smart people” getting things wrong is a more serious issue than general scientific ignorance.
Some people with high IQs seem to latch onto bizarre theories, and their intelligence, combined with modern social media, helps them spread misinformation to the masses.
The book is a well-written explanation of many important ideas from psychology and human behavior. We like to think that “ignorance” is the biggest single threat to modern society – but it might be “people who fail to use their intelligence in a logical way.”
The principal problem Robson identifies: intelligence doesn’t equal rational thinking or wisdom. Robson draws on current research in psychology to point out four main issues:
- Intelligence (as measured by IQ tests) doesn’t equip you to solve all problems. For some problems, you need experience (and “tacit knowledge” about the problem area). And sometimes in order to analyze a problem properly, you need skills in “counterfactual thinking” (using your imagination: setting up a plan in your mind so you can think through the potential consequences of your actions).
- Blind spots and “cognitive bias” can cause us to make mistakes – and perpetuate our flawed reasoning.
- Experts often fall into a special trap: too much confidence in their own judgment = “earned dogmatism.” They worked so hard to get where they are... so they don’t believe they can be wrong. (Medical doctors want to give a definitive diagnosis, even in difficult cases with contradictory symptoms.)
- Experts do not always use their expertise effectively. Experts may make errors because of laziness, carelessness, overwork, or arrogance. Experts often rely on “entrenched, automatic behaviors” without careful analysis, which sometimes leads to disastrous errors.
Throughout the book, Robson tries to explain why even “intelligent” people believe weird things.
He concludes that even if we are very intelligent, we don’t always use our intelligence well. We might be fooled by complex or confusing data, we might make logic mistakes, or we might be lazy and not think things through.
Robson offers a number of techniques for avoiding various forms of the intelligence trap.
Techniques for avoiding the intelligence trap
- “evidence-based wisdom” = seek out and absorb information that contradicts your initial view (consider other people’s perspectives), use counterfactual thinking (recognize the different ways events might unfold), recognize the likelihood of change, search for compromise, and practice intellectual humility (awareness of the limits of our knowledge and of uncertainty) [from Igor Grossmann - Univ. of Waterloo]
- try to understand your own bias blind spots (where you have been fooled before)
- self-distancing (describe yourself and your problems in the third person) [from Ethan Kross - Univ. of Michigan]
- when you make an estimate, also include an estimate of your confidence in the estimate; see the scoring sketch after this list [from the Good Judgment Project, Philip Tetlock – book “Superforecasting”]
- develop “reflective skills” (improve your ability to reflect on your current feelings, and develop a richer vocabulary for emotions) == “interoception” (the ability to gauge your own emotions, and to use that knowledge to understand your own intuition and decision making)
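As a side note on the forecasting tip above: the Good Judgment Project scores forecasts with the Brier score, which rewards well-calibrated confidence. Below is a minimal Python sketch (my own illustration, not code from the book) of how recording a confidence level with each estimate lets you measure your calibration after the fact:

```python
# Brier score: mean squared error between stated confidence and outcome.
# A perfect forecaster scores 0.0; always guessing 50% scores 0.25.
def brier_score(forecasts):
    """forecasts: list of (confidence, outcome) pairs, where confidence is
    the stated probability (0.0-1.0) that an event will happen, and outcome
    is 1 if it actually happened, 0 if it did not."""
    return sum((conf - outcome) ** 2 for conf, outcome in forecasts) / len(forecasts)

# Hypothetical example: three estimates, each recorded with a confidence level.
my_forecasts = [
    (0.9, 1),  # said 90% likely, and it happened
    (0.7, 0),  # said 70% likely, but it did not happen
    (0.6, 1),  # said 60% likely, and it happened
]
print(f"Brier score: {brier_score(my_forecasts):.3f}")  # lower is better
```

Tracked over many forecasts, this one number tells you whether your confidence estimates actually carry information... which is exactly the habit Tetlock’s superforecasters practice.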
Robson also gives some good advice about how we can improve our “reflective skills”:
- mindfulness meditation (a single 15-minute mindfulness session can reduce the incidence of sunk cost bias by 34%; mindfulness also reduces myside bias, and people become more receptive to criticism)
- musicians and dancers have been found to have more fine-tuned interoception
- training sessions to teach people how to describe their own emotions
- incorporate “reflective steps” into a checklist... a process where you write down your initial “expert” evaluation, but then reflect to look for biases
Fake news and conspiracy theories
Robson also has a long chapter on fake news, conspiracy theories, how cognitive biases contribute to believing lies, and how difficult it can be to “debunk” false beliefs.
Robson gives the following arguments about our failure to stop fake news:
- Don’t “debunk myths” by repeating the myth first and then explaining the facts. This actually reinforces the lies for many people – by putting too much emphasis on the misinformation.
- The main reason: most lies and conspiracy theories are designed as a “fluent story” – easy to remember, difficult to dislodge. Many attempts to debunk are just too awkward.
- Better to lead off with “the truth” in a simple and direct form: “Flu vaccines are safe and effective.” (Avoid repeating the myth entirely; focus on the scientifically proven, positive benefits. Most organizations are too “earnest” in presenting the facts, so they overcomplicate their message and reduce its fluency. Present your facts selectively... two facts can be more powerful than ten.)
[I remember using this myth-debunking technique in the 2020 election campaign. Certain candidates were attacking mail-in voting as a potential source of fraud. When I worked on a local candidate’s phone bank, our script included answers to questions about mail-in voting. Our standard first line was: “Mail-in voting in New Jersey is safe and secure.” (A simple and fluent message, as fluent as the fraudulent fake news.)]
Robson talks about the work of Gordon Pennycook [Univ. of Waterloo] on decreasing your own “bullshit receptivity” – you need to improve your ability to resist inaccurate or fraudulent data.
- Pennycook ran studies on “nonsense pseudo-profound philosophy statements” such as “Hidden meaning transforms unparalleled abstract beauty.” People with an analytical mindset were in a better position to call them nonsense. Pennycook also studied the reaction to news headlines (factual stories and fake news) – people with greater cognitive reflection were better at telling the difference.
Robson points to some attempts to “inoculate” people against bullshit... making them better equipped to spot other forms of misinformation in the future. The idea is to set up red flags in our minds, warning signs that trigger our analytical thinking when we need it.
- John Cook and Stephan Lewandowsky studied the ability to debunk false stories about climate change. Before showing participants misinformation about climate change, they led an initial discussion about the tobacco industry’s efforts in the 1970s to use “fake experts” to block tobacco regulation. This warmup helped participants identify the same manipulation techniques in fake news about climate change.
- Robson also suggests that “critical thinking classes” are useful, but the training should be about more than just philosophy. The training is more effective if the focus is on real-life examples of how people try to fool us. (This makes me think of the classic book How to Lie With Statistics by Darrell Huff... one of my all-time favorites because it has so many examples of bogus statistics – it is good training for reading advertisements.)
Other topics
The remaining sections of Robson’s book explore two other areas: techniques and principles that can help people use their minds better, and the common properties of successful and unsuccessful teams.
One section explains the elements of Carol Dweck’s theory of “growth mind-set.”
- The idea is that some people believe their talents are innate and unchanging (fixed mind-set), whereas people who believe in challenging themselves and expanding their talents tend to do better in the world.
Intelligent people often limit themselves: their fixed mind-set stops them from trying to learn new things because they might fail.
Robson notes that growth mind-set is good to have... it helps us learn more effectively.
- Robson believes that growth mind-set also helps us avoid traps like “motivated reasoning,” because we are less likely to fall for “dogmatic, one-sided reasoning.”
If we are confident that we can continue to “learn” productively, we are more likely to “reason” wisely.
- Robson concludes with some tips for improving memory and our ability to study and retain facts.
Finally, Robson has a short section on effective and not-so-effective teams. It is the weakest section of the book: many glorious examples of overachieving sports teams and horror stories of clueless management teams.
But there are a few important ideas hidden in Robson’s examples:
- The notion of “collective intelligence” (an idea from Anita Williams Woolley at Carnegie Mellon University), which assesses the ability of a group to generate new ideas, choose potential solutions, negotiate with other team members, and manage the execution of key tasks. In good teams, the team members collaborate well and seem to be better at reading the emotions of their teammates. In bad teams, teamwork seems to be pulled down by internal team conflict or leaders silencing other members of the team.
- Some leaders seem to be most successful by focusing on helping others to succeed. Good leaders and coaches may see themselves as a “servant” to the team. The humility of the leader can push everyone to work a little harder to support their teammates.
- Some team failures can be explained by “functional stupidity” (Mats Alvesson and André Spicer - book “The Stupidity Paradox”). It is a special intelligence trap, where team members have an incentive *not* to think: to just “go with the flow” rather than challenge management’s assumptions or point out possible negative consequences in the future.
In a competitive environment, team members may choose to play along with a leader’s errors to get a promotion. Team members might fight to defend a narrow choice benefitting one department over the good of the company as a whole. And of course, there are always team members who choose to bring only *good news* to the boss.
There are some ways to build more collective intelligence, all of them related to communication and sharing.
Of course, decision reviews, post-mortems, and group retrospectives are a good way to promote learning and thinking.
Also, inviting staff members in from other organizations to shadow your staff and ask questions can help overcome biases and blind spots.
Other approaches can help as well: training in critical thinking, improving the “group-level” growth mind-set, and assessing the amount of time pressure on important decisions.