The Curse of Knowledge: A Difficulty in Understanding Less-Informed Perspectives

The curse of knowledge is a cognitive bias that causes people to fail to properly understand the perspective of those who do not have as much information as they do.

For example, the curse of knowledge can mean that an expert in some field might struggle to teach beginners, because the expert intuitively assumes that things that are obvious to them are also obvious to the beginners, even though that’s not the case.

Because the curse of knowledge can cause issues in various areas of life, such as communication with others, it’s important to understand it. As such, in the following article you will learn more about the curse of knowledge, understand why people experience it, and see how you can account for its influence.

 

Examples of the curse of knowledge

The following are examples of common ways in which the curse of knowledge can influence people:

  • The curse of knowledge can make it harder for experts to teach beginners. For example, a math professor might find it difficult to teach first-year math students, because it’s hard for the professor to account for the fact that they know much more about the topic than the students. This aspect of the curse of knowledge, whereby experts struggle to teach in a way that beginners can understand, is sometimes referred to as the curse of expertise.
  • The curse of knowledge can make it harder for people to communicate. For example, it can be difficult for a scientist to discuss their work with laypeople, because the scientist might struggle to remember that those people aren’t familiar with the terminology in the scientist’s field.
  • The curse of knowledge can make it harder for people to predict the behavior of others. For example, an experienced driver may be surprised by something dangerous that a new driver does, because the experienced driver struggles to recognize that the new driver doesn’t realize how dangerous their behavior is. This aspect of the curse of knowledge is associated with people’s expectation that those who are less informed than them will use information that the less-informed individuals don’t actually have.
  • The curse of knowledge can make it harder for people to understand their own past behavior. For example, it can cause someone to think that they were foolish for making a certain decision in the past, even though the information that they had at the time actually strongly supported that decision. This aspect of the curse of knowledge can manifest in various ways and be referred to using various terms, such as the hindsight bias, the knew-it-all-along effect, and creeping determinism.

Note: when it comes to the curse of knowledge, the perspective of the less-informed individual, whether it’s a different person or one’s past self, is often referred to as a naive perspective.

 

The tapping study

One well-known example of the curse of knowledge is the tapping study. In this study, participants were randomly assigned to be either a tapper or a listener. Each tapper finger-tapped three tunes (which were selected from a list of 25 well-known songs) on a desk, and was then asked to estimate the probability that the listener would be able to successfully identify the song that they tapped, based only on the finger tapping.

On average, tappers estimated that listeners would be able to correctly identify the tunes that they tapped in about 50% of cases, with estimates ranging anywhere from 10% to 95%. However, in reality, listeners were able to successfully identify the tune based on the finger tapping in only 2.5% of cases, which is far below even the most pessimistic estimate provided by a tapper, and which therefore represents evidence of the curse of knowledge.

Furthermore, when tappers listened to the experimenter tap out the tunes that they had chosen, they still estimated that around 50% of people would be able to guess the tune that was being tapped by the experimenter, since they themselves had an easy time identifying those tunes. This was the case even when those original tappers were replaced by new people, who had never served either as a tapper or as a listener.

Overall, the tapping study demonstrates how the curse of knowledge can affect people’s judgment. Specifically, it shows that people who know which tune is being tapped have an easy time identifying it, and therefore struggle to accurately predict the perspective of others, who don’t have the same knowledge that they do.

Note: the tapping study was published in the 1990 doctoral dissertation of Stanford student Elizabeth Louise Newton. The dissertation is often referenced using the title “Overconfidence in the Communication of Intent: Heard and Unheard Melodies”. However, the dissertation’s original title appears to have been “The Rocky Road from Actions to Intentions”, and this discrepancy may have occurred as a result of how the dissertation was referenced by Newton’s supervisor, Professor Lee Ross, in a 1991 book chapter.

 

The psychology and causes of the curse of knowledge

The curse of knowledge is attributed to two main cognitive mechanisms:

  • A failure of inhibitory control, which represents people’s inability to fully ignore information that they have when they try to reason about other perspectives.
  • Fluency misattribution, which represents people’s tendency to overestimate the likelihood that information that they know and have processed will also be known and processed by others. This means, for example, that people assume that information that they know is more commonly known by others than is actually the case, or is easier for others to infer or foresee than it really is. This can be attributed to people’s tendency to assume that their subjective fluency with certain information represents that information’s objective fluency.

People’s curse of knowledge can be caused by either of these mechanisms, and both mechanisms may play a role at the same time.

Furthermore, other cognitive mechanisms may also lead to the curse of knowledge. For example, one such mechanism is anchoring and adjustment, which in this case means that when people try to reason about a less-informed perspective, their starting point is often their own perspective, from which they struggle to adjust sufficiently.

All these mechanisms, in turn, can be attributed to various causes, such as the brain’s focus on acquiring and using information, rather than on inhibiting it, which is beneficial in most cases but problematic in others.

In addition, various factors, such as age and cultural background, can influence people’s tendency to display the curse of knowledge, as well as the way and degree to which they display it.

Finally, other psychological concepts are associated with the curse of knowledge. The most notable of these is theory of mind, which is the ability to understand that other people have perceptions, thoughts, emotions, beliefs, desires, and intentions that are different from your own, and that these things can influence people’s behavior. Insufficient theory of mind can therefore lead to an increase in the curse of knowledge, and conversely, a well-developed theory of mind can reduce it.

Note: the term “curse of knowledge” was coined in a 1989 paper by researchers Colin Camerer, George Loewenstein, and Martin Weber. This phenomenon is sometimes also conceptualized as epistemic egocentrism, though some theoretical distinctions may be drawn between these concepts.

 

How to deal with the curse of knowledge

There are several things that you can do to reduce the curse of knowledge:

  • Increase your understanding of the bias and its impact. Specifically, understand what the curse of knowledge is, why it happens, how it affects people, and when and where it’s likely to affect people in general, and you in particular.
  • Maintain awareness of the bias and its impact. For example, in situations where you’re likely to be influenced by the curse of knowledge, remind yourself that others might not know everything that you do, and that it’s important to keep this in mind when considering their perspective.
  • Identify the perspectives involved and the differences between them. For example, you can ask yourself what common knowledge you share with others, and what unique knowledge you have that others don’t.
  • Clearly identify what knowledge is needed. For example, if you’re angry at someone for not apologizing after acting in a way that hurt you, even though they might not even know that they hurt you, ask yourself what information they would need to have in order to know that they hurt you, and consider whether they actually have that information.
  • Get feedback from relevant individuals. For example, if you’re a teacher, you can ask students if they understood what you said, or ask questions that check their understanding. Similarly, if you’re talking to someone about a complex topic for the first time, you can start by asking them what they know, so you can properly gauge their level of understanding.
  • When in doubt, assume lack of knowledge. For example, if you’re a researcher who’s giving a talk to people who aren’t experts in your field, assume from the start that they won’t be familiar with highly technical terms, and make sure to either avoid those terms or explain them explicitly. This can be particularly useful when the communication is primarily one-sided, so you can’t get much feedback from your audience while communicating. Furthermore, to facilitate communication and positive relationships with others, you should not only avoid taking it for granted that people know things when there’s any doubt as to whether that’s the case, but you should also avoid implying that they know these things. For example, you shouldn’t say things such as “obviously”, “clearly”, or “of course”, unless you’re absolutely sure that they’re appropriate, and even then it can be better to avoid them.
  • Use other debiasing techniques. For example, you can use various general debiasing techniques, such as slowing down your reasoning process and improving your decision-making environment. In addition, you can use debiasing techniques that are meant to reduce egocentric biases, such as visualizing the perspective of others and then adjusting your judgment based on this, or using self-distancing language (e.g., by asking “are you teaching in a way that the students can understand?” instead of “am I teaching in a way that the students can understand?”).

You can also use these techniques to help other people reduce their curse of knowledge in various ways. For example, you can teach people directly about the curse of knowledge and about these techniques for reducing it, or you can help them implement these techniques indirectly, for instance by encouraging them to visualize the perspective of others.

However, keep in mind that none of these techniques are guaranteed to work perfectly in every situation. This means, for example, that some techniques might not work for some individuals in some circumstances, or that even if a certain technique does work, it will only reduce someone’s curse of knowledge to some degree, but won’t eliminate it entirely.

Finally, note that accounting for the curse of knowledge can be beneficial when it helps you understand and predict people’s behavior, including your own, even in situations where you don’t reduce this bias. For example, understanding the curse of knowledge can help you select instructors more effectively, by realizing that an instructor with more expertise could actually be worse at teaching beginners than a less expert one.

Overall, you can reduce the curse of knowledge in various ways, including increasing your understanding and awareness of this bias, identifying the perspectives involved and the differences between them, getting feedback from relevant individuals, and using general debiasing techniques, such as slowing down the reasoning process. You can also use these techniques to reduce other people’s curse of knowledge, and you can benefit from accounting for this bias even if you don’t reduce it, such as when predicting people’s teaching ability.

 

Related biases

The curse of knowledge is considered to be a type of egocentric bias, since it causes people to rely too heavily on their own point of view when they try to see things from other people’s perspective. However, an important feature of the curse of knowledge, which differentiates it from some other egocentric biases, is that it is asymmetric, in the sense that it influences those who attempt to understand a less-informed perspective, but not those who attempt to understand a more-informed perspective.

The curse of knowledge is also associated with various other cognitive biases, such as:

  • The illusion of transparency, which causes people to overestimate the degree to which their thoughts and emotions are apparent to others.
  • The empathy gap, which causes people to struggle to understand mental states that are different from their present state, or to struggle to consider how such states affect people’s judgment and decision-making.
  • The false consensus effect, which causes people to overestimate the degree to which their beliefs, values, characteristics, and behaviors are shared by others.

 

Summary and conclusions

  • The curse of knowledge is a cognitive bias that causes people to fail to properly understand the perspective of those who do not have as much information as they do.
  • For example, the curse of knowledge can mean that an expert in some field might struggle to teach beginners, because the expert intuitively assumes that things that are obvious to them are also obvious to the beginners, even though that’s not the case.
  • In addition to making it difficult for some experts to teach beginners, the curse of knowledge can negatively influence people in other ways, such as by making it harder for them to communicate effectively, predict other people’s behavior, and understand their own past behavior.
  • You can reduce the curse of knowledge in various ways, including increasing your understanding and awareness of this bias, identifying the perspectives involved and the differences between them, getting feedback from relevant individuals, and using general debiasing techniques, such as slowing down the reasoning process.
  • You can also use these techniques to reduce other people’s curse of knowledge, and you can benefit from accounting for this bias even if you don’t reduce it, such as when predicting people’s teaching ability.