(Please forgive the poor video quality; the only way I could get the video under 10 minutes was to convert and clip the last few minutes.)
Previously, I've shared thoughts on decision making. This time I'll cover an interesting neurological side effect of our conscious minds: cognitive biases.
A cognitive bias is a pattern of deviation in judgment that occurs in particular situations (see also cognitive distortion). Implicit in the concept of a "pattern of deviation" is a standard of comparison; this may be the judgment of people outside those particular situations, or a set of independently verifiable facts. The existence of some of these cognitive biases has been verified empirically in the field of psychology; others are widespread beliefs that may themselves be a consequence of cognitive bias.
Cognitive biases are instances of evolved mental behavior. Some are presumably adaptive, for example, because they lead to more effective actions or enable faster decisions. Others presumably result from a lack of appropriate mental mechanisms, or from the misapplication of a mechanism that is adaptive under different circumstances.
Prompted by Mark Davidson's post on FriendFeed, I happily dug through the list of cognitive biases on Wikipedia. The question it inspired: can simply being aware of our own cognitive biases help us make more effective decisions? Aren't our biased decision-making processes the essence of our persona, our style, and our mind? Even if we were fully aware of all of our biases, we'd still need to know precisely how much they weigh on our decisions in order to be less affected by them.
Cognitive biases are not only part of what defines us; in a way, they are a form of mild self-delusion, or self-deception. Evolutionary forces pushed our minds to develop these shortcuts to speed up decision making. We are forced to make many decisions in our lifetimes, and a decision that is a close call or relatively unimportant is easily resolved in favor of one of our cognitive biases. What we'd like to avoid is making heavily biased decisions on a regular basis when it is in our best interest to make them with minimal self-bias. While I'm not confident that this can be accomplished, I do see value in being cognizant of some commonly described biases.
The following lists show how I like to cluster some of the cognitive biases from Wikipedia's list.
Related to self or familiarity bias:
- Choice-supportive bias — the tendency to remember one's choices as better than they actually were.
- Confirmation bias — the tendency to search for or interpret information in a way that confirms one's preconceptions.
- Déformation professionnelle — the tendency to look at things according to the conventions of one's own profession, forgetting any broader point of view.
- Experimenter's or Expectation bias — the tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.
- Wishful thinking — the formation of beliefs and the making of decisions according to what is pleasing to imagine instead of by appeal to evidence or rationality.
- Selective perception — the tendency for expectations to affect perception.
- Availability heuristic — estimating what is more likely by what is more available in memory, which is biased toward vivid, unusual, or emotionally charged examples.
- Clustering illusion — the tendency to see patterns where none actually exist.
- Mere exposure effect — the tendency for people to express undue liking for things merely because they are familiar with them.
Incorrect weighting of available information:
- Extraordinarity bias — the tendency to value an object more than others in the same category as a result of an extraordinarity of that object that does not, in itself, change the value.
- Focusing effect — prediction bias occurring when people place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.
- Framing — using too narrow an approach or description of the situation or issue. See also the framing effect — drawing different conclusions based on how the same data are presented.
- Zero-risk bias — preference for reducing a small risk to zero over a greater reduction in a larger risk.
- Anchoring — the tendency to rely too heavily, or "anchor," on a past reference or on one trait or piece of information when making decisions.
- Attentional bias — neglect of relevant data when making judgments of a correlation or association.
- Authority bias — the tendency to value an ambiguous stimulus (e.g., an art performance) according to the opinion of someone who is seen as an authority on the topic.
- Gambler's fallacy — the tendency to think that future probabilities are altered by past events, when in reality they are unchanged. Results from an erroneous conceptualization of the law of large numbers. For example: "I've flipped heads with this coin five times in a row, so the chance of tails on the sixth flip is much greater."
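The gambler's fallacy in particular is easy to dispel numerically: each flip of a fair coin is independent, so the chance of tails after five straight heads is still one half. A quick simulation sketch (the function name and parameters are mine, not from any source above) makes the point:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def tails_after_heads_streak(streak=5, trials=1_000_000):
    """Simulate fair coin flips and estimate P(tails) on a flip
    that immediately follows `streak` consecutive heads."""
    following_flips = 0
    tails = 0
    run = 0  # current count of consecutive heads
    for _ in range(trials):
        heads = random.random() < 0.5
        if run >= streak:          # the previous `streak` flips were all heads
            following_flips += 1
            if not heads:
                tails += 1
        run = run + 1 if heads else 0
    return tails / following_flips

print(tails_after_heads_streak())  # hovers around 0.5, not "much greater"
```

However long the streak you condition on, the estimate stays near 0.5 — the coin has no memory of its past flips.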
The lazy or momentum biases:
- Semmelweis reflex — the tendency to reject new evidence that contradicts an established paradigm.
- Status quo bias — the tendency for people to like things to stay relatively the same (see also loss aversion, endowment effect, and system justification).
And a few more of my favorite biases:
- Information bias — the tendency to seek information even when it cannot affect action.
- Impact bias — the tendency for people to overestimate the length or the intensity of the impact of future feeling states.
- Reactance — the urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice.