People aren't as rational as we would like to think.
From attentional bias — where someone focuses on only one or two of several possible outcomes — to zero-risk bias — where we place too much value on reducing a small risk to zero — the sheer number of cognitive biases that affect us every day is staggering.
Understanding these biases is key to suppressing them — and needless to say, it is good to try to be rational in most cases. How else can you have any sort of control over investments, purchases, and all other decisions that you make in your life?
To convey the breadth of cognitive biases, we've picked out 57 of the most notable ones from a much longer list on Wikipedia. [Aimee Groth contributed to an earlier version of this article.]
Where people overestimate the importance of information that is available to them.
One example would be a person who argues that smoking is not unhealthy on the basis that his grandfather lived to 100 and smoked three packs a day, an argument that ignores the possibility that his grandfather was an outlier.
A bias in which people draw faulty conclusions based on what they already believe or know. For instance, one might reason that all tiger sharks are sharks and all sharks are animals, and conclude — invalidly — that all animals are tiger sharks.
A bias in which you think positive things about a choice once you've made it, even if that choice has flaws. You may say positive things about the dog you just bought and ignore the fact that it bites people.
This is the tendency to see streaks or clusters in random events. A gambler who watches red come up multiple times in a row on a roulette table may erroneously conclude that red is "hot." In a related bias, known as the gambler's fallacy, the gambler may instead conclude that black is particularly likely to come up because it hasn't come up in a while. In fact, each spin is independent and the results are always random.
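The independence point can be checked with a quick simulation (an illustrative sketch; the wheel model, seed, and numbers are my own, not from the article): on a fair European wheel, 18 of 37 pockets are red, and a streak of reds tells you nothing about the next spin.

```python
import random

random.seed(42)
RED_P = 18 / 37  # probability of red on a single spin of a fair European wheel

# Simulate 200,000 independent spins (True = red)
spins = [random.random() < RED_P for _ in range(200_000)]

# Outcomes immediately following a streak of 5 reds
after_streak = [spins[i] for i in range(5, len(spins))
                if all(spins[i - 5:i])]

overall = sum(spins) / len(spins)
conditional = sum(after_streak) / len(after_streak)

print(f"P(red) overall:      {overall:.3f}")
print(f"P(red) after 5 reds: {conditional:.3f}")
# Both frequencies hover around 18/37 ≈ 0.486 — the streak carries no information.
```

However many reds have just appeared, the conditional frequency matches the unconditional one, which is exactly what the gambler's intuition gets wrong.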
Where people weight prior evidence more heavily than new evidence or information that has emerged. People were slow to accept the fact that the earth was round because they tended to believe earlier information that it was flat.
When people who are smarter or better informed cannot understand the common man. For instance, in the TV show "The Big Bang Theory" it's difficult for scientist Sheldon Cooper to understand his waitress neighbor Penny.
Where people in one emotional or physical state fail to understand people in another. If you are happy, you can't imagine why people would be unhappy. When you are not sexually aroused, you can't understand how you act when you are.
When weak but consistent data leads to confident predictions. As one commenter noted on the MIT admissions blog:
Why is MIT's admissions process better than random? Say you weeded out the un-qualified (the fewer-than-half of applicants insufficiently prepared to do the work at MIT) and then threw dice to stochastically select among the remaining candidates. Would this produce a lesser class?
Investing more money or resources into something based on prior investment, even if you know it's a bad one. "I already have 500 shares of Lehman Brothers, let's buy more even though the stock is tanking."
The tendency to put more emphasis on negative experiences rather than positive ones. People with this bias feel that "bad is stronger than good" and will perceive threats more than opportunities in a given situation.
Our expectations unconsciously influence how we perceive an outcome. Researchers, for example, looking for a certain result in an experiment, may inadvertently manipulate or interpret the results to reveal their expectations. That's why the "double-blind" experimental design was created for the field of scientific research.
People take action in response to extreme situations. Then, when the situations become less extreme, they take credit for causing the change, when a more likely explanation is that the situation was reverting to the mean.
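This effect can be shown with a small simulation (a hypothetical sketch with my own made-up numbers): model each score as stable skill plus random luck, select the worst performers in round one, and re-measure them. They improve on average with no intervention at all — pure regression to the mean.

```python
import random

random.seed(0)
N = 50_000

# Each person's score = fixed skill + independent luck on each round
skill = [random.gauss(0, 1) for _ in range(N)]
round1 = [s + random.gauss(0, 1) for s in skill]
round2 = [s + random.gauss(0, 1) for s in skill]

# Pick the bottom 10% of round-1 performers
cutoff = sorted(round1)[N // 10]
worst = [i for i in range(N) if round1[i] <= cutoff]

avg1 = sum(round1[i] for i in worst) / len(worst)
avg2 = sum(round2[i] for i in worst) / len(worst)

print(f"bottom 10%, round 1: {avg1:.2f}")
print(f"same people, round 2: {avg2:.2f}")
# Round-2 average sits noticeably closer to the overall mean of 0,
# even though nothing was done to "fix" these performers.
```

Anyone who "intervened" on the bottom group between rounds would be tempted to take credit for an improvement that the noise alone guarantees.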
Over-reliance on expert advice. This has to do with the avoidance of responsibility. We call in "experts" to forecast, when in fact they have no greater chance of predicting an outcome than the rest of the population. In other words, "for every seer there's a sucker."
An error that comes from focusing only on surviving examples, causing us to misjudge a situation. For instance, we might think that being an entrepreneur is easy because we haven't heard of all of the entrepreneurs who have failed.
It can also cause us to assume that survivors are inordinately better than failures, without regard for the importance of luck or other factors.
We overuse common resources because it's not in any individual's interest to conserve them. This explains the overuse of natural resources, opportunism, and any acts of self-interest over collective interest.