Some time ago it was commonly accepted that people tend to make decisions rationally. Of course, some argued that “rationally” is a problematic notion: what about religious belief, or the problem of knowledge in general? Those points are fair, but they barely dented the hubris of scientifically literate people and experienced practitioners. Even in light of the knowledge problem, such people believed they were making rational decisions.
Let me ask you this: my neighbor, Steve, is a very shy and withdrawn person. He is always willing to help, but at the same time he isn’t very interested in people or the world around him. Overall, he is a decent and humble man with a need for structure and order, and he is very attentive to detail. What do you think the chances are of Steve being a librarian versus a farmer?
You probably did what most people do in such situations: you used the representativeness heuristic (a simplified rule of thumb for judgment, based on subjective experience) and concluded that Steve is most likely a librarian. That is the natural response of our intuitive thinking, whether you are a scientist, a medical doctor, a practically minded worker, a lawyer, an educated CEO, a programmer, or… a Jersey-Shore-like character. It is also an obvious mistake if you think about it for a while. In every country there is a certain ratio of farmers to librarians; let’s make the plausible assumption that it is well above 1:1, say 20:1. What our intuition just did was ignore this base rate entirely and replace it with the representativeness heuristic: the description fits the stereotype of a librarian, so we naturally presume that because it is representative, it must also be more probable.
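The base-rate point above can be made concrete with Bayes’ rule. A minimal sketch follows; the 20:1 ratio comes from the text, while the likelihoods (how often the description fits a librarian versus a farmer) are my own assumed numbers, chosen to be generous to the stereotype:

```python
def posterior_librarian(farmers_per_librarian, p_desc_given_librarian, p_desc_given_farmer):
    """Probability that Steve is a librarian given the description (Bayes' rule)."""
    p_librarian = 1 / (1 + farmers_per_librarian)  # prior from the base rate
    p_farmer = 1 - p_librarian
    numerator = p_desc_given_librarian * p_librarian
    denominator = numerator + p_desc_given_farmer * p_farmer
    return numerator / denominator

# Assumed likelihoods: the description fits 90% of librarians but only 20% of farmers.
p = posterior_librarian(20, 0.9, 0.2)
print(round(p, 3))  # roughly 0.18: even a very "representative" Steve is probably a farmer
```

Even with a description that strongly favors the librarian stereotype, the 20:1 base rate dominates and leaves Steve far more likely to be a farmer.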
There are many more such intuitive heuristics, and don’t get me wrong, these simplified rules are very useful, but sometimes they introduce serious systematic errors into our decision making. One of them is the illusion of validity: the more representative the input looks, the more confident we become in our predictions, regardless of how little evidence we actually have. Even scientists who know perfectly well the dangers of this practice still make the mistake.
Next time you make a decision, take a moment and ask yourself: is your judgment based on objective probabilities, or are you content with a very small but representative-looking piece of data?
If you want to know more about such heuristics, look up Daniel Kahneman’s work on the subject. You can also like, share, or comment to let me know if you want more of this kind of content here.
– Przemek Kucia