Why We Are Easily Deceived

In last week’s article I looked at how self-deception, something we engage in on a daily basis concerning a variety of potential threats, can put our personal safety at risk. In this article I want to look at some of the reasons why, as a species, we are so susceptible to other people’s deception. We may like to believe that we are skilled at identifying deception (a belief that is partly a product of our own ability to deceive ourselves, which is part of the problem), but in reality this is not the case i.e., on average we are no better than chance at identifying deception; we would do as well to toss a coin as to trust our intuition when deciding whether to believe someone. Our confidence in our ability to identify deception has no bearing on our ability to actually do so. Strangely, those in law enforcement and those who work in the criminal justice system become less adept at identifying deception the more experience they have dealing with those who engage in it; statistically, a rookie cop is better at discerning whether someone is telling the truth than a more experienced colleague. It would seem that over time we learn to be deceived better than we intuitively learn to identify deception. This means that we can’t rely on our natural abilities to identify deception, but rather need to learn how to discern, dissect and identify what is actually the “truth”.

To understand why we are so bad at uncovering deception, we need to understand why it benefits us to believe what other people say and do. The human mind has evolved in a way that allows us to make quick decisions and take advantage of opportunities as they present themselves; this can be at the expense of a full and thorough analysis of all possible eventualities. To be able to do so, we must work from the premise that by and large things are how they seem, and that what is being presented to us is “real” and “true” i.e., we have a truth bias. If we were to fully question every choice we were presented with, and make a thorough investigation before every decision, we’d get nothing done. Instead, we do a quick search for the things deemed most relevant and important, and base our decisions on those. However, this quick and dirty approach may mean that we fail to take into account something that is actually relevant, because it isn’t seen as particularly pertinent when we take our initial look at things. Working this way has proved to be largely beneficial i.e., statistically, a quick decision based on a few significant factors rewards us more than a slower, more in-depth study of all the variables. When considering any evolutionary advantage, it should be recognized that such advantages benefit the species as a whole rather than the individual in each specific situation e.g., running away from a danger may have a 95% success rate, making it a good general strategy when confronted by a threat; however, it isn’t a perfect strategy, and there may be 5% of situations where it is detrimental to those employing it, such as when a person runs into traffic, off a cliff, or towards a greater danger. From a species perspective, a 5% failure rate is probably acceptable; if a particular immediate response is able to “save” 95% of its members, then the loss can be tolerated. Having a truth bias works well for us most of the time, and so there is a benefit to believing what people say, rather than questioning them.

Whilst the mind never fully stops developing, by our later teenage years many of our views concerning the way in which the world works have become firmer and more rigid. Our search for “new” information at this point and beyond is largely a search for confirmation of our views and beliefs, rather than a challenge to them. This can make us extremely susceptible to deception e.g., if a person presents deceptive information that conforms to our belief systems, we are more likely to believe and accept it than to question it. This is why certain people have “blind spots” that others can exploit. Those who are skilled in deception will often test for these e.g., they may appeal to a perceived “shared/common” value system in order to make themselves appear trustworthy. We are more likely to put trust in a person, and hand over control of a situation to them, if we believe that they think like us, and possibly are like us. Such confirmation biases usually lead to no negative consequences; however, there are times when we may want to question whether someone is using these biases against us. We should also acknowledge that we have a tendency to work with availability heuristics e.g., if a memory of something that resembles the situation we are now facing comes quickly to mind, we are more likely to believe it, because it is closer to our conscious mind than something we have to think about and search for i.e., our “speed” of thinking can be to our detriment as well as to our advantage.

Whilst it would be easy to judge our inability to recognize deception as a major evolutionary flaw, the positives of having a “truth bias” overall outweigh the disadvantages. However, by understanding how our truth and confirmation biases work, along with things such as the availability heuristic, we should recognize that there are times when we should be more inquisitive and question the information we are being provided with. If we can acknowledge to ourselves that we are not good at detecting deception, and that this is an innate part of our character, we may stop relying wholly on “intuition” to keep us safe.

Gershon Ben Keren

Gershon Ben Keren is a criminologist, security consultant and Krav Maga Instructor (5th Degree Black Belt) who completed his instructor training in Israel. He has written three books on Krav Maga and was a 2010 inductee into the Museum of Israeli Martial Arts.
