“I’m Right, You’re Wrong; Here’s Why”

When You Are Aware of the Biases the Human Mind Is Prone To, You Are Less Likely to Commit Errors in Reasoning

Discourse, argumentation, debate, and their cognates are cornerstones of judgment and decision-making. Discourse shapes the decisions we make as individuals on a micro level, in our interactions with friends, family, peers, and colleagues. It also shapes decisions on a macro level: the decisions individuals make on behalf of society, as in government and legislation. I would go further and propose that our legal systems, our political systems, and even science itself are founded on and perpetually sustained by discourse; none of them would exist or operate as they do if we lacked the capacity for rational debate.

However, despite the clear importance and ubiquity of discourse in our lives, discourse as a subject of research has been divided among several research communities spread across multiple disciplines, leaving us with a fragmented view of the empirical findings. One recent finding, however, stands out as exceptionally relevant to our day-to-day lives, especially in a country with a political climate as heated as Lebanon’s.

In a paper published last year in Cognitive Science, Emmanuel Trouche and his colleagues studied a cognitive mechanism that they aptly named “the selective laziness of reasoning”. In this illuminating study, participants were presented with logical puzzles (enthymematic syllogisms) under a choice-blindness paradigm; they had to solve the puzzles and then report the reasoning behind their answers. Here was the trick: the participants were later shown their own arguments under the guise that they were someone else’s, and they had to evaluate them. The alarming finding was that almost 60% of participants rejected their own arguments when they believed those arguments were someone else’s. Moreover, participants were more likely to reject invalid or incorrect arguments when those arguments were presented as someone else’s. I replicated this experiment during my undergraduate studies in Psychology, and the results were (perhaps unsurprisingly) similar among students at the Lebanese American University.

What does all this mean, and why should we care? There are two main issues I would like to draw attention to. In a country such as Lebanon, where deeply divisive political disputes play out daily, it is important to constantly remind ourselves of this “selective laziness of reasoning”. In any discussion or debate, we have to remember to judge our own arguments with the same criticality and scrutiny we apply to the arguments of others, because as the study above and myriad others have shown, human reasoning is not as logical and objective as we wish it to be. This might seem like a simple matter, but it is of paramount importance to improving not only our own lives but the lives of everyone in our society.

The second issue, which is rather implicit in the original research, is what is known as the “belief bias effect”. When participants were presented with arguments that were not their own under the guise that they were, they reinterpreted the incorrect information in front of them to fit what they assumed were their own beliefs. This cognitive bias has severely harmful implications: it reveals a crucial flaw in human critical thinking, whereby we set aside our own knowledge, which we presumably worked hard to acquire, and “re-reason” the information in front of us to fit what we want to believe.

I would like to conclude with three “simple” suggestions for anyone reading this article to keep in mind: (1) There is a human tendency, through no fault of our own, to be less critical of our own arguments than of other people’s; so try to apply the same level of scrutiny to your own arguments and beliefs that you would to those of others. (2) The human mind is prone to errors in critical thinking, so we must scrutinize every piece of information we see through as objective a lens as we can. And finally, (3) do not fall into the trap of performing mental gymnastics just to avoid changing your views and beliefs; perhaps the new information you are presented with truly does conflict with your existing notions of a matter. Do not reinterpret the information in front of you to suit your current stance; instead, use it to strengthen your stance by discarding flimsy arguments.




Disclaimer: The views and opinions expressed in this article are those of the author(s) and do not necessarily reflect the policy or position of the site administration and/or other contributors to this site.


Daniel Layne Ganama

My primary focus is exploring and analyzing social phenomena through the lens of social cognition, from explanations involving in-group and out-group relations to power dynamics. In my free time I tweet, play video games, drink wine, and make regrettable drunken Amazon purchases.