Introduction

“Security is both a feeling and a reality. And they’re not the same”[1] is the opening statement of Bruce Schneier’s paper on the psychology of security. Rightly so, as people’s perceptions of security often differ from the objective reality of security. For example, the world is becoming an increasingly unsafe place. At least, that is the message we get when listening to the speeches of world leaders, or reading their foreign and defence policy strategies.[2] For instance, the Dutch Integrated International Security Strategy 2018-2022 mentions that the world has become more insecure in several respects, such as shifts in the geopolitical balance of power, increasing instability and insecurity around Europe and the Caribbean parts of the Kingdom, and a rise in hybrid conflicts and tensions.[3]

However, national and international research has demonstrated that the world has actually become a safer place if we look at the long-term trend: the chance of being killed by violence has decreased significantly over the last centuries and decades. This is not a neat, always declining trend line; peaks like the one in 2014 may occur. Nevertheless, the overall trend is still declining.[4] Another example that substantiates the claim that humans’ perceptions can differ from reality can be derived from the study conducted by Hans Rosling in his book Factfulness: ten reasons we’re wrong about the world – and why things are better than you think. In this book, Rosling shows, on the basis of 13 factual questions about the current state of the world, that people, including highly educated people, get most of the answers wrong. More worrisome is that a majority of people even got worse results than if they had picked answers at random: “chimpanzees, by picking randomly, would do consistently better than the well-educated, but deluded human beings”[5]. He goes on to state that “every group of people I ask thinks the world is more frightening, more violent, and more hopeless – in short, more dramatic – than it really is”[6]. Hans Rosling is only one of many who signal these positive trends. Steven Pinker is another[7], and Max Roser collects all kinds of data to document the slow but long-lasting positive developments.[8]

All these researchers and data analysts show us that people’s subjective perceptions of the world may diverge significantly from reality. In addition to the gap between objective security (reality) and subjective security (feeling) as experienced by people, a disparity in security perceptions can also be identified at the country level. For example, the HCSS comparative study of foresight reports in ten different countries demonstrates that security perceptions, and what is regarded as a threat to national security, can differ significantly between the states under review.

The examples from the previous paragraphs are manifestations of a broader trend: the existence of a discrepancy between the reality of security and the feeling, and thus the perception, of security. Hence, the question arises how the mismatch between these realities and perceptions of security can be explained. This paper will try to contribute to answering this question. To a certain extent these differences can be explained on the basis of geography, culture and history, but another important part of the explanation can be found in the psychology literature. More specifically, cognitive biases held by elites in those states, i.e. the people who outline and implement policy, can help to clarify why certain policies come about and why certain events are perceived as a threat to national security. Hence, this paper aims to shed light on the influence of cognitive systems and the corresponding heuristics and biases on security perceptions. First, the paper will define key terms, such as perception, heuristics and security. After that, a series of cognitive explanations will be put forward, which contribute to clarifying why perceptions of security may differ significantly. We acknowledge that cognitive elements sometimes also have their roots in cultural and historical factors, but as this goes beyond the scope of this analysis, we will mainly focus on the purely cognitive elements. The paper will conclude with a summary of the main findings.

Definitions

The paper uses the following definitions:

Bias: the difference between actual reality and reality as reported or perceived.[9]

Perception: the act or faculty of perceiving, or apprehending by means of the senses or of the mind; cognition; understanding. Immediate or intuitive recognition or appreciation, as of moral, psychological, or aesthetic qualities; insight; intuition.[10] Perception is a thought, belief or opinion, often held by many people and based on appearances.[11]

Security is both a feeling and a reality. The reality of security is mathematical, based on the probability of different risks and the effectiveness of different countermeasures. The feeling of security is based on your psychological reactions to both risks and countermeasures.[12]

Heuristics are simplifying ‘rules of thumb’ that people use to make difficult judgements. Reliance on a heuristic can cause predictable biases (systematic errors) in people’s predictions.[13]

Cognitive system 1 and 2: Two different modes of thinking. System 1 operates automatically and quickly, with little effort and no sense of voluntary control. It draws conclusions based on previous experiences. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration. System 2 is much slower, more precise and only gets activated when System 1 fails to come up with a fast, suitable answer.[14]

Cognitive biases that influence security perceptions

As mentioned in the introduction of this paper, it is not uncommon that a gap exists between the reality of security and the human perception of security. This divergence is the result of intuitive trade-offs: even though humans are supposed to be good at making security trade-offs, they get them wrong all the time. For instance, Daniel Kahneman, the Nobel Prize-winning psychologist, conducted research on people’s estimates of the principal causes of death, after which he compared the results with the statistics. One of the results was, for example, that people thought hurricanes were bigger killers than asthma, even though asthma caused twenty times as many deaths as hurricanes did.[15] This is only one of many examples showing that people’s estimated risks often deviate from the actual risks, which implies that their intuitive assessments are inadequate. In these cases cognitive system 1 thinks it has a quick, satisfying answer, but it is wrong and system 2 should have been activated.

According to Schneier, there are several aspects of the security trade-off that humans might get wrong: the severity of the risk, the probability of the risk, the magnitude of the costs, how effective a particular countermeasure is at mitigating the risk, and how well disparate risks and costs can be compared.[16] The more human perception deviates from reality in any of these aspects, the more the perceived trade-off will differ from the actual trade-off.[17] This deviation largely stems from biases that are inherent to the human brain. Some of those biases, and how they might affect our feeling of security, are introduced and explained below. These biases can partly explain the different perceptions countries have with regard to security threats.

Availability heuristic

One of the most common biases, a meta-bias that encompasses a lot of different biases, is the ‘availability heuristic’. Daniel Kahneman, in his book ‘Thinking, fast and slow’, defines this as the process of judging frequency by “the ease with which instances come to mind”.[18] The biases that Kahneman describes give a good explanation for the discrepancy between actual and perceived security. The availability heuristic explains, for example, why people are more afraid to fly immediately after a plane crash has occurred and why people buy insurance right after major disasters: the examples remain vivid in people’s memories for some time. In addition, it can explain why people are disproportionately (compared to much more common ways to die) afraid of terrorist attacks: the examples are all over the news and frequently repeated. Consequently, people have a lot of examples fresh in their memories and therefore perceive the chance of a terrorist attack as much higher than it actually is.[19] This heuristic also causes the overestimation of rare events: rare events attract a disproportionate level of attention, and the few instances in which they actually occur are all over the news. Moreover, because people remember those rare events vividly, they estimate them as much more common than they really are.

In this regard, the regime type of states can be of central importance. For example, in democracies, the existence of a free press, broader policy debates, institutional checks and balances, and the involvement of more actors in the decision-making process raise the chances that particular tides of information can correct for cognitive biases.[20]

A bias that is part of the availability meta-bias is the ‘hindsight bias’: “I knew it all along”. The inability to reconstruct past beliefs will inevitably cause you to underestimate the extent to which you were surprised by past events. This bias is closely related to the ‘outcome bias’: we are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact.[21] The hindsight bias also leads us to believe that the future is predictable. After all, in hindsight it seems obvious how the course of events led to the now known outcome. The core of the illusion is that we believe we understand the past, which implies that the future should also be knowable; in fact we understand the past less than we believe we do.[22] In hindsight, we neglect all the signs that pointed in a different direction and all the other outcomes that seemed possible at the time.

The human mind is not perfect. Its ability to reconstruct past states of knowledge, or beliefs that have changed, is very limited. Once you adopt a new view of the world (or of any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed. Your inability to reconstruct past beliefs will inevitably cause you to underestimate the extent to which you were surprised by past events.[23] This could cause countries to put more weight on predictions of the future than they should. Knowing the past is not the same as predicting the future.

Negativity bias vs. positivity bias

Another common bias of the human mind is known as the ‘negativity bias’. It is also related to the availability bias. The negativity bias is seen as a fundamental principle of human cognition, in which negative factors have a greater impact than positive factors across a wide range of psychological phenomena.[24] Why do we have a negativity bias? Psychologists believe that the dominance of bad over good emerged as an adaptive trait to avoid lethal dangers in human evolutionary history.[25] Nowadays, however, it can have side effects that we should be aware of. For instance, the negativity bias can affect the ‘threat sensitivity’ of states: how states identify opportunities and dangers prior to conflict. States react more strongly to negative information indicating potential dangers than to positive information suggesting opportunities.[26] The negativity bias helps explain many critical behaviours in international relations, including the security dilemma, threat inflation, the outbreak and persistence of war, loss aversion, the neglect of opportunities for cooperation, and the prominence of failure in institutional memory and learning.[27]

A bias that at first glance seems to contradict the negativity bias is the ‘positivity bias’: overconfidence in one’s own capacities and abilities, overestimation of one’s control over events, and over-optimism about one’s future prospects.[28] However, this bias can actually go together with the negativity bias, because they apply to different contexts. People privilege negative information about the external environment and other actors, but positive information about themselves. The coexistence of these biases can actually raise the odds of conflict: decision-makers simultaneously exaggerate the severity of threats and are overconfident about their own capacity to deal with the situation – a potential recipe for disaster.[29]

The effect of the negativity bias can be even stronger if it is combined with the halo effect: the tendency to like (or dislike) everything about a person or situation (or country) – including things you have not observed[30] (the negative form of the halo effect is sometimes called the horn effect). This is a common bias that plays a large role in shaping our view of people and situations. We can see this kind of effect, for example, with regard to Russia: everything Russia says or does nowadays is viewed through our negative perception of it. An act or statement that would be considered just fine, or viewed as neutral, if it came from Germany is viewed very negatively and received with suspicion when it is made by Russia. The same halo effect, both positive and negative, can be observed with US President Trump: most people, especially in America itself, either dislike (or even hate) everything he says or does, or love it and will defend him no matter what. There seems to be little room for a balanced view in which positive, negative and neutral statements alike can be judged on their merits.

Another bias that can reinforce both the negativity and the positivity bias is the ‘confirmation bias’. This bias entails the tendency to look for information that confirms people’s pre-existing view of the world or of a particular person, thereby ignoring any piece of information that contradicts those views. People will seek data that are likely to be compatible with the beliefs they currently hold.[31] If we look at US President Trump again: people who dislike everything about him will ignore a smart, sensible decision he makes, while the people who love him will ignore the lies he tells or simply dismiss them as fake news.

Prospect theory vs. utility theory

Another common area in which the feeling of security and the reality diverge is the perception of the severity of a certain risk. A simple but clear example can demonstrate this[32]:

Research subjects were divided into two separate groups. One group was given the following alternatives:

Alternative A: a sure gain of €1000

Alternative B: a 50% chance of gaining €2000

The other group was given the following choice:

Alternative C: a sure loss of €1000

Alternative D: a 50% chance of losing €2000

According to utility theory, which assumes that actors are fully rational and make trade-offs based on a calculation of relative gains and losses, people should choose alternatives A and C with the same probability and alternatives B and D with the same probability. A simple calculation demonstrates that alternatives A and B, and alternatives C and D, have the same expected utility. However, the outcomes of experiments like this one contradict the basic assumptions of utility theory: when faced with a gain, the majority of people prefer a sure gain over a risky gain (alternative A over B), but when faced with a loss, the majority of people prefer the risky loss over the certain loss (alternative D over C). This difference can be explained on the basis of prospect theory. In contrast to utility theory, prospect theory acknowledges that people attach subjective values to gains and losses. This clarifies why people tend to prefer a sure gain over a chance at a greater gain, while a sure loss feels worse than a chance at a greater loss.[33]
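The equal expected values are easy to verify, assuming (as the experiment implies but does not state explicitly) that the risky alternatives yield €0 in the other 50% of cases:

Expected value of A = €1000
Expected value of B = 0.5 × €2000 + 0.5 × €0 = €1000
Expected value of C = −€1000
Expected value of D = 0.5 × (−€2000) + 0.5 × €0 = −€1000

A purely rational, risk-neutral actor would therefore be indifferent between A and B, and between C and D; the systematic preference for A over B and for D over C is precisely what prospect theory sets out to explain.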

The fact that people act this way can be attributed to the framing effect. As the experiment shows, people make different trade-offs when something is presented as a gain than when it is presented as a loss. The choices people make are affected by how the alternatives are framed.[34] Hence, it can be said that when a trade-off is framed in terms of a ‘gain’, people tend to be risk averse, while when trade-offs are framed as a ‘loss’, people tend to be risk seeking. The outcome can also be explained by two different elements. On the one hand, people tend to place more value on changes close to their status quo, in terms of time, than on changes further away from their current state. On the other hand, people attach greater value to something when it is considered a potential loss than when it is a potential gain.[35] This also applies to countries, because countries are led by people.[36] Nowadays we see a trend in which certain countries no longer view the world economy and globalization as a win-win situation, but rather as a zero-sum game. Countries that see the world through a zero-sum lens are likely to take the risk of a bigger loss over what they view as a certain loss. America’s protectionist measures can be partly explained by this.

When applying this to the field of (international) security, two important implications can be distinguished. In the first place, people will trade off more for security that lets them keep what they already possess than they are willing to risk to get it in the first place. For example, a country will invest more in maintaining control over territory it already possesses than over territory it could potentially acquire. Secondly, when considering security gains, people are more likely to accept a smaller but more certain gain than a chance at a larger gain, but when faced with security losses, people are willing to risk a larger loss rather than accept the certainty of a small loss.[37]

Conclusion

This paper started with a quote from Bruce Schneier: “Security is both a feeling and a reality. And they’re not the same”[38]. Throughout the paper, different kinds of biases that can influence the decisions people make and alter their judgements were discussed, and the implications for the field of international relations were outlined. On the basis of the analysis above, it can be said that humans are vulnerable to a wide range of biases, which in turn influence the output they produce. Consequently, this also affects the policies and actions of countries, as countries are led by people.

Hence, being aware of the existence of biases is a first step towards overcoming them. Biases are usually products of our fast, intuitive cognitive system 1. Nevertheless, by recognising situations in which biases might occur, and by being aware of the most common biases, it is possible to actively mitigate the influence that they have on our decisions. We can do this by activating our slower, more calculating cognitive system 2. In other words, we should more actively question our gut feeling in ‘suspicious’ situations.

Notes

B. Schneier, ‘The Psychology of Security’, in: S. Vaudenay (Ed.), AFRICACRYPT 2008, (Springer-Verlag, 2008), p. 50.
E. E. Duchateau-Polkerman, ‘Hoe perceptie ons veiligheidsgevoel beïnvloedt’, in: Militaire Spectator, 185 (1), (2016), p. 4.
Ministry of Foreign Affairs of the Kingdom of the Netherlands, Working Worldwide for the Security of the Netherlands: An Integrated International Security Strategy 2018-2022, (Den Haag, May 2018), p. 6.
See: link; link & link.
H. Rosling, O. Rosling & A. Rosling-Rönnlund, Factfulness: ten reasons we’re wrong about the world – and why things are better than you think, (London: Sceptre, 2019), p. 9.
Ibid.
S. Pinker, The Better Angels of Our Nature (New York: Harper Perennial, 2011).
See: link.
K. de Bruijne & E. van Veen, Pride and Prejudice: Addressing Bias in the Analysis of Political Violence (Den Haag, Netherlands Institute of International Relations Clingendael, 2017), p. 1.
Cambridge dictionary, link
B. Schneier, ‘The Psychology of Security’, in: S. Vaudenay (Ed.), AFRICACRYPT 2008, (Springer-Verlag, 2008).
D. Kahneman, Thinking, fast and slow (New York, Farrar, Straus & Giroux Inc, 2011).
First introduced by psychologists Keith Stanovich and Richard West. Used by Daniel Kahneman, a Nobel Prize-winning psychologist, in his book ‘Thinking, fast and slow’ (New York, Farrar, Straus & Giroux Inc, 2011).
E. E. Duchateau-Polkerman, ‘Hoe perceptie ons veiligheidsgevoel beïnvloedt’, in: Militaire Spectator, 185 (1), (2016), p. 13.
B. Schneier, ‘The Psychology of Security’, p. 52.
Ibid.
D. Kahneman, Thinking, fast and slow (New York, Farrar, Straus & Giroux Inc, 2011).
E.E. Duchateau-Polkerman, EMSD Thesis De perceptie van veiligheid, Hogere Defensie Vorming, 2015.
D.D.P. Johnson & D. Tierney, ‘Bad World: The Negativity Bias in International Politics’, in: International Security, 43(3), (2018), p. 120.
D. Kahneman, Thinking, fast and slow (New York, Farrar, Straus & Giroux Inc, 2011).
Ibid.
D. Kahneman, Thinking, fast and slow (New York, Farrar, Straus & Giroux Inc, 2011).
D.D.P. Johnson & D. Tierney, ‘Bad World: The Negativity Bias in International Politics’.
Baumeister et al., ‘Bad is stronger than good’; and Rozin and Royzman, ‘Negativity bias, negativity dominance, and contagion’.
D.D.P. Johnson & D. Tierney, ‘Bad World: The Negativity Bias in International Politics’.
Ibid.
D.D.P. Johnson & D. Tierney, ‘Bad World: The Negativity Bias in International Politics’.
Ibid.
D. Kahneman, Thinking, fast and slow (New York, Farrar, Straus & Giroux Inc, 2011).
D. Kahneman, Thinking, fast and slow (New York, Farrar, Straus & Giroux Inc, 2011).
B. Schneier, ‘The Psychology of Security’, in: S. Vaudenay (Ed.), AFRICACRYPT 2008, (Springer-Verlag, 2008), p. 60.
Ibid.
B. Schneier, ‘The Psychology of Security’, in: S. Vaudenay (Ed.), AFRICACRYPT 2008, (Springer-Verlag, 2008), p. 60.
Ibid.
Ibid.
B. Schneier, ‘The Psychology of Security’, in: S. Vaudenay (Ed.), AFRICACRYPT 2008, (Springer-Verlag, 2008), p. 61.
Ibid., p. 50.