Photo: Lansallos beach, Cornwall © Andrew Clifton, 2017

In Praise of "Undisciplined" Scholarship

IN RECENT YEARS, I’ve encountered quite a number of popular books which seek to explain what Mikael Klintman calls “fact resistance”.[1] This term refers to the human tendency to adopt, and persist in holding, a variety of beliefs which are considered (at least, by the authors of such books) to be flatly contradicted by a host of rational arguments, based upon a tremendous weight of objectively well-established, scientifically verifiable facts.

Many of the aforementioned works, I should say, have much to recommend them – but still, I must confess I find some of them rather one-sided and narrow in their approach. By contrast, Mikael Klintman’s highly stimulating book, Knowledge Resistance, strikes me as enormously original, insightful and even-handed. Klintman notes at the outset that popular critiques of “fact resistance” typically adopt what he calls an “us v. them” perspective:

Depending on who is talking, ‘they’ can be those who have been indoctrinated into what some argue to be mediaeval reasoning. Or ‘they’ can be those who allow themselves to be deceived by liberal elitism and special interests within the scientific community.

Klintman distinguishes the notion of “fact resistance” from his own, broader concept of “knowledge resistance” and makes a convincing case that the latter is not only more balanced and even-handed in its outlook but also superior in both explanatory power and practical utility.

Whilst he explains, over the course of this book, how the idea of knowledge resistance can further our understanding of such pressing concerns as widespread opposition to consensus views in climate science and the popularity of various conspiracy theories, Klintman also makes a point of applying it, from the outset, to the kind of territorial dogmatism that often prevails in many scientific and scholarly disciplines – including his own. As he puts it:

There is an obvious irony here: scholars in disciplines that study knowledge resistance resist insights from rival disciplines concerning knowledge resistance.

Borrowing an appealing self-description from the late and very distinguished sociologist Steve Rayner, Klintman proudly identifies as an “undisciplined” scholar, arguing that:

We have to allow ourselves to gain insights from what the broadest range of human sciences have to say about knowledge resistance – not just psychology or political science or economics or evolutionary thought, but all of it.

To fully appreciate this book, I think it is important to realise that Klintman does not tend to use the term “knowledge” in the familiar, philosophically normative sense of “justified true belief”,[2] nor does he adopt the misguided position he calls ‘hardcore relativism’ (which holds, roughly, that all beliefs are ‘equally valid’). Instead, Klintman speaks of ‘knowledge’ in the neutrally descriptive sense of knowledge beliefs, which might, perhaps, have objective merit, or at least some insight to teach us, if only we looked hard enough.

For Klintman, therefore, we can be “knowledge resistant” towards beliefs that may turn out to be, as a matter of fact, objectively wrong, or at least, unjustified – whenever we reject or dismiss such beliefs, out of hand, for reasons that have nothing to do with a lack of reasonable warrant or objective validity. Such motives, as we shall see, may have a lot to do with the social value of belonging to a group which is defined, in part, by a distinctive body of “knowledge beliefs” – and also by opposition to others.

Knowledge resistance, on this view, is manifested in a wholesale dismissal of rival knowledge claims, regardless of any justification or warrant they may, or may not, have. According to Klintman, this phenomenon is quite the opposite of scepticism – properly understood as the reluctance to accept unproven claims until they have been tested, challenged, and found to be objectively robust (coupled, I should add, with a willingness to undertake such investigations and modify one’s views, depending on the outcome!). In contrast to scepticism, as Klintman puts it: “To resist knowledge is to be almost immune to evidence, arguments or the experience of others.”

One of the central arguments of Klintman’s book is that, unlike scepticism, knowledge resistance is something that almost everyone does, at least to some degree. To illustrate this point, he cites research which finds that: “Liberals and conservatives are similarly motivated to avoid exposure to one another's opinions.”[3] This was the title of a ground-breaking paper published in 2017, reporting the results of a series of experiments which show that people on opposing sides of contentious debates are averse to learning about the views of their ideological opponents. The researchers found that this lack of interest “was not due to already being informed about the other side or attributable to election fatigue. Rather, people on both sides indicated that they anticipated that hearing from the other side would induce cognitive dissonance (e.g., require effort, cause frustration).” Statistical analysis of all five studies “did not detect a difference in the intensity of liberals’ and conservatives’ desires to remain in their respective ideological bubbles.”

The principal question that arises from such findings, of course, is: why do we do this? In the course of his research for this book, Klintman sought out ideas, hypotheses and possible explanations of knowledge resistance by conducting interviews with scientists and scholars in a variety of disciplines – during which he made a point of posing the following, rather provocative question: “Tell me about the benefits of knowledge resistance?” In the remainder of this article, I’ll outline some of the answers and insights Klintman gathered from a variety of perspectives.

Knowledge Resistance as Tribal Loyalty

In 1966, the social anthropologist Mary Douglas published a classic study of religious and cultural taboos which examined some of the distinctive beliefs that characterised the major monotheistic religions in the Middle East, at the time of their founding.[4] What she sought to understand were the origins of such taboos as the belief in the spiritual impurity of certain animals; the wickedness of creating and worshipping ‘idols’, or physical representations of the divine; and the sinfulness of such seemingly innocuous actions as planting trees of different species close together, or wearing cloth made of both wool and linen.

What she discovered was that these beliefs have one key feature in common: they sharply contrast with, and denigrate, the customs, beliefs and spiritual practices both of previous religious cults and of the monotheists’ nearby, polytheistic neighbours. In other words, it appeared that these taboos and prohibitions had a powerful social function, which is neatly captured in one of Klintman’s chapter titles: If you’re with us, don’t believe them.

As Klintman explains, this social motive for knowledge resistance extends far beyond the realm of religious belief. Indeed, one of the most striking comments that he quotes from his variously specialised interviewees comes not from a sociologist, psychologist or political scientist, but rather from an AI researcher named Allan Dafoe; and yet, his point is very general indeed:

Your refusal to say that the emperor has no clothes is a signal that you belong to the group.

Another observation by Dafoe expands on this idea:

A lot of beliefs are not epistemic beliefs. A lot of beliefs are loyalty statements. To most humans, it doesn’t matter to their daily lives what the truth value is of many big claims about the world. What does matter is that their group welcomes them and feels like they are a loyal member.

To me, this notion not only explains the well-attested futility of attempting to persuade profoundly knowledge-resistant people by providing them with more and more arguments and ‘facts’; it also suggests plausible reasons why cultish groups tend to adopt and promote ideas which almost everyone outside the group finds totally absurd and incoherent – thus inclining outsiders to view such group members as deluded idiots.

For the group and its members, however, this negative attitude from outsiders isn’t a bug, it’s a feature. To publicly declare one’s commitment to the view, for example, that the Earth is flat, or that the moon landings were faked, or that all the world’s governments are controlled by shape-shifting lizard people, or indeed (most darkly of all), that the Holocaust never happened, is to take a position which sharply isolates you from outsiders – and for this very reason, is warmly welcomed by your co-believers as a badge of loyalty.[5] As Klintman puts it:

The more the knowledge beliefs of one group deviate from those of another group, the stronger the social cohesion will be within each group. In other words, the more outrageous and unsubstantiated a knowledge claim is that members of a particular community hold, the more profound loyalty the members show the others in that community.

Klintman is keen to point out that this kind of dynamic is not confined to the thinking of people whose beliefs radically defy mainstream scientific opinion. On the contrary, it is frequently manifested in the innumerable interdisciplinary rivalries and disputes that plague the halls of academe.

The Evolution of Knowledge Resistance

Klintman argues that our tendency to adopt and reject knowledge beliefs for reasons of social loyalty may be explained, at least in part, by our evolutionary development as social animals. He points out that for 99.5% of our history, members of the genus Homo have lived in small, tribal groups of hunter-gatherers, profoundly dependent upon one another for their survival. Co-operation was essential to every aspect of our lives, from hunting and gathering food to childcare and mutual protection from violent attacks by rival groups. Indeed, encounters with outsiders were so inherently dangerous that, according to some estimates, males in hunter-gatherer societies had at least a one in four chance of violent death.[6] For all of these reasons, as Klintman suggests:

People who had a higher ability to distinguish their in-group (‘us’) from out-groups (‘them’) would have been better adapted to survive and reproduce. It was also adaptive to be seen by one’s in-group as someone who really belonged.

A crucial step in Klintman’s argument is that, for the foregoing reasons, it would also be an advantage to develop a capacity for self-deception – unconsciously accepting and adopting tribal beliefs as our own, rather than merely ‘playing along’ and professing them, dishonestly, for reasons of consciously calculated self-interest. He refers to the observation, by psychologists, that body language and other nonverbal cues provide such tell-tale signs of insincerity that, in order to reliably signal our in-group bona fides, it’s a whole lot safer if we have a strong tendency to fool ourselves.[7]

Obviously, the vast majority of modern humans no longer live in close-knit, hunter-gatherer societies, but the legacy of this very long span of recent evolutionary history is still with us. Abundant evidence for this is afforded by our increasing understanding of the harmful effects of loneliness and social isolation upon physical and mental health – which makes perfect sense, in view of the natural anxiety and stress with which our ancestors would have viewed the prospect, or experience, of being excluded from the group.[8]

Notwithstanding these powerful arguments, Klintman is keen to point out that our natural human tendency to devolve into mutually knowledge-resistant tribes is neither an irresistible force nor any sort of moral imperative; on the contrary, it’s always possible to see beyond our tribal boundaries. As an example of this, he quotes with heartfelt approval from an individual he describes as “a leader of one group of tribes”:

If science proves some belief of Buddhism wrong, then Buddhism will have to change. In my view, science and Buddhism share a search for the truth and for understanding reality.

Tenzin Gyatso, 14th Dalai Lama, New York Times, 12 November 2005.

TO BE CONTINUED…

The second part of this review will follow shortly.

NOTES & REFERENCES


  1. See: Gilovich, T. (1993) How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life; Shermer, M. (2007) Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time; Specter, M. (2009) Denialism: How Irrational Thinking Hinders Scientific Progress, Harms the Planet, and Threatens Our Lives; Bardon, A. (2019) The Truth About Denial: Bias and Self-Deception in Science, Politics, and Religion; McIntyre, L. (2019) The Scientific Attitude.

  2. This notion is commonly attributed to Plato, but it has only been widely discussed by philosophers since the publication of a very short paper in 1963 by Edmund Gettier, which appears to refute it! Of the various attempts to defeat Gettier’s argument, one of the strongest, in my view, is Ernest Sosa’s theory of “Apt Belief”. See: Gettier, E. (1963) Is Justified True Belief Knowledge? Analysis 23 (6): 121–123; Sosa, E. (2009) A Virtue Epistemology: Apt Belief and Reflective Knowledge, Volume I; Sosa, E. (2011) Reflective Knowledge: Apt Belief and Reflective Knowledge, Volume II.

  3. Frimer, J. A., Skitka, L. J., & Motyl, M. (2017). Liberals and conservatives are similarly motivated to avoid exposure to one another’s opinions. Journal of Experimental Social Psychology, 72, 1–12.

  4. Douglas, M. (1966) Purity and Danger: An Analysis of Concepts of Pollution and Taboo.

  5. Simler, K. and Hanson, R. (2018) The Elephant in the Brain: Hidden Motives in Everyday Life.

  6. LeBlanc, S. A. (2003) Constant Battles: The Myth of the Peaceful, Noble Savage.

  7. von Hippel, W. and Trivers, R. (2011) The Evolution and Psychology of Self-Deception. Behavioral and Brain Sciences 34 (1): 1–16.