At least since Socrates, philosophers have been thinking about what, if anything, we can know, or at least reasonably believe. But, on the surface at least, there seems to be very limited agreement about the upshot of this long philosophical conversation.

Some say that in science we have a method of determining what it is reasonable to believe. Science is set up to give beliefs maximal exposure to potentially hostile evidence. In that environment, theories have to be very fit to win the Darwinian struggle to survive. But sometimes, as in the current American debate on Creationism, a challenge is mounted to the scientific method of empirical testing itself. Those who think the Bible is the Word of God can see it as trumping mere human empirical observation. Others point to the philosophical assumptions on which the scientific method rests: the (general) reliability of our senses, the legitimacy of extrapolating from past experiences to universal generalizations also covering the future, and so on. They point to the relative lack of agreement among philosophers as to how to rebut these sceptical challenges to the assumptions underlying science. Sometimes they go on to claim that science is not the means of adjudicating between beliefs, but just another ideology, the local belief system of the Western world.

Pessimism about resolving the deepest differences of belief can, at one extreme, come from the lack of consensus on how to meet these philosophical challenges. At the other extreme, it can come from the brute empirical fact that many people hold their religious or political ideologies with dogmatic rigidity, preferring to execute sceptics rather than to argue with them. Somewhere in between these extremes is another reason for scepticism that I want to focus on here. This does not appeal to the general difficulty of meeting philosophical doubts about basic assumptions, nor to the brute fact that there exist many dogmatic fanatics. Instead it appeals to what may be a weakness of the human mind, on display when those with different systems of belief do try to argue with each other. The weakness is manifested in what can be called “structural distortions” and is linked with the holistic nature of belief systems. It contrasts with other cognitive limitations or distortions that can be called “local”.


A lot of work in cognitive science teases out distortions in the way we think: the valid inferences we find counterintuitive and the fallacies we find seductive. Some distortions affect how we assess evidence. We are too influenced by the salient case involving someone famous or someone we know. The pull of the first case we come across can distort interpretation of later cases. We accept evidence fitting our expectations more readily than evidence telling against them. Stereotypes can influence us more than probabilities. In guessing whether someone described is a farmer or a librarian, we give more weight to whether the description fits our picture of a librarian than to the fact that farmers vastly outnumber librarians. And so on. i
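The farmer/librarian case turns on base rates, and the structure of the mistake can be made precise with Bayes' rule. The following sketch uses purely hypothetical numbers (a description nine times likelier for a librarian, farmers outnumbering librarians twenty to one), chosen only to show how the base rate can outweigh the fit of the stereotype:

```python
# Hypothetical illustration of base-rate neglect.
# Even if a description fits a librarian far better, the sheer number
# of farmers can still make "farmer" the more probable answer.

def posterior_librarian(p_desc_given_lib, p_desc_given_farm,
                        lib_count, farm_count):
    """Bayes' rule: P(librarian | description)."""
    total = lib_count + farm_count
    prior_lib = lib_count / total
    prior_farm = farm_count / total
    joint_lib = p_desc_given_lib * prior_lib
    joint_farm = p_desc_given_farm * prior_farm
    return joint_lib / (joint_lib + joint_farm)

# Hypothetical numbers: description is 9x likelier for a librarian,
# but farmers outnumber librarians 20 to 1.
p = posterior_librarian(0.9, 0.1, lib_count=1, farm_count=20)
print(round(p, 2))  # ≈ 0.31: despite the stereotype, probably a farmer
```

The intuitive judgment attends only to the likelihoods (0.9 versus 0.1) and ignores the priors; the calculation shows why that is a distortion.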

Once we know about them, we can start to correct for these local cognitive distortions in the way we correct for perceptual distortions. But structural distortions are less easy to deal with. To understand what they are requires looking at how beliefs form systems in which they interact with each other in a holistic way.


When things do not turn out as our beliefs led us to expect, this requires us to change our mind about something. But, as it is platitudinous to point out, we have a degree of free play about which beliefs to question or change. I go to the doctor, expecting to be prescribed medicine that will cure my illness. I take the medicine but do not get better. When my expectation turns out to be mistaken, what change in my beliefs should follow?

I may decide that the doctor is less good than I thought. But there is a range of other options. Perhaps I should revise my belief in the competence of the pharmacist, who may have made up the prescription wrongly. Or perhaps I have a peculiar chemistry, so that medicine effective for others does not cure me. Other revisions of belief have wider implications. Perhaps I should lower my estimate, not just of this doctor, but of General Practitioners as a whole? Or perhaps I should think less of the medical profession in general? Or perhaps I should give up my belief in the whole of Western medicine and the scientific method on which it is based?

No doubt some of these responses are more reasonable than others. But, because our beliefs hang together in a system, I have considerable choice about what to revise. This holism gives great scope to lawyers, for instance when defending a doctor against accusations of negligence or incompetence.

It also gives great scope to people who want to defend the central tenets of a belief system. Any belief can be saved from refutation if you make enough adjustments in the rest of the system. The flat earth can be defended if you postulate that satellite photographs are affected by systematic distortions of light, that people have hallucinations about ships disappearing over the horizon, etc. Philip Gosse’s strategy for defending Creationism against the fossil evidence for evolution was to suggest that God had planted the fossils to look as if evolution had occurred in order to test our faith. This is a typical “framework distortion”. There is no local cognitive failure, such as giving too much relative importance to larger fossils, but rather the pull of a commitment that systematically distorts the framework for interpreting whatever evidence comes in.

With Gosse’s strategy the absurdity is obvious. But moves of this kind seem all too easy in defending ideological commitments. If you want to keep one point fixed (such as the infallibility of the Pope or of the Koran) all you have to do is adjust everything else to fit.

In 1939, there was a debate in the Central Committee of the British Communist Party. Many members had joined the Party because they saw it as the group most determined in opposition to Nazism. The Hitler-Stalin pact had been extremely difficult for them to swallow and they had defended it as an attempt to buy time against the Nazi threat. But, in the wake of the pact, orders came from Moscow that the Party was to stop supporting the war. They were to treat Nazi Germany and those it had attacked as morally equivalent, and even to work for Britain’s military defeat by the Nazis. The debate in which they tried to deal with the cognitive dissonance this created is available verbatim, as the room was bugged both by Moscow and by MI5. ii

Some at first rejected the new line (though in a few weeks they all came round to it). Those who supported the new line did so against all their previous anti-Nazi convictions. The debate makes clear that, for them, the fixed point they had to preserve was faith in the rightness of the Soviet Union. They adjusted everything else to fit this.

It would be easier to accept the new Soviet line if democratic countries and Nazi Germany were not importantly different. (Palme Dutt asserted this and Dave Springhall said there was “little to choose between Hitler and Chamberlain”. Idris Cox said British colonial oppression was worse than Nazi concentration camps.)

The Soviet line would be easier to accept if Germany were too weak to be much of a threat. This adjustment must have been hard to make. In the previous three years, Hitler had occupied the Rhineland, taken over Austria, faced down Britain and France to take part of Czechoslovakia, and successfully invaded Poland. (But Palme Dutt said that Germany had been forced to abandon its leadership offensive against the Soviet Union, and William Rust said that “the plain fact is not the power of Germany but the weakness of Germany”.)

Another adjustment that would ease the acceptance of the Soviet line would be if Britain and France were more dangerous aggressors than Nazi Germany. (Idris Cox said that British imperialism was the main danger to the Soviet Union. Palme Dutt said that Nazi Germany was so weak that it was “desperately searching” for peace and that the real fear of Britain and France was “how not to defeat Germany too severely, in such a way as to let loose the… real socialist revolution in Germany”.)

One question is whether the holism of a set of beliefs has the depressing result that rational discussion has no means of creating agreement or even of weakening anyone’s ideological beliefs. Are there ways of showing that, at a certain point, “adjusting everything else” becomes irrational? Can we recognise when the boundary between rational and irrational adjustment has been crossed?


The delusions that occur in psychiatric disorders are relatively clear cases of irrational beliefs. Although most such delusions are false, they do not have to be. Perhaps some paranoid people really are persecuted. What is crucial is that the beliefs are arrived at and maintained by processes that would generate truth only by chance. (Sometimes the major causal explanation may be at the neurological or pharmacological level. But here my concern is psychological: with the changes in thinking rather than in the dopamine level.) 

One psychological account refers to “poor reality testing”. The local distortions that cognitive psychology says distort “normal” assessment of evidence may be greatly magnified. On another account, delusions are a rational attempt to make sense of experiences generated by some neurological or chemical failure. The “voices” people hear are said to result from a breakdown of mechanisms for distinguishing real from imagined sounds. People then ascribe to external speakers sounds actually only imagined. A breakdown in the mechanisms monitoring our actions or thoughts may remove the feeling of agency. This may be rationalised as our actions or thoughts being controlled by someone else. Similarly with Capgras’ delusion: the belief that someone (often a spouse or partner) has been replaced by an identical impostor. There may be a breakdown in the usual emotional response to seeing someone you care about. What you see says it is your wife or partner. But that does not feel right as you have no warm response. This conflict is rationalised by the thought about the identical impostor.

Neither cognitive distortions nor the idea of a rational response to unusual experiences accounts for delusions often being so bizarre. The person who thinks Prince Charles is invading his mind is not just weak at assessing evidence. If I usually warm to you but do not when I see you today, there are many possible explanations. But one I will not be tempted by is the identical impostor. As a convincing story, it ranks well below “the dog ate my homework”. Capgras’ delusion involves not only the loss of an emotional response. Like many delusions it also violates the normal constraints of plausibility.


Although the causes are different, the ideological cases quoted and the psychiatric cases do have in common this failure of the normal sense of plausibility. There is some such failure behind the belief that someone I am close to has been replaced by an identical impostor. But something has also gone wrong when someone thinks that God planted the fossils to test our faith, or that in 1939 a weak Germany was desperately searching for peace. Both the psychiatric and the ideological cases contrast with the constraints normally placed on holistic adjustments by an embedded sense of plausibility. These plausibility constraints are given by the core structure of a belief system. This core structure takes two forms. There are framework principles and there is the “anchoring context”. Structural distortions can affect either or both.


What marks off framework principles from other beliefs? A first rough approximation is to say that they are the ones that regulate how the rest of the system is held together. 

Some lay down rules for the methods by which other beliefs should be assessed. These can include rules of logic or scientific norms governing the testing of theories and assessment of evidence. Such principles may be advocated consciously (for instance by logicians or philosophers of science). Or they may be unconsciously and intuitively adopted (for instance by bright argumentative children, or by good but un-philosophical scientists). They are like the load-bearing walls of a house. Change these and the whole system is likely to be affected, or even to collapse. The debate between Darwinism and Creationism is utterly different according to whether the regulative principle adopted requires consistency with the word of God or admits only empirical findings as relevant.

Other framework principles assign a special status to the “axioms” of a belief system. These axioms are adopted or accepted as not being open to challenge. They may be the central beliefs of a religion, such as belief in the existence of God, the belief that we know God’s commands, or the belief that we should obey them. For others, they may be the fundamental tenets of a philosophical or political or psychological system such as dialectical materialism or psychoanalysis. The axioms often have empirical or metaphysical content, but they also have a methodological or regulatory function. They are a kind of Supreme Court or “Grundnorm” in a belief system. In cases of conflict with them, other beliefs have to give way. A striking instance of a framework principle about such axioms is Cardinal Newman’s claim about his religious beliefs that “ten thousand difficulties do not make one doubt”. iii


Each person, after an early age, has a large number of beliefs that together form a context we take for granted. This context is presupposed when someone decides to accept or reject other claims. Most of us share a large number of these embedded beliefs, often without being able to list them or even being conscious of them. I may not notice my belief that cows cannot fly until I have to assess the reliability of someone who reports seeing cows circling above Trafalgar Square. Some of our embedded beliefs are like “cows cannot fly”: beliefs of a very general sort about what the world is like. Others are more particular, so entangled with our personal life that we cannot seriously suppose we are wrong about them, for instance what we believe about our own name, or where we live, or the language we are speaking.

Wittgenstein, who drew attention to both these types of deeply embedded beliefs, thought that we cannot seriously doubt them: “The truth of certain empirical propositions belongs to our frame of reference” and sometimes such a belief “is anchored in all my questions and answers, so anchored that I cannot touch it”.iv He pointed out that, if people make claims that go against these embedded beliefs, rather than accept that they believe what they seem to assert, we may wonder if they understand what they are saying.

Similarly, writing about someone who denies a true belief inextricably entangled with his personal life, he says, “If my friend were to imagine one day that he had been living for a long time past in such and such a place, etc. etc., I should not call this a mistake, but rather a mental disturbance, perhaps a transient one.”v

There is obviously truth in Wittgenstein’s idea of the anchoring context. But there is a real issue about how confident we are entitled to be about particular beliefs that seem to be part of it. This is shown by a belief that Wittgenstein cited as part of it: “If we are thinking in our system, then it is certain that no-one has ever been on the moon. Not merely is nothing of the sort ever seriously reported to us by reasonable people, but our whole system of physics forbids us to believe it. For this demands answers to the questions, “How did he overcome the force of gravity?” “How could he live without an atmosphere?” and a thousand others which could not be answered.”vi This was in 1950, only nineteen years before the moon landing. Perhaps our thoughts do make sense only against a backdrop of presupposed beliefs. But the history of this presupposition suggests we cannot assume that any particular one is “so anchored that I cannot touch it”.


These two kinds of beliefs that provide plausibility constraints (framework principles and beliefs so deeply embedded that they provide the anchoring context) need not be sharply marked off from other beliefs. How deeply beliefs are embedded is likely to be a matter of degree. And norms governing the assessment of beliefs may vary from some that are obviously part of the “framework”, such as principles of logic or statistics, to borderline cases such as “take account of demeanour and body language when assessing the honesty of the person giving evidence”. We may not need to invoke the kind of chasm that once was thought to hold between analytic and synthetic statements. The more appropriate model may be of being closer to one end or the other of a continuum.

This suggests that, while some distortions of thinking are clearly local and some are clearly structural distortions, others may be borderline cases. A structural distortion involves the pull of a commitment that systematically distorts the interpretation of evidence. There are degrees to which a distortion is systematic. Cognitive psychologists say it is normal to have “confirmation bias”: greater readiness to accept evidence that fits with rather than goes against what we already believe. Often this may operate rather weakly in defence of some belief marginal in a person’s thinking. My reluctance to accept that parts of Scotland are as warm in winter as the South coast of England is too weak and too marginal in my thinking to count as a framework distortion. But the case of the British Communist Party in 1939 and the case of the response to the fossil evidence for evolution are striking just because the degree of commitment gave rise to a distorting pull both unusually powerful and unusually systematic.


It is natural to think of some ideological differences as being linked to people having different conceptual schemes. Some belief systems involve the use of categories and concepts that do not figure in others. The use of words and phrases like “blasphemy”, “treason”, “reactionary”, “miscegenation”, “un-natural vice”, “homophobic”, “mortal sin”, “un-American”, “infidel” or “Aryan” (not in quotation marks or to report on the views and practices of others) reveals a glimpse of a belief system. Oscar Wilde intentionally signalled something about his own beliefs when he answered a question by saying that “patriotism” was not one of his words.

Donald Davidson famously argued against the existence of different conceptual schemes. He followed Quine in saying that translation and the ascription of belief are interwoven. If we can say anything about a group’s beliefs and concepts, their language or conceptual scheme and ours cannot be so radically different that nothing said in one can be translated into the other. This does not conflict with the more modest account of rival conceptual schemes needed here. Users of the term “un-natural vice” have a different conceptual scheme from users of the term “homophobic”. But this is a local phenomenon. In this area, they use different concepts in formulating their beliefs.

Even these local conceptual rivalries may presuppose a “dualism of conceptual scheme and empirical content” to which Davidson objected.vii A fundamentalist Christian Minister walks in on a scene of gay or lesbian sex. As he recounts it, “I saw a scene of the most perverted un-natural vice and expressed God’s wrath, calling down the punishments promised in the Holy Bible for such sins”. As they recount it, “This religious maniac came in and started shouting a disgusting homophobic rant”. It is the same scene, so the descriptions are trying to capture the same empirical content. But they use rival concepts to structure or organize that content.

Davidson disputed the metaphor of “organizing”. To organize something (such as a cupboard) you have to organize its contents or components. He claims that, to say two descriptions have the same content but organize it differently, there must be some shared vocabulary for the components being organized. This is right. Some vocabulary is shared by the Minister and by the gay lovers. They can agree about some of the actions involved (shouting, and perhaps denunciation), but these are organized in terms of the rival concepts of expressing God’s wrath and homophobic rant. The defensible version of there being rival conceptual schemes accepts some shared concepts and beliefs as the background context. The failures of translation are local rather than global.

Perhaps there is no sharp line between different beliefs and different conceptual frameworks. Here I will keep to the terminology of rival frameworks, leaving open the question of the relative contribution to them of different beliefs and different concepts.


Karl Popper argued for the possibility of fruitful dialogue between people with very different frameworks. He attacked what he called “the myth of the framework”, which he stated in a sentence: “A rational and fruitful discussion is impossible unless the participants share a common framework of basic assumptions or, at least, unless they have agreed on such a framework for the purpose of the discussion.” His attack on it was fierce, partly because of his moral revulsion against the consequences of creating obstacles to dialogue. For him, the myth of the framework was “not only a false statement, but also a vicious statement, which, if widely believed, must undermine the unity of mankind, and so must greatly increase the likelihood of violence and of war.”viii

It is clear that Popper’s “frameworks” are not meant to be schemes of concepts so utterly different as to raise philosophical doubts about translation. He briefly alludes to questions about that, but makes it clear that he is interested in differences in belief systems that can exist between speakers of the same language: “I mean by “framework” here a set of basic assumptions, or fundamental principles –that is to say, an intellectual framework.” It is clear that he assumes that those with different frameworks can share enough concepts to be able to talk to each other: “A discussion between people who share many views is unlikely to be fruitful, even though it may be pleasant; while a discussion between vastly different frameworks can be extremely fruitful, even though it may sometimes be extremely difficult.”ix

Popper’s optimism about dialogue between adherents of different frameworks is heartening at a time when many worry about a “clash of civilizations”. He argued that our Western civilization had its origin in such conflicts between frameworks. The clashes between Greek civilization and Egyptian, Persian, and other civilizations made the Greeks aware of the fallibility of beliefs. Popper thought this led them to give up teaching systems of belief as dogma. Instead, they developed the scientific tradition of exposing beliefs to critical discussion. I do not know how far the evidence supports Popper’s conjecture, but it is encouraging that ideological conflicts can create opportunities as well as dangers.

Dialogue is often possible and it can be fruitful. It becomes easier if, as usually happens, the participants share some platitudes of rationality as framework principles. Adjust your degree of belief in a statement according to the strength of the reasons supporting it. Where there are no good reasons, withhold belief. And so on.

But one of the major obstacles to agreement comes from the holistic nature of belief systems. In defence of the fixed points of their system, people can reject or ignore even the platitudes of rationality. Bertrand Russell had fun with this: “I wish to propose… a doctrine which may… appear wildly paradoxical and subversive… that it is undesirable to believe a proposition when there is no ground whatever for supposing it true. I must of course admit that if such an opinion became common it would completely transform our social life and our political system… I am also aware that it would tend to diminish the incomes of clairvoyants, bookmakers, bishops and others who live on the irrational hopes of those who have done nothing to deserve good fortune here or hereafter.”x

People not only stubbornly cling to beliefs without supporting evidence. They also change the framework principles to make this seem acceptable or even admirable. Some religious believers say, “I believe it on faith”, as if faith were not just belief without evidence but an alternative method of finding things out. The story of Doubting Thomas, rebuked for not having believed in the resurrection until he saw Christ, can be seen as part of the propaganda for the view that it is even better to believe without evidence.

Popper may be right in his optimism about the potential of dialogue. But, because of the role of framework principles in ideological clashes, the dialogue will unavoidably become entangled in philosophy. Which framework principles should be adopted? This question is a large part of the subject matter of epistemology.