7

I'm looking for a phrase or expression to describe a certain kind of cognitive bias. Specifically, I have in mind a situation where someone is convinced that something false is actually true, and this leads them to more and more unreasonable beliefs. For example, someone who is generally reasonable believes the earth is flat (say, a few centuries ago, when direct evidence was harder to come by). When presented with evidence that strongly suggests otherwise, such as visual evidence that the moon is round or the way ships disappear over the horizon, they create a web of convoluted beliefs (which may continue to grow in order to support itself) to explain these effects, because their flat-earth belief is so entrenched that it doesn't seriously occur to them that it might be false.

The closest related terms I can think of are "blind spot" and "ostrich effect." But these terms don't explicitly refer to the act of making up new narratives/beliefs in order to make observations fit your preconceptions about the universe.

Edit: In response to many of the comments and answers, let me try to clarify. It is certainly similar to this question, but what I am ideally hoping for is a phrase (verb, noun, whatever) to describe a gradual process that snowballs, whereby one false belief leads a reasonable person to construct an increasingly warped view of reality. This could be either something extreme, driving the person to a completely insane worldview, or a mild quirk, where a person just has a very flawed understanding of a specialized issue, like what is going on with a plumbing problem in their house and a nearby rascally family of gophers. The person's beliefs could be logically consistent or inconsistent, and the person might come to realize their mistake after seeing direct evidence or just accumulating enough second-hand evidence to reach a tipping point. In particular, what I'd like to describe is certainly a consequence of confirmation bias, as suggested in one of the answers, but I'd like to refer more clearly to this specific kind of process, in which confirmation bias affects one's thinking beyond the original false belief.

(Sorry I don't have a nice pithy illustrative example.)

  • My understanding of your question is different from that in the six answers I have seen. You say you want it to apply to people in the past, who disbelieved things obvious to us now and were not necessarily suffering from delusions or mental disorders of any kind. That being so, I cite the Austrian philosopher Ludwig Wittgenstein, who, to a colleague saying (one sunset) how obvious it was that they were seeing the rotation of the earth and not the sun sinking, responded: "But what would it look like, if it looked as if the sun were sinking?"
    – Tuffy
    Commented Apr 30, 2020 at 12:05
  • To clarify, are you looking for a term that describes the phenomenon of inventing new beliefs to explain away evidence which contradicts a strongly-held, established belief? That is, if I believe X, and am presented with evidence Y that contradicts X, I invent a new belief about Y that reinterprets it in some way such that it no longer contradicts X?
    – asgallant
    Commented Apr 30, 2020 at 18:55
  • Does this answer your question? A word for stretching out the facts just so they fit a theory? Commented May 1, 2020 at 18:29
  • Based on your edit, it sounds like part of what you're seeking is a phrase to characterize the consequences of an ongoing commitment to a false premise. One might consider these consequences a proliferation of errors. Taking that as a given, one might argue that the process that results in this proliferation of errors is the ongoing rationalization (@JulesCocovin) of the (evolving) circumstances around the matter in question -- whatever ongoing rationalization is necessary to preserve the belief that the false premise is true, unless and until one begins to see the light. Commented May 3, 2020 at 16:57
  • Something about "houses built on straw" might work, but even that doesn't convey the sense of your continuing web… merely the first stage. Could the reason you don't have a nice pithy illustrative example be that, broadly, there isn't one? Could that also be why your search engines had nothing useful to suggest? Commented May 3, 2020 at 19:56

16 Answers

9

If you filter information so that nothing threatens your view of the world, you may call it 'confirmation bias'. We all do it, actually.

https://en.wikipedia.org/wiki/Confirmation_bias

However, I think it would be more useful to explore the following defense mechanisms: rationalisation, wishful thinking, and denial. E.g., see here:

https://www.psychologistworld.com/freud/defence-mechanisms-list

EDIT: There's also the concept of 'sunk costs' you might want to work with.

See here: https://en.wikipedia.org/wiki/Sunk_cost

  • Thanks. Yes, this is certainly part of confirmation bias, and this answer comes close to what I'm looking for. But I'd like to more specifically describe the snowballing effect one false belief can have on our belief system. Any further suggestions?
    – Kimball
    Commented May 1, 2020 at 15:00
  • @Kimball I see. You might want to specify what part of speech you're looking for, and what changes in somebody's belief system we are talking about. Is it more like rationalising something we don't accept, or going totally off the rails mentally, or slowly slipping into madness? I guess another, more contemporary example would come in handy. Commented May 1, 2020 at 15:41
  • Yes, unfortunately I don't have any good examples in mind, at least not one that I can explain easily. I don't care about part of speech, or the degree of the effect. Anyway, I added a paragraph to try to clarify.
    – Kimball
    Commented May 1, 2020 at 22:44
  • @Kimball You have a great variety of possible answers here to choose from :) If I were to describe this process, I'd say that a person is in denial and getting more and more delusional (lexico.com/definition/delusional; Weather Vane got that idea first). Commented May 2, 2020 at 17:50
  • @Kimball modern-day flat earthers, deniers of anthropogenic climate change, and anti-vaxxers all seem to me to be examples of the phenomenon you are trying to encapsulate.
    – asgallant
    Commented May 4, 2020 at 19:23
8

Building a house upon sand may be the expression you need: a biblical metaphor, from one of Christ's parables, which has stood the test of time.

7

One word to describe it, from Lexico, is a

delusion
NOUN

1 An idiosyncratic belief or impression maintained despite being contradicted by reality or rational argument, typically as a symptom of mental disorder.
Is this for real, or just a delusion on my part?

1.1 MASS NOUN
The action of deluding or the state of being deluded.
The rest of us play along, but no one is fooled by this necessary delusion.

  • "explicitly refer to the act of making up new narratives/beliefs in order to make observations fit your preconceptions about the universe": delusion, +1. Possibly of grandeur, especially if they pontificate; otherwise it's self-delusion.
    – Mazura
    Commented May 1, 2020 at 0:25
  • “That's what I said!” – M&TV.SE
    – Mazura
    Commented May 1, 2020 at 0:38
5

Simply "false premise".

Similar to @ZachP's Latin phrase a falsis principiis proficisci.

Update (for all the down-voters who didn't care to comment):

A "false premise" is an incorrect proposition that forms the basis of an argument. Since the premise is not correct, the conclusion drawn may be in error. However, the logical validity of an argument is a function of its internal consistency, not the truth value of its premises.

False premises are the underpinning of cognitive biases, though not all false premises are used to construct a cognitive bias. As an example in the OP's context: "The group's choice to continually cling to this false premise led to the acceptance of an entire web of falsehoods."

4

https://en.wikipedia.org/wiki/Belief_perseverance

Belief perseverance (also known as conceptual conservatism[1]) is maintaining a belief despite new information that firmly contradicts it.[2] Such beliefs may even be strengthened when others attempt to present evidence debunking them, a phenomenon known as the backfire effect (compare boomerang effect).[3] For example, journalist Cari Romm, in a 2014 article in The Atlantic, describes a study in which a group of people, concerned about the side effects of flu shots, became less willing to receive them after being told that the vaccination was entirely safe.[4]

Since rationality involves conceptual flexibility,[5][6] belief perseverance is consistent with the view that human beings act at times in an irrational manner. Philosopher F.C.S. Schiller holds that belief perseverance "deserves to rank among the fundamental 'laws' of nature".[7]


Edit to Add:

https://www.scienceofrunning.com/2015/12/why-you-should-change-your-mind-power.html?v=47e5dceea252

In his new book Black Box Thinking, Matthew Syed outlines the extremes people go to in order to avoid adjusting their model. From doctors to lawyers to academics, we're all guilty of cognitive dissonance:

“When we are confronted with evidence that challenges our deeply held beliefs, we are more likely to reframe the evidence than we are to alter our beliefs. We simply invent new reasons, new justifications, new explanations. Sometimes we ignore the evidence altogether.”

  • I don't think one can be 'guilty' of cognitive dissonance. C-D is a mental state that we may try to resolve in the ways you describe, but in itself it is not a fault.
    – Jim Mack
    Commented Apr 30, 2020 at 20:08
4

Perhaps what you're looking for is the Latin phrase a falsis principiis proficisci, meaning "to proceed from false principles." This would apply when the conclusions are validly drawn (even if factually wrong), given an initial premise that is wrongly taken to be correct.

https://simple.wikipedia.org/wiki/Argument_from_false_premises

Edit: Given the OP's clarification that they want a description of the process, rather than of the condition itself, I believe "going down the rabbit hole" fits best. For example:

Once John hit upon the idea that mice were spontaneously created by piles of grain, he really went down the rabbit hole, concocting all manner of theories.

3

The phrase garbage in, garbage out—often abbreviated to GIGO—expresses the idea that if you start with bad inputs to your system, you're not going to get good outputs. While the phrase came from computer science, the above link notes that

The principle also applies more generally to all analysis and logic, in that arguments are unsound if their premises are flawed.
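As a toy illustration of the computing sense (the numbers and the sentinel convention here are hypothetical, purely for demonstration), even a perfectly correct routine produces a meaningless answer when fed meaningless data:

    # A correct mean function: the logic itself is sound.
    def mean(values):
        return sum(values) / len(values)

    # Garbage in: a sensor log where -999 marks a missing reading.
    readings = [21.5, 22.0, -999, 21.8, -999]

    # Garbage out: the sentinel values silently poison the result.
    print(mean(readings))  # -386.54, not a plausible temperature

The flaw is not in the computation but in the input, which mirrors the point about arguments: a valid inference from a flawed premise is still unsound.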

3

Something that comes close is a House of cards.

You could say your person's belief structure is built precariously, and will come tumbling down if the one weak underpinning belief is removed.

2

Given that, in this particular case, the person in question is adding increasingly convoluted modifications to an initially-simple, but wrong, model, in an effort to make it agree with evidence just as well as a simple, correct model, you might say that person is discarding Occam's razor.

After all, if they were eventually able to create a flat-earth theory that did explain all of our experimental data, it would be so convoluted compared to actual physics that anyone who believed that Occam's razor was a good rule of thumb would immediately discount the extremely complicated flat-earth model. The fact that this person would stick with their increasingly-complicated model in an effort to discount a simpler one suggests that they don't believe in the utility of Occam's razor.

1

This could be either a False Cause or an Appeal to Tradition fallacy.

False Cause: This fallacy establishes a cause/effect relationship that does not exist. There are various Latin names for various analyses of the fallacy. The two most common include these types:

(1) Non Causa Pro Causa (Literally, "Not the cause for a cause"): A general, catch-all category for mistaking a false cause of an event for the real cause.

(2) Post Hoc, Ergo Propter Hoc (Literally: "After this, therefore because of this"): This type of false cause occurs when the writer mistakenly assumes that, because the first event preceded the second event, it must mean the first event caused the later one. Sometimes it does, but sometimes it doesn't. It is the honest writer's job to establish clearly that connection rather than merely assert it exists. Example: "A black cat crossed my path at noon. An hour later, my mother had a heart-attack. Because the first event occurred earlier, it must have caused the bad luck later." This is how superstitions begin.

The most common examples are arguments that viewing a particular movie or show, or listening to a particular type of music, “caused” the listener to perform an antisocial act--to snort coke, shoot classmates, or take up a life of crime. These may be potential suspects for the cause, but the mere fact that an individual did these acts and subsequently behaved in a certain way does not yet conclusively rule out other causes. Perhaps the listener had an abusive home-life or school-life, suffered from a chemical imbalance leading to depression and paranoia, or made a bad choice in his companions. Other potential causes must be examined before asserting that only one event or circumstance alone earlier in time caused an event or behavior later. For more information, see correlation and causation.

or

Appeal to Tradition (Argumentum Ad Traditionem; aka Argumentum Ad Antiquitatem): This line of thought asserts that a premise must be true because people have always believed it or done it. For example, "We know the earth is flat because generations have thought that for centuries!" Alternatively, the appeal to tradition might conclude that the premise has always worked in the past and will thus always work in the future: “Jefferson City has kept its urban growth boundary at six miles for the past thirty years. That has been good enough for thirty years, so why should we change it now? If it ain’t broke, don’t fix it.” Such an argument is appealing in that it seems to be common sense, but it ignores important questions. Might an alternative policy work even better than the old one? Are there drawbacks to that long-standing policy? Are circumstances changing from the way they were thirty years ago? Has new evidence emerged that might throw that long-standing policy into doubt?

https://web.cn.edu/kwheeler/fallacies_list.html

1

Principle of explosion, or "from a contradiction, anything follows," or various Latin equivalents (ex falso quodlibet, ex contradictione quodlibet) all refer to this same idea in classical and related logics: quite literally, anything follows from a contradiction.
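As a minimal sketch of why this holds (a standard natural-deduction derivation, written in LaTeX; the numbering and rule names are mine, not tied to any particular textbook), an arbitrary proposition Q follows from the contradictory premises P and ¬P:

    % Ex falso quodlibet: from P and \neg P, any Q follows.
    \begin{align*}
    &1.\ P        && \text{premise} \\
    &2.\ \neg P   && \text{premise} \\
    &3.\ P \lor Q && \text{disjunction introduction, from 1} \\
    &4.\ Q        && \text{disjunctive syllogism, from 2 and 3}
    \end{align*}

Step 3 weakens P to P ∨ Q; step 4 then uses ¬P to rule out the first disjunct, leaving Q, whatever Q happens to be. That is exactly the license a contradictory belief system gives its holder: once the contradiction is accepted, any further belief can be derived.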

1

I'm not sure you need anything more than 'belief-based'. If faith provides the axiomatic base, all that builds on it is, to say the least, unproved.

1

Perhaps denier captures what you're seeking. From Cambridge:

denier: a person who says that something did not happen or that a situation does not exist, especially something that most people agree did happen or does exist: He has compared climate change deniers to people who denied the link between HIV and Aids.

A denier has an obligation to explain their denial, and that's what gives rise to the web of false beliefs/assertions -- each new belief/assertion collides with the ever-growing body of data/facts.

For example, climate change deniers must resort to all manner of explanations for what's happening with our climate that fly in the face of massive amounts of scientific evidence and an overwhelming consensus in the scientific community. The more evidence, the harder their job becomes and the larger the web they must weave.

One could say the same for those who deny the evolution of species by natural selection or believe the earth is 6000 years old.

Your example of flat-earthers is yet another example.

Whenever human knowledge advances, there will probably be deniers all too ready to resist those advances by whatever mental gymnastics necessary: the more evidence, the more gymnastics required. Alas, human, all too human.

0

What about the good old superstition?

superstition

An irrational belief that an object, action, or circumstance not logically related to a course of events influences its outcome. (American Heritage)

Clearly, a superstition would lead to trains of irrational reasoning and incorrect conclusions.

Believing in a flat earth today, with evidence and reasoning tools accessible even to schoolchildren, would qualify as superstition.

0

Great question.

There really should be a word or short expression for this. It could cover many schools of belief in human history and across cultures.

What about "argument from extrapolation"? The more you choose to extrapolate from your initial (incorrect) premise, the more convinced you become of the truth of that premise... and therefore of the structure you are extrapolating from it.

But you ask for an expression for the process itself, rather than the mental process which causes it, and particularly in the face of increasing counter-evidence. What about "rampant extrapolatory heuristics", or even "extrapolatory panic"?

-1

The closest I can think of is inspired by the masterpiece that is The Matrix...

"red-pilled" or "down the rabbit hole" ..

Morpheus: “You take the blue pill, the story ends, you wake up in your bed and believe whatever you want to believe. You take the red pill, you stay in wonderland, and I show you how deep the rabbit hole goes.” – The Matrix

https://www.vulture.com/2019/02/the-matrix-red-pill-internet-delusional-drug.html

  • You have this backwards. The red pill is reality and the blue pill is the Matrix simulation.
    – ohmu
    Commented Apr 30, 2020 at 14:04
  • Red Pill is associated with a specific Alt-Right subgroup. But "going down the rabbit hole" is a common expression and seems pretty close. It's an Alice in Wonderland reference. Commented Apr 30, 2020 at 15:09
