The Easiest Person to Fool

mockingbird

"The first principle is that you must not fool yourself, and you are the easiest person to fool."
― Richard Feynman, American theoretical physicist, from the lecture "What Is and What Should Be the Role of Scientific Culture in Modern Society," given at the Galileo Symposium in Italy (1964)

“There are two ways to be fooled. One is to believe what isn't true; the other is to refuse to believe what is true.”
― Soren Kierkegaard

I was in Scientology for twenty-five years and left about five years ago. In the process of leaving Scientology I discovered that virtually every important idea I had come to accept and believe from Scientology was false. EVERY one.

Now, if you have never been in a group like Scientology, that might not convey how extreme a change that is. In its recruitment and indoctrination process, a group like Scientology undermines and destabilizes your earlier beliefs and identity so that they can be rejected and devalued, and a new set of beliefs and values, even a new identity, can replace them. In a cult this new identity is a sort of copy or clone of the leader or founder; in Scientology that was Ron Hubbard. The process has greatly varying degrees of success and is never exactly the same twice.

In my case it was successful enough that, in discovering that Hubbard was a pathological liar and Scientology itself a harmful fraud, I ended up with very little left that I believed in. Scientology serves as religion, philosophy, science, and many other subjects for those who embrace it wholeheartedly, as I did. Suffice it to say there was very little it didn't cover.

So, when I rejected it and realized just how wrong I had been, it left me with the obvious question: how could I, or anyone, be so thoroughly fooled? Especially about something I thought I had carefully and thoroughly examined for literally thousands of hours over decades?

Kind of an important question. So, among the other things I tried to learn about and understand in the years since I left Scientology, I looked for knowledge about how we come to believe things, and how we can make errors in our reasoning that leave us "oh so wrong" yet very certain we are "oh so right." I tried to learn what I did that contributed to my ending up "about as wrong as a person can be" for decades while steadfast in my confidence that I was "without a doubt" right.

I want to look at how any of us can end up on the wrong side of an issue, believe something we shouldn't, and keep on believing it when we really should be using better reasoning.

I have examined things like logical fallacies and cognitive biases in the past and recommend everyone dig into those. I have also looked at rhetoric and propaganda techniques and feel the same way about them. They are essential for understanding influence and human behavior.

Here I want to focus on some specific ideas and look at how they get and keep us on the wrong track, using poor reasoning when better thinking is possible and needed. If I had thoroughly understood these ideas, and many others I have learned since leaving Scientology, I am confident I would not have been a candidate for recruitment by Scientology: it would have been obvious to me that it is nothing like the group it claims to be.


Scientology really encourages adopting certain ways of thinking, feeling and behaving. It encourages a mindset and attitude that are not conducive to independent and critical thinking.



The flaws that Scientology lays down as bad habits in members serve as examples of how these same habits are equally bad reasoning in other contexts. What makes Scientologists into deluded dupes also makes people in other situations use poor judgment and reasoning, and leaves them open to creating, holding, defending and retaining ideas that are not well thought out. Sometimes those ideas will be correct and sometimes they will be wrong, but the reasoning behind them will be poor and the confidence in them may be high. Not a good combination.


What exactly am I talking about? Habits in thought, speech and behavior.

Let me start with some examples.

I found a great list at the Philosophy Courses website of Professor Matt McCormick: Biases, Fallacies, and Errors in Reasoning.

"Going Nuclear is the mistake of artificially elevating the standards of proof against an opposing position and arguing that it is unjustified unless it is known with absolute, deductive, or excessive levels of certainty, particularly when one cannot meet those standards oneself, or those standards are not met for many other claims that we take to be justified. A Congressman is skeptical about the evidence for global warming. He hears many highly qualified scientists review the evidence. They explain that there is a widespread consensus among the best experts in the scientific field that global warming is happening. But the Congressman resists, saying, “Well, perhaps, but in the end is it really possible to prove it? I mean, can you prove with absolute certainty that it is happening? Especially when you might possibly be wrong?” Meanwhile, the Congressman’s doctor tells him that the test results indicate a good chance that he has a bacterial infection, so the Congressman agrees to take antibiotics, even though the evidence for the diagnosis is preliminary. Specifically, the Going Nuclear mistake is elevating the standards of proof to a level that cannot be satisfied by any amount of evidence. It invokes global or extreme skepticism in order to refuse a conclusion. Motivated Reasoning, as we have explained in here, is applying different standards of proof to evidence that is for or against a view you already hold. "

This is one of my favorites. It is easy to see: no evidence is good enough for a position you oppose. Video, studies by experts, scientific consensus, eyewitnesses, mountains of physical evidence: none of it matters.

Someone says a politician you support is a sex criminal. Numerous women making accusations, video and audio of the politician bragging about being a sexual predator and assaulting women, numerous supporting witnesses, incidents of infidelity and affairs, and hundreds of incidents of insulting women do not matter.

Scientology encourages, even demands through many methods, that you see Scientology doctrine as infallible and sacred and Scientology founder Ron Hubbard as impeccable in character and accomplishment, or there is no place for you in Scientology.

When someone online says "prove it to me" about an idea they are adamantly rejecting, it is an example of going nuclear. If they won't examine or consider any evidence, arguments or ideas that could contradict the idea they are defending, then there is no proving anything to them except what they already know. If they won't listen, and they "know" they don't need to read anything that disagrees with them because it must be wrong simply because it disagrees, then they have already closed their mind, whether they know it or not.

Another quote from Matt McCormick is useful:

"Non-disconfirmable hypotheses: A thinker is advocating a non-disconfirmable hypothesis when there are no circumstances, no evidence, no arguments, or no scenarios, even hypothetically, under which she would acknowledge that it has been disconfirmed or that it is false." End quote

You need to recognize this. Sometimes you can ask "what would convince you your idea is wrong?" and if they say "nothing," then you know what is going on. Sometimes they will treat a standard that is not always reliable as if it were the only standard.

They might act as if they need a specific behavior or admission from someone, one that is almost impossible to get, before they will consider something that is quite plausible without that exact and difficult-to-obtain proof.

We all can be guilty of this. Scientology has no monopoly on poor reason.

But it is hard to realize that I am closed-minded while pretending to be reasonable and open-minded. Scientology successfully instilled that attitude and approach in me, and a series of fortunate events over decades incrementally freed me up enough to reconsider the underlying assumptions I had been following routinely.

(I described the whole process in extreme detail in the blog post Getting Into and Getting Out of Scientology - The Lies That Bind, also available at Mockingbird's Nest blog on Scientology.)

It is a great exercise to think of times that real people have "gone nuclear." They can be famous or not, as long as they actually did this. They can be of any political stripe, any religion or none; they can be smart or dumb, rude or polite; and they can be right or wrong. The approach and mindset are the key, nothing else. In trying to understand fallacies and biases, I have found that finding any examples at all is the best way to start. Finding examples among people you disagree with comes easily, as it does for most people when thinking of errors in reason; moving on to people you agree with is harder; and finding times I myself used the error is hardest of all, but crucial for dealing with it.

You cannot deal with errors in reasoning by just acknowledging them in other people. If everyone sees it as the other person's problem, then no one will do anything but point fingers.

Scientology gave me the gift of realizing that I had made tremendous, unimaginable errors in reasoning, had done so for decades, and had supreme confidence the whole time that I was totally right.

Many people never get to realize they followed a false prophet and a fraudulent prophecy. I did.

I want to stick with the errors and expand on them.

More from Matt McCormick:

"Motivated Reasoning: People criticize preference inconsistent information with excessive skepticism, while lowering those critical standards for information that would corroborate favored beliefs. That is, they are more critical and demand a higher level of evidence concerning conclusions that conflict with things they already believe, and they are less thoughtful or skeptical when evidence supports their favored views. A prior held belief steers the search for and analysis of information rather than an unbiased gathering and evaluation of evidence leading to the most reasonable conclusion. Motivated reasoning may or may not commit confirmation bias as well. Motivated reasoning is reasoning that is directed at achieving a particular conclusion, no matter what the truth or the evidence indicates. Confirmation bias is a particular kind of filtering of the evidence that leads to a conclusion."

"Confirmation Bias is the mistake of looking for evidence that confirms a favored conclusion while neglecting or ignoring evidence that would disprove it. Susan reads her horoscope and it tells her that Virgos are outgoing. She believes in astrology, and she searches her memory for times when she’s been outgoing. She thinks of a few cases where it seemed to be accurate and concludes that astrology works. Juan thinks that he has prescient dreams, or dreams that tell the future. Out of the thousands and thousands of dreams he’s had over the years, he easily recalls the one or two times that he dreamt about some event and then the next day it seemed to happen. He fails to notice that the vast majority of his dreams did not work out like this. Those thousands of other dreams are easily forgotten and neglected."


"The Umpire Effect is a particular form of motivated reasoning. Enthusiastic fans of sports teams tend to find fault in referee calls that go against their team and accuse the umpire of bias, but they will be more satisfied with referee calls that favor their team. Conservatives often argue that the news media is too liberal, while liberals insist that the news media is too conservative."



"The Sliding Scale Fallacy: Motivated reasoners will apply a more skeptical, critical set of standards against the evidence for views that they oppose, while letting mistakes, sloppiness, or a failure to meet those same standards go in cases where evidence or arguments are being presented in favor of conclusions they favor. See Motivated reasoning."

The last few items correspond well to cognitive dissonance theory. Cognitive dissonance theory is a subject in its own right; it is far more than a definition, a paragraph, or even a book could fully describe. I recommend the book A Theory of Cognitive Dissonance by Leon Festinger for anyone who really wants to understand human thought and behavior. I also wrote a series on cognitive dissonance theory here at Mockingbird's Nest blog on Scientology entitled Scientology And Cognitive Dissonance Theory.

Motivated reasoning is a huge factor in being biased: applying unfair standards to evidence while thinking that you are being reasonable and logical.

How many times have we accepted as true an article or meme that agrees with what we already believe or want to believe? And how many times have we doubted or outright dismissed the articles and memes that disagree with our beliefs or preferences?

It is human nature. It takes a lot of hard work, practice and persistence to spot this in ourselves even a little bit.

We are usually carried along by deeply held, unexamined assumptions, seeking to feel good, or at least content, as we find evidence that our beliefs are correct, and we are often uncomfortable, confused or disoriented when we find evidence against our beliefs and preferences. It is like being a ball that bounces along: pleased and comfortable bouncing on things that agree with our beliefs and desires, upset and ill at ease with things that disagree, so we try desperately to steer ourselves to land on the desirable things.

I have seen the sliding scale fallacy a lot, even in people who are not Scientologists and people who have studied critical thinking. Sometimes people who have studied science make the error of thinking they have transcended these errors. At best, at the very best, we might, just might, be able to spot them in ourselves some of the time, not even most of the time, and might be able to slightly reduce them. That might not sound worthwhile, but think about this: if reducing your error rate stops you from committing one really huge error, or lets you spot it and change what you are doing, that can save you a lot of trouble. It can save a marriage, a job, a friendship and more. It can save a life.

With the sliding scale fallacy I have a lot of examples, a lot. One that keeps coming up is worth describing. Scientology has a tremendous amount of doctrine: perhaps tens of millions of words on tapes and written in numerous orders, policies, bulletins and books.

Within that doctrine Hubbard made hundreds and hundreds of statements on hypnosis. He studied hypnosis for decades, and numerous sources outside of Scientology have confirmed this. Many of his contemporaries who were not Scientologists support this claim.

Scientology makes extensive claims that it does not use hypnosis in any way. Almost every Scientologist will say this. Okay.

In leaving Scientology I did several things. I read the article Never Believe a Hypnotist by Jon Atack and his Scientology Mythbusting articles, all available free at The Underground Bunker blog on Scientology.

I read books on hypnosis such as Trances People Live and Hypnotism Comes of Age, watched numerous videos by hypnotists, and read many articles.


I gathered and presented evidence that Hubbard tried to covertly hypnotize people, first through Dianetics and later with Scientology.

I won't present it all here, as it includes dozens of quotes by Hubbard demonstrating his knowledge of hypnosis, and in particular his knowledge of how he was using it in Scientology and his intent to use it to covertly enslave people.

I presented definitions, explanations and examples from hypnosis to untangle what hypnosis is, the basic techniques it is built on, and the effects and phenomena those techniques create. I did this so that people, especially Scientologists and ex-Scientologists, could see that the effects and phenomena Hubbard described one way when discussing hypnosis were the same things he described completely differently when they are created in Scientology indoctrination, differently again in Scientology auditing, in Scientology ethics technology, and in Scientology administrative technology.

I also collected extensive evidence from studies and experiments in psychology and neuroscience to give scientific support to the claim that particular techniques, like confusion via contradiction, attention fixation, vivid imagery, repetition, mimicry and repetitive leading questions, can influence people.

Imagine a blackboard with a line down the middle. On one side is the evidence I gathered and presented, grouped into broad categories with the ideas summed up in little phrases.

On the side for my claim, the board is very, very full, with lots and lots of evidence for each of my main ideas.

On the other side? Sometimes I have presented some of my evidence to a Scientologist or ex-Scientologist, and they offer one or two statements like "I just can't believe it" or "but the indoctrination technology was from someone else originally!"

A couple of things: just because we cannot believe something doesn't make it false. Reality doesn't care about our opinion. Lots of unbelievable things have turned out to be true.

Scientology has hundreds of contradictory statements on hypnosis, so most people just give up and accept that, in some of them, Hubbard said Scientology is not hypnosis.

Scientology study technology has been described as being taken in part from a couple who came up with some of the concepts underlying its indoctrination procedures.

Here is a quote from The Underground Bunker blog by Tony Ortega:

More proof that Scientology used the ‘R2-45’ method to intimidate enemies
By Tony Ortega | March 17, 2015

"Charles Berner had been one of L. Ron Hubbard’s most enthusiastic early followers. From 1954 to 1957, he was even president of the Church of Scientology of California, the “mother church” of the organization. According to some oldtimers, Berner and his wife, Ava, were responsible for coming up with the concepts of “Study Tech” which Hubbard later co-opted as his own. By 1965, Hubbard “declared” Berner, and made him the church’s first “enemy number one.” End quote

Now, Chris Shelton was able to interview Ava Berner on his YouTube channel in an episode entitled The Basics of Scientology: Study Tech. I do not dispute the claim that Hubbard stole some ideas from this couple. Nor do I dispute that they sincerely thought they had found some beneficial ideas.

Hubbard took a handful of ideas that others found believable and altered both the ideas and the techniques used to implement them to such an extreme degree that he fundamentally changed what is being done with them. And in several references I have quoted, Hubbard laid out how this could be done; he needed the crucial element of a believable explanation for the indoctrination methods he came up with in order to fool people.

So, someone could put the story of the Berners on the other side if they felt it was evidence against my claims. Fine.

Here is where we really get to see motivated reasoning and the sliding scale fallacy. Imagine you are an ex-Scientologist and you have rejected Scientology, you think... but oddly you are still firm in your conviction that Scientology indoctrination is not based on covert hypnosis, because you just KNOW it cannot be, despite admitting you know nothing about hypnosis... except that Hubbard told you Scientology isn't hypnosis...

You have one anecdote: the taking of some basic terms and a handful of ideas from the Berners. But then Hubbard, by all accounts, dramatically changed many aspects of study technology and added many techniques, ideas, drills and procedures, resulting in something far more than what the Berners brought him, something far different.

Isn't the fact that Hubbard ended up with something far different enough to consider that he made it into something that does something else?

I propose that an ex-Scientologist in this position should carefully examine my claims and evidence, but I have found all too often that some will not. Their personal incredulity (a logical fallacy: believing something cannot be so because you find it hard to believe), plus a handy way to PARTIALLY explain the origin of Scientology indoctrination methods, is more than enough for some ex-Scientologists to reject any evidence and claims WITHOUT EXAMINATION OR CONSIDERATION. Some people, I will admit, do examine my claims and evidence and find them persuasive. Some examine them and are shocked. Some remain skeptical. But if you do not at least examine them, especially if you were in Scientology, how good is your reasoning on this matter?

Let me be even more clear. As we receive information, we are prone to have feelings in response to it. If the information contradicts our beliefs and behaviors, especially our deeply held values, we can experience confusion, disorientation, anxiety and discomfort, perhaps even frustration, impatience and anger.

So, in an unthinking effort to alleviate these unpleasant sensations, we can reject the information, discredit the source, double down on our own beliefs, and use all the logical fallacies I have posted here to see ourselves as right and the information we disagree with as wrong, all due to an emotional motivation: the motivation to escape unpleasant emotion and return to more pleasant emotions.

This is entirely consistent with cognitive dissonance theory.

In A Theory of Cognitive Dissonance, Leon Festinger offered some relevant ideas:

When there is a clear and unequivocal reality corresponding to some cognitive element, the possibilities of change are almost nil. (Page 27)

It would still be possible to reduce the dissonance by what also amounts to adding a new cognitive element, but of a different kind. He can admit to himself, and to others, that he was wrong. (Page 29)
Sometimes, however, the resistances against this are quite strong. (Page 29)

A person would expose himself to sources of information which he expected would increase consonance but would certainly avoid sources which would increase dissonance. (Page 30)
This just seems like human nature. It is a terrible bias toward information that fits what you want to find, with avoidance of contrary evidence. That is terrible for objective analysis, critical thinking, and the scientific method. It is also bad for relationships. It exists to greatly varying degrees in people and even varies across subjects within an individual.

So in a person considered reasonable and level-headed this may be less pronounced than in a polarized, black-and-white thinker with unshakeable certainty about everything and no doubt or personal reflection.

Unfortunately, Scientology makes many members into extremely closed-minded and blindly obedient mental slaves. As in an abusive relationship, the victim tries to pretend the abuse, and anything that could expose it, is not real.

The operation of a fear of dissonance may also lead to a reluctance to commit oneself behaviorally. (Page 30)

Hence, it is possible for dissonances to arise and to mount in intensity. A fear of dissonance would lead to a reluctance to take action - a reluctance to commit oneself. (Page 31)

There are several areas of opinion where it is notoriously difficult to change people. (Page 120)

Now, an important thing to be aware of is the tendency people have to try to avoid or minimize dissonance most of the time.

One might also expect, however, that at the initial moment of impact of the new dissonant cognition, effective processes could be initiated which would prevent the dissonant elements from ever being firmly established cognitively. One might expect to observe such things as attempts to escape or avoid further exposure, erroneous interpretation or perception of the material, or any other technique or maneuver which will help to abolish the newly introduced dissonance and to prevent the further introduction of dissonance. (Page 134)

That is a way of saying that if you realize or suspect information will go against your beliefs, automatic responses switch on that counterargue against it. Counterarguing means thinking of claims against the information, or reasons not to accept it, or using fallacies to avoid it.

Other psychological defense mechanisms can be triggered as well, including denial; all of them can have dissonance-inspiring information set as their trigger.

And there are emotional reactions that prevent even accepting the information. This is the anatomy of being closed-minded and stubborn.

Festinger describes several processes, including intentional misunderstanding, which avoids the dissonance; this can occur if the message is vague or open to multiple interpretations.

If the message is clear and not open to alternative interpretations, other methods are utilized. A person may accept a message on the surface but see exceptions, or grant that a particular example is true while denying the general principle in question.

This is strikingly similar to Hubbard's claim that "suppressive generalities" exist. Many Scientologists and exes embrace this technique to reject, without analysis, virtually any concept they wish to avoid.
Festinger quotes the conclusions of others regarding a study:
"...the prejudiced person's perception is so colored by his prejudices that issues presented in a frame of reference different from his own are transformed so as to become compatible with his own views. Quite unaware of the violation of facts he commits, he imposes on the propaganda item his own frame of reference." (Page 136)

Festinger goes on:

Denial of Reality

It sometimes happens that a large group of people is able to maintain an opinion or belief even in the face of continual definitive evidence to the contrary. Such instances may range all the way from rather inconsequential occurrences of short duration to phenomena which may almost be termed mass delusions. (Page 198)

Let us imagine a person who has some cognition which is both highly important to him and highly resistant to change. This might be a belief system which pervades an appreciable part of his life and which is so consonant with many other cognitions that changing the belief system would introduce enormous dissonance. (Page 198)
These two ideas are extremely relevant to Scientology. They almost cannot be stressed enough. Regarding maintaining mass delusions, an entire series of books could be written detailing the delusional belief system Scientology continuously requires, and the encyclopedias Hubbard assembled detailing how to install and maintain those delusions to enslave his victims mentally.

For my example I want to focus on two of the many things Festinger wrote:

When there is a clear and unequivocal reality corresponding to some cognitive element, the possibilities of change are almost nil. (Page 27)

Sometimes, however, the resistances against this are quite strong. (Page 29)

In my example, an ex-Scientologist may say "well, Hubbard stole study tech from the Berners, that is undisputed, so case closed."

But what if, instead of using good critical thinking to carefully examine the relevant evidence for the claims on both sides, the ex-Scientologist was in fact using motivated reasoning and the rest of the fallacy parade to avoid unpleasant emotions and return as quickly as possible to pleasant ones?

Not every person who examines my claims and evidence will agree with me. But there is a world of difference between rejecting claims without examination and making a good effort to really understand the case for and against ideas, considering and weighing the facts as objectively as possible.

Scientology sets people up to habitually think in fallacies, to hold deeply emotionally charged assumptions, and to fiercely defend those assumptions rather than examine them. Scientology uses emotions to make doubting ideas from Scientology incredibly uncomfortable and to make hanging onto those ideas comfortable. Those ideas usually include believing that study tech and its concepts, phenomena, interpretations of phenomena and techniques are valid and true, and that any other view, including the idea that Hubbard used hypnosis and covert persuasion in Scientology indoctrination, is inconceivable and ludicrous.

But just because Hubbard put a lot of work into getting people to believe something doesn't make it true, even if it feels like it must be true.

I have written a lot on the evidence for Hubbard having tried to covertly hypnotize people through study technology and Scientology auditing.

Here are a couple of quotes I found relatively recently:
"If you can produce enough chaos — it says in a textbook on this subject — if you can produce enough chaos you can assume the total management of a psyche — if you can produce enough chaos.
The way you hypnotize people is to misalign them in their own control and realign them under your control, which necessitates a certain amount of chaos, don’t you see?
Now, the way to win through all of this is simply to let the guy have his stable data, if they are stable data and if they aren’t, let him have some more that are stable data and he’ll win and you’ll win.
In other words, you can take any sphere — any sphere which is relatively chaotic and throw almost any stable datum into it with enough of a statement and you will get an alignment of data on that stable datum. You see this clearly?
The whole society is liable to seize upon some stupid stable datum and thereafter this becomes a custom of some sort and you have the whole field of morals and mores and so forth stretching out before your view."

Hubbard, L. R. (1955, 23 August). Axiom 53: The Axiom Of The Stable Datum. Academy Lecture Series/Conquest of Chaos, (CofC-2). Lecture conducted from Washington, DC.


"Another way to hypnotize somebody would be to put him in the middle of chaos, everything going in all directions, everybody shooting at him and suddenly throw him a stable datum, and make it a successful stable datum so that it’s all called off once — the moment he grabs this. And this gives you the entire formula of brainwashing: interrogate, question, lights, pain, upset, accusation, duress, fear, privation and we throw him the stable datum. We say, “If you’ll just adopt ‘Ughism’ which is the most wonderful thing in the world, all this will cease,” and finally the fellow says, “All right, I’m an ‘Ugh.’ ” Immediately you stop torturing him and pat him on the head and he’s all set. Ever after he would believe that the moment he deserted “Ughism,” he would be drowned in chaos and that “Ughism” alone was the thing which kept the world stable; and he would sell his life or his grandmother to keep “Ughism” going. And there we have to do with the whole subject of loyalty, except — except that we haven’t dealt with loyalty at all on an analytical level but the whole subject of loyalty is a reactive subject we have dealt with."


Author: Hubbard, L. R.
Document date: 1955, 21 September
Document title: Postulates 1,2,3,4 In Processing - New Understanding of Axiom 36

In the blog posts 1) Insidious Enslavement: Study Technology, 2) Basic Introduction to Hypnosis in Scientology, 3) Burning Down Hell - How Commands Are Hidden, Varied And Repeated In Scientology To Control You As Hypnotic Implants, and 4) Why Hubbard Never Claimed OT Feats And The Rock Bottom Basis Of Scientology, I laid out the foundation of Scientology's reliance on cognitive dissonance and hypnosis.

I elaborated on the book A Theory Of Cognitive Dissonance by Leon Festinger in a series of blog posts and combined all eleven of them into one long post, Scientology And Cognitive Dissonance Theory.
The point of this post is not for EVERYONE to read everything I put out and have identical beliefs to my own. Or even to just read everything I put out. Taking on that much is really just for the most serious students regarding cults, Scientology and persuasion and people with a strong personal stake in understanding or recovering from Scientology.

The point is that the same human nature - including the psychology of cognitive dissonance, biases and logical fallacies - that makes being and staying duped by Scientology possible is present in ALL of us. We all can be fooled and manipulated. And we all have the potential to STAY wrong if we are not meticulous in scrutinizing the things we don't want to examine, the things that clash with our worldview, that contradict our most cherished beliefs. We should look at our unexamined assumptions and ask: are they valid ? Are they well supported by facts and solid evidence ? What is the best evidence and what are the best arguments against them ? Is there new information we have not considered that is relevant to them ?

I am going to be frank. Critical thinking is HARD. It goes against many of our ingrained habits. Leonard Mlodinow wrote in his book Subliminal that our minds are very good as lawyers, selecting and manipulating information for our benefit, and very poor as scientists, carefully applying scientific standards, gathering evidence objectively both for and against our desires, and letting the quest for knowledge and truth guide us. Jonathan Haidt, in his book The Righteous Mind, uses a metaphor of the intellect as a rider and emotion as an elephant. Imagine trying to get an elephant you are riding to go somewhere when the elephant is determined to go toward something else.

I have to agree with the critical thinking expert Richard Paul, who wrote Critical Thinking with his wife Linda Elder: critical thinking is a way of doing things, an approach that takes a lot of work. It is something you have to keep carefully working at, returning to and improving upon. You have to challenge yourself and keep acquiring more and more knowledge on the multiple subjects related to critical thinking, or you are not really doing it.

I highly recommend the book Critical Thinking and the YouTube videos by Richard Paul. I wrote the blog post Cornerstones of Critical Thinking 1 - 8 Introduction to Critical Thinking here at Mockingbird's Nest.

Scientology gives many of us a gift when we reject it: we know we are gullible, we know we can be wrong, and we know that protecting that knowledge takes work. It takes learning about the ways we can be wrong, the ways we fool ourselves and keep ourselves fooled. Probably nothing short of the miraculous can remove all of the flaws and errors in human thought; to err is human, as Alexander Pope wrote. Scientology in many ways is a kind of warped mirror. It shows aspects of life that exist, but some are exaggerated and turned way up while others are almost absent. It is a surreal experience to be in Scientology and a surreal experience to honestly examine it in depth.


I hope that in examining Scientology for whatever reasons we have, no matter what background we start at, we can understand ourselves better and understand each other better.


I tried to pick these particular errors in reasoning to start with because they are the ones most likely to generate resistance to even considering that you or I can be wrong.
 

PirateAndBum

Gold Meritorious Patron
I'd break up these initial "wall of words" posts into more bite-sized chunks. Your posts are interesting but I usually don't make it all the way through. tl;dr
Maybe it has to do with the repetition and lack of conciseness.
 

strativarius

Inveterate gnashnab & snoutband
I'd break up these initial "wall of words" posts into more bite-sized chunks. Your posts are interesting but I usually don't make it all the way through. tl;dr
Maybe it has to do with the repetition and lack of conciseness.
Yes, but there's been a great improvement actually. There was a time when there were absolutely no paragraphs at all, just one solid block of hundreds of lines of text. All this doesn't belong here really, he should just give some idea of the subject matter and then give a link to the specific text on his blog he would like to be read as per 'Board Rules [6]'.
 


I want to look at how any of us can end up on the wrong side of an issue, believe something we shouldn't, and keep on believing it when we really should be using better reasoning.

I have examined things like logical fallacies and cognitive biases in the past and recommend everyone dig into those. I also have looked at rhetoric and propaganda techniques and feel the same way about them. They are essential to understanding influence and human behavior.

Here I want to focus on some specific ideas and look at how they help to get and keep us on the wrong track, how they get and keep us using poor reasoning when better thinking is possible and needed. If I had thoroughly understood these ideas, and the many others I have learned since leaving Scientology, I am confident I would not have been a candidate for recruitment by Scientology; it would have been obvious to me that it was nothing like the group it claims to be.


Scientology really encourages adopting certain ways of thinking, feeling and behaving. It encourages a mindset and attitude that is not conducive to independent and critical thinking.



The flaws that Scientology instills as bad habits in members serve as examples of how these same habits are equally bad reasoning in other contexts. What makes Scientologists into deluded dupes also makes people in other situations use poor judgment and reasoning, and frankly leaves them open to creating, holding, defending and retaining ideas that are not well thought out. Sometimes the ideas will be correct and sometimes they will be wrong, but the reasoning behind them will be poor and the confidence in them may be high. Not a good combination.


What exactly am I talking about ? Habits in thought, speech and behavior.

Let me start with some examples.

I found a great list at the Philosophy Courses website of Professor Matt McCormick, under Biases, Fallacies, and Errors in Reasoning:

"Going Nuclear is the mistake of artificially elevating the standards of proof against an opposing position and arguing that it is unjustified unless it is known with absolute, deductive, or excessive levels of certainty, particularly when one cannot meet those standards oneself, or those standards are not met for many other claims that we take to be justified. A Congressman is skeptical about the evidence for global warming. He hears many highly qualified scientists review the evidence. They explain that there is a widespread consensus among the best experts in the scientific field that global warming is happening. But the Congressman resists, saying, “Well, perhaps, but in the end is it really possible to prove it? I mean, can you prove with absolute certainty that it is happening? Especially when you might possibly be wrong?” Meanwhile, the Congressman’s doctor tells him that the test results indicate a good chance that he has a bacterial infection, so the Congressman agrees to take antibiotics, even though the evidence for the diagnosis is preliminary. Specifically, the Going Nuclear mistake is elevating the standards of proof to a level that cannot be satisfied by any amount of evidence. It invokes global or extreme skepticism in order to refuse a conclusion. Motivated Reasoning, as we have explained in here, is applying different standards of proof to evidence that is for or against a view you already hold. "

This is one of my favorites. It is easy to see. No evidence is good enough for a position you oppose. Video, studies by experts, scientific consensus, eyewitnesses, mountains of physical evidence, none of it matters.

Someone says a politician you support is a sex criminal. Numerous women making accusations, video and audio tape of the politician bragging about being a sexual predator and assaulting women, numerous supporting witnesses and incidents of infidelity and affairs and hundreds of incidents of insulting women do not matter.

Scientology encourages, even demands via many methods, that you either see Scientology doctrine as infallible and sacred and Scientology founder Ron Hubbard as impeccable in character and accomplishment, or there is no place for you in Scientology.

When someone online says "prove it to me" regarding an idea they are adamantly rejecting, it is an example of going nuclear. If they won't examine or consider any evidence, arguments or ideas that could disagree with the idea they are defending, then there is no proving anything to them except what they already know. If they won't listen, and they "know" they don't need to read anything that disagrees with them because it is wrong simply because it disagrees, then they have already closed their mind, whether they know it or not.

Another quote from Matt McCormick is useful:

"Non-disconfirmable hypotheses: A thinker is advocating a non-disconfirmable hypothesis when there are no circumstances, no evidence, no arguments, or no scenarios, even hypothetically, under which she would acknowledge that it has been disconfirmed or that it is false." End quote

You need to recognize this. Sometimes you can ask "what would convince you your idea is wrong ?" and if they say "nothing" then you know what is going on. Sometimes they will treat a standard that is not always reliable as the only acceptable standard.

They might demand a behavior or admission from someone that is almost impossible to get before they will consider something that is quite possible without that exact and difficult-to-obtain proof.

We all can be guilty of this. Scientology has no monopoly on poor reason.

But it is hard to realize that you are closed minded while pretending to be reasonable and open minded. Scientology successfully instilled that attitude and approach in me, and a series of fortunate events over decades incrementally freed me up enough to reconsider the underlying assumptions I had been following routinely.

(I described the whole process in extreme detail in the blog post Getting Into and Getting Out of Scientology - The Lies That Bind, also available at Mockingbird's Nest blog on Scientology.)

It is a great exercise to think of times that real people have "gone nuclear." They can be famous or not, as long as they actually did this. They can be of any political type, any religion or non-religious, smart or dumb, rude or polite, and they can be right or wrong. The approach and mindset is the key, not anything else. In trying to understand fallacies and biases, I have found that finding any examples is the best way to start. Most people easily find examples in those they disagree with; moving on to people you agree with is harder, and finding times you yourself used the error is hardest of all, but crucial for dealing with it.

You cannot deal with errors in reasoning by just acknowledging them in other people. If everyone sees it as the other person's problem, then no one will do anything but point fingers.

Scientology gave me the gift of realizing that I had made tremendous, unimaginable, errors in reasoning and had done it for decades and had supreme confidence I was totally right.

Many people never get to realize they followed a false prophet and a fraudulent prophecy. I did.

I want to stick with the errors and expand on them.

More from Matt McCormick:

"Motivated Reasoning: People criticize preference inconsistent information with excessive skepticism, while lowering those critical standards for information that would corroborate favored beliefs. That is, they are more critical and demand a higher level of evidence concerning conclusions that conflict with things they already believe, and they are less thoughtful or skeptical when evidence supports their favored views. A prior held belief steers the search for and analysis of information rather than an unbiased gathering and evaluation of evidence leading to the most reasonable conclusion. Motivated reasoning may or may not commit confirmation bias as well. Motivated reasoning is reasoning that is directed at achieving a particular conclusion, no matter what the truth or the evidence indicates. Confirmation bias is a particular kind of filtering of the evidence that leads to a conclusion."

"Confirmation Bias is the mistake of looking for evidence that confirms a favored conclusion while neglecting or ignoring evidence that would disprove it. Susan reads her horoscope and it tells her that Virgos are outgoing. She believes in astrology, and she searches her memory for times when she’s been outgoing. She thinks of a few cases where it seemed to be accurate and concludes that astrology works. Juan thinks that he has prescient dreams, or dreams that tell the future. Out of the thousands and thousands of dreams he’s had over the years, he easily recalls the one or two times that he dreamt about some event and then the next day it seemed to happen. He fails to notice that the vast majority of his dreams did not work out like this. Those thousands of other dreams are easily forgotten and neglected."


"The Umpire Effect is a particular form of motivated reasoning. Enthusiastic fans of sports teams tend to find fault in referee calls that go against their team and accuse the umpire of bias, but they will be more satisfied with referee calls that favor their team. Conservatives often argue that the news media is too liberal, while liberals insist that the news media is too conservative."



"The Sliding Scale Fallacy: Motivated reasoners will apply a more skeptical, critical set of standards against the evidence for views that they oppose, while letting mistakes, sloppiness, or a failure to meet those same standards go in cases where evidence or arguments are being presented in favor of conclusions they favor. See Motivated reasoning."

The last few items correspond well to cognitive dissonance theory. Cognitive dissonance theory is a subject in its own right, it is far more than a definition or paragraph or even a book could fully describe. I recommend the book A Theory of Cognitive Dissonance by Leon Festinger for anyone who really wants to understand human thought and behavior. I also wrote the series on cognitive dissonance theory here at Mockingbird's Nest blog on Scientology entitled
Scientology And Cognitive Dissonance Theory.

Motivated reasoning is a huge factor in being biased, applying unfair standards to evidence and thinking that you are being reasonable and logical.

How many times have we accepted as true an article or meme that agrees with what we already believe or want to believe ? And how many times have we doubted or outright dismissed the articles and memes we find that disagree with our beliefs or preferences ?

It is human nature. It takes a lot of hard work, practice and persistence to even spot this in ourselves a little bit.

We usually are carried along by deeply held unexamined assumptions, seeking to feel good, or at least content, as we find evidence that our beliefs are correct, and we often are uncomfortable, confused or disoriented when we find evidence against our beliefs and preferences. It is like being a ball that bounces along, pleased and comfortable bouncing on things that agree with our beliefs and desires, and upset and ill at ease with things that disagree with them, so we try desperately to steer ourselves to land on the desirable things.

I have seen the sliding scale fallacy a lot, even in people who are not Scientologists and people who have studied critical thinking. Sometimes people who have studied science make the error of thinking they have transcended these errors. At best, at the very best, we might just be able to spot them in ourselves some of the time, not even most of the time, and might just be able to slightly reduce them in ourselves. That might not sound worthwhile, but think about this: if you reduce your error rate and it stops you from committing one really huge error - or makes it so you can spot it and change what you are doing - that can save you a lot of trouble. It can save a marriage, a job, a friendship and more. It can save a life.

With the sliding scale fallacy I have a lot of examples, a lot. One that keeps coming up is worth describing. Scientology has a tremendous amount of doctrine, perhaps tens of millions of words on tapes and written in numerous orders, policies, bulletins and books.

Within that doctrine Hubbard wrote hundreds and hundreds of statements on hypnosis. He studied hypnosis for decades, and numerous sources outside of Scientology have confirmed this. Many of his contemporaries who were not Scientologists support this claim.

Scientology claims extensively that it does not use hypnosis in any way. Almost every Scientologist will say this. Okay.

In leaving Scientology I did several things. I looked at the article Never Believe a Hypnotist by Jon Atack and his Scientology Mythbusting articles that are all available free at The Underground Bunker blog on Scientology.

I read books on hypnosis such as Trances People Live and Hypnotism Comes of Age, I watched numerous videos by hypnotists and read many articles.


I gathered and presented evidence that Hubbard tried to covertly hypnotize people first through Dianetics and later with Scientology.

I won't present it all here now as it includes dozens of quotes by Hubbard to demonstrate his knowledge of hypnosis and in particular his knowledge of how he was using it in Scientology and his intent to use it to covertly enslave people.

I presented definitions, explanations and examples from hypnosis to untangle what hypnosis is, the basic techniques it is built on, and the effects and phenomena these techniques create, so people, especially Scientologists and ex Scientologists, could see that the effects and phenomena Hubbard described one way when discussing hypnosis were the same things he described completely differently when they are created in Scientology indoctrination, and differently again in Scientology auditing, in Scientology ethics technology and in Scientology administrative technology.

I also collected extensive evidence from studies and experiments in psychology and neuroscience to give scientific support to the claims that particular techniques like confusion via contradiction, attention fixation, vivid imagery, repetition, mimicry and repetitive leading questions can influence people.

Imagine a blackboard with a line down the middle. On one side is the evidence I gathered and presented. It is grouped into broad categories and the ideas are summed up into little phrases.

On the side for my claim is a very, very full page with lots and lots of evidence for each of my main ideas.

On the other side ? Sometimes I have presented some of my evidence to a Scientologist or ex Scientologist and they present one or two statements like "I just can't believe it" or "but the indoctrination technology was from someone else originally !"

A couple of things: just because we cannot believe something doesn't make it false. Reality doesn't care about our opinion. Lots of unbelievable things have turned out to be true.

Scientology has hundreds of contradictory statements on hypnosis so most people just give up and accept that Hubbard said Scientology is not hypnosis in some of them.

Scientology study technology has been described as taken in part from a couple who came up with some of the concepts behind its indoctrination procedures.

Here is a quote from The Underground Bunker blog by Tony Ortega:

More proof that Scientology used the ‘R2-45’ method to intimidate enemies
By Tony Ortega | March 17, 2015

"Charles Berner had been one of L. Ron Hubbard’s most enthusiastic early followers. From 1954 to 1957, he was even president of the Church of Scientology of California, the “mother church” of the organization. According to some oldtimers, Berner and his wife, Ava, were responsible for coming up with the concepts of “Study Tech” which Hubbard later co-opted as his own. By 1965, Hubbard “declared” Berner, and made him the church’s first “enemy number one.” End quote

Now, Chris Shelton conducted an interview with Ava Berner on his YouTube channel, in a video entitled The Basics of Scientology: Study Tech. I do not dispute the claim that Hubbard stole some ideas from this couple. I also do not dispute the claim that they sincerely thought they had found some beneficial ideas.

Hubbard took a handful of ideas that others found believable and altered both the ideas and the techniques they are implemented with to such an extreme degree as to fundamentally change what is being done with them. And in several references I quoted, Hubbard laid out how this could be done; what he needed was the crucial element of a believable explanation for the indoctrination methods he came up with, in order to fool people.

So, someone could put the story of the Berners on the other side if they felt it was evidence against my claims. Fine.

Here is where we really get to see motivated reasoning and the sliding scale fallacy. Imagine you are an ex Scientologist and you have rejected Scientology, you think...but oddly you are still firm in your conviction that Scientology indoctrination is not based on covert hypnosis, because you just KNOW it cannot be, despite admitting you know nothing about hypnosis...except that Hubbard told you Scientology isn't hypnosis...

You have one anecdote: the taking of some basic terms and a handful of ideas from the Berners. But then Hubbard, by all accounts, dramatically changed many aspects of study technology and added many techniques, ideas, drills and procedures, resulting in something far more than what the Berners brought him, something far different.

Isn't the fact that Hubbard ended up with something far different enough to consider that he made it into something that does something else ?

I propose that the ex Scientologist in this position should carefully examine my claims and evidence, but I have found all too often that some will not. Their personal incredulity (a logical fallacy: believing something cannot be because you find it hard to believe) and a handy way to PARTIALLY explain the origin of Scientology indoctrination methods are more than enough for some ex Scientologists to reject any evidence and claims WITHOUT EXAMINATION OR CONSIDERATION. Some people, I will admit, do examine my claims and evidence and find them persuasive. Some examine them and are shocked. Some remain skeptical. But if you do not at least examine them, especially if you were in Scientology, how good is your reasoning on this matter ?

Let me be even more clear. As we receive information we are prone to have feelings in response to the information. If the information contradicts our beliefs and behaviors, especially our deeply held values, we can experience confusion, disorientation, anxiety and discomfort, perhaps even frustration, impatience and anger.

So, in an unthinking effort to alleviate these unpleasant sensations, we can reject the information, discredit the source, double down on our own beliefs and use all the logical fallacies I posted here to see ourselves as right and the information we disagree with as wrong, due to an emotional motivation. It is a motivation to escape unpleasant emotion and return to more pleasant emotions.

This is entirely consistent with cognitive dissonance theory.

In A Theory of Cognitive Dissonance Leon Festinger gave some relevant ideas:

When there is a clear and unequivocal reality corresponding to some cognitive element, the possibilities of change are almost nil. (Page 27)

It would still be possible to reduce the dissonance by what also amounts to adding a new cognitive element, but of a different kind. He can admit to himself, and to others, that he was wrong. (Page 29)
Sometimes, however, the resistances against this are quite strong. (Page 29)

A person would expose himself to sources of information which he expected would increase consonance but would certainly avoid sources which would increase dissonance. (Page 30)
This just seems like human nature. It is a terrible bias toward information that fits what you want to find, with avoidance of contrary evidence. This is terrible for objective analysis or critical thinking or scientific method. It is also bad for relationships. It exists to greatly varying degrees in people and even varies regarding different subjects within an individual.
So in a person considered reasonable and level headed this may be less than in a black and white thinker who is polarized, with unshakeable certainty on everything with no doubt or personal reflection.
Unfortunately, Scientology makes many members into extremely close minded and blindly obedient mental slaves. In this abusive relationship the victim tries to pretend the abuse and anything that could expose it are not true.
The operation of a fear of dissonance may also lead to a reluctance to commit oneself behaviorally. (Page 30)
Hence, it is possible for dissonances to arise and to mount in intensity. A fear of dissonance would lead to a reluctance to take action-a reluctance to commit oneself. (Page 31)

There are several areas of opinion where it is notoriously difficult to change people. (Page 120)

Now an important thing to be aware of is the tendency people have to try to avoid or minimize dissonance most of the time.
One might also expect, however, that at the initial moment of impact of the new dissonant cognition, effective processes could be initiated which would prevent the dissonant elements from ever being firmly established cognitively. One might expect to observe such things as attempts to escape or avoid further exposure, erroneous interpretation or perception of the material, or any other technique or maneuver which will help to abolish the newly introduced dissonance and to prevent the further introduction of dissonance. (Page 134)
That is a way to say that if you realize or suspect information will be against your beliefs automatic responses are switched on that can counterargue against the information. Counterarguing is thinking of claims against the information or reasons to not accept it or using fallacies to avoid the information.
Other psychological defense mechanisms can be triggered as well, including denial; any of them can have dissonance-inspiring information set as a trigger, along with emotional reactions that prevent even accepting the information. This is the anatomy of being closed minded and stubborn.
Festinger describes several processes, including intentional misunderstanding, which avoids the dissonance; this can occur if the message is vague or open to multiple interpretations.
If the message is clear and not capable of alternative interpretations, then other methods are utilized. A person may accept a message on the surface but see exceptions, or grant that a particular example is true while denying the general principle in question.
This is strikingly similar to Hubbard's claim that "suppressive generalities" exist. Many Scientologists and exes embrace this technique to reject without analysis virtually any concepts they wish to avoid.
Festinger quotes the conclusions of others regarding a study:
"...the prejudiced person's perception is so colored by his prejudices that issues presented in a frame of reference different from his own are transformed so as to become compatible with his own views. Quite unaware of the violation of facts he commits, he imposes on the propaganda item his own frame of reference." (Page 136)

Festinger goes on:
Denial of Reality
It sometimes happens that a large group of people is able to maintain an opinion or belief even in the face of continual definitive evidence to the contrary. Such instances may range all the way from rather inconsequential occurrences of short duration to phenomena which may almost be termed mass delusions. (Page 198)
Let us imagine a person who has some cognition which is both highly important to him and highly resistant to change. This might be a belief system which pervades an appreciable part of his life and which is so consonant with many other cognitions that changing the belief system would introduce enormous dissonance. (Page 198)
These two ideas are extremely relevant to Scientology; they almost cannot be stressed enough. Regarding maintaining mass delusions, an entire series of books could be written detailing the delusional belief system Scientology continuously requires and the encyclopedias Hubbard assembled detailing how to install and maintain these delusions to enslave his victims mentally.

For my example I want to focus on two of the many things Festinger wrote:

When there is a clear and unequivocal reality corresponding to some cognitive element, the possibilities of change are almost nil. (Page 27)

Sometimes, however, the resistances against this are quite strong.(Page 29)

In my example an ex Scientologist may say "well, Hubbard stole study tech from the Berners, that is undisputed, so case closed."

But what if, instead of using good critical thinking to carefully examine the relevant evidence for the claims of both sides, the ex Scientologist was in fact using motivated reasoning, and the rest of the fallacy parade, to avoid unpleasant emotions and return as quickly as possible to pleasant ones?

Not every person who examines my claims and evidence will agree with me. But there is a world of difference between making a good effort to really understand the case for and against ideas, considering and carefully weighing the facts as objectively as possible, and merely defending a conclusion already reached.

Scientology sets people up to habitually think in fallacies, to hold deeply emotionally charged assumptions, and to fiercely defend those assumptions rather than examine them. Scientology uses emotions to make doubting its ideas incredibly uncomfortable and hanging onto them comfortable. Those ideas usually include believing that study tech and its concepts, phenomena, interpretations of phenomena and techniques are valid and true, and that any other view, including the idea that Hubbard used hypnosis and covert persuasion in Scientology indoctrination, is inconceivable and ludicrous.

But just because Hubbard put a lot of work into getting people to believe something doesn't make it true, even if it feels like it must be true.

I have written a lot on the evidence for Hubbard having tried to covertly hypnotize people through study technology and Scientology auditing.

Here are a couple quotes I found relatively recently:
"If you can produce enough chaos — it says in a textbook on this subject — if you can produce enough chaos you can assume the total management of a psyche — if you can produce enough chaos.
The way you hypnotize people is to misalign them in their own control and realign them under your control, which necessitates a certain amount of chaos, don’t you see?
Now, the way to win through all of this is simply to let the guy have his stable data, if they are stable data and if they aren’t, let him have some more that are stable data and he’ll win and you’ll win.
In other words, you can take any sphere — any sphere which is relatively chaotic and throw almost any stable datum into it with enough of a statement and you will get an alignment of data on that stable datum. You see this clearly?
The whole society is liable to seize upon some stupid stable datum and thereafter this becomes a custom of some sort and you have the whole field of morals and mores and so forth stretching out before your view."

Hubbard, L. R. (1955, 23 August). Axiom 53: The Axiom Of The Stable Datum. Academy Lecture Series/Conquest of Chaos, (CofC-2). Lecture conducted from Washington, DC.


"Another way to hypnotize somebody would be to put him in the middle of chaos, everything going in all directions, everybody shooting at him and suddenly throw him a stable datum, and make it a successful stable datum so that it’s all called off once — the moment he grabs this. And this gives you the entire formula of brainwashing: interrogate, question, lights, pain, upset, accusation, duress, fear, privation and we throw him the stable datum. We say, “If you’ll just adopt ‘Ughism’ which is the most wonderful thing in the world, all this will cease,” and finally the fellow says, “All right, I’m an ‘Ugh.’ ” Immediately you stop torturing him and pat him on the head and he’s all set. Ever after he would believe that the moment he deserted “Ughism,” he would be drowned in chaos and that “Ughism” alone was the thing which kept the world stable; and he would sell his life or his grandmother to keep “Ughism” going. And there we have to do with the whole subject of loyalty, except — except that we haven’t dealt with loyalty at all on an analytical level but the whole subject of loyalty is a reactive subject we have dealt with."


Hubbard, L. R. (1955, 21 September). Postulates 1,2,3,4 In Processing - New Understanding of Axiom 36.

In the blog posts 1) Insidious Enslavement: Study Technology, 2) Basic Introduction to Hypnosis in Scientology, 3) Burning Down Hell - How Commands Are Hidden, Varied And Repeated In Scientology To Control You As Hypnotic Implants and 4) Why Hubbard Never Claimed OT Feats And The Rock Bottom Basis Of Scientology I took on laying out how the foundation of Scientology relies on cognitive dissonance and hypnosis.
I elaborated on the book A Theory Of Cognitive Dissonance by Leon Festinger in a series of blog posts. I combined all eleven of them together in one long post Scientology And Cognitive Dissonance Theory.
The point of this post is not for EVERYONE to read everything I put out and have identical beliefs to my own, or even to just read everything I put out. Taking on that much is really just for the most serious students of cults, Scientology and persuasion, and for people with a strong personal stake in understanding or recovering from Scientology.

The point is that the same human nature - including the psychology of cognitive dissonance, biases and logical fallacies - that makes being and staying duped by Scientology possible is present in ALL of us. We all can be fooled and manipulated. And we all have the potential to STAY wrong if we are not meticulously scrupulous regarding the things we don't want to examine, the things that clash with our worldview, that contradict our most cherished beliefs. We should look at our unexamined assumptions and ask: are they valid? Are they well supported by facts and solid evidence? What is the best evidence and the best arguments against them? Is there new information we have not considered that is relevant to them?

I am going to be frank: critical thinking is HARD. It goes against many habits that come to us as first nature. Leonard Mlodinow wrote in his book Subliminal that our minds are very good as lawyers, selecting and manipulating information for our benefit, and very poor as scientists, carefully applying scientific standards, gathering evidence objectively both for and against our desires, and letting the quest for knowledge and truth guide us. Jonathan Haidt in his book The Righteous Mind uses a metaphor of the intellect as a rider and emotion as an elephant. Imagine trying to steer an elephant you are riding when the elephant is determined to go somewhere else.

I have to agree with the critical thinking expert Richard Paul, who wrote Critical Thinking with his wife Linda Elder: critical thinking is a way of doing things, an approach that takes a lot of work. It is something you have to keep carefully working at, returning to and improving upon. You have to challenge yourself and keep gaining more knowledge on the multiple subjects related to critical thinking or you are not really doing it.

I highly recommend the book Critical Thinking and the YouTube videos by Richard Paul. I wrote the blog post
Cornerstones of Critical Thinking 1 - 8 Introduction to Critical Thinking here at Mockingbird's Nest.

Scientology gives many of us a gift when we reject it. We know we are gullible, we know we can be wrong, and keeping hold of that knowledge takes work. It takes learning about the ways we can be wrong and the ways we fool ourselves and keep ourselves fooled. Probably nothing short of the miraculous can remove all of the flaws and errors in human thought; to err is human, as Alexander Pope wrote. Scientology in many ways is a kind of warped mirror. It shows aspects of life that exist, but some are exaggerated and turned way up while others are almost absent. It is a surreal experience to be in Scientology and a surreal experience to honestly examine it in depth.


I hope that in examining Scientology for whatever reasons we have, no matter what background we start at, we can understand ourselves better and understand each other better.


I tried to pick the particular errors in reasoning I did to start this with because they are the ones most likely to generate resistance to even considering that we ourselves can be wrong.
I would be interested in knowing who you are writing these dissertations for. They are quite a bit of writing and seem to be well thought out and researched.

Obviously, they are not for me, I think they are very tl;dr, I don't need the opinion/information and I do not like being lectured to -- so, out of curiosity, I would like to know who you are writing to?
 

Clay Pigeon

Gold Meritorious Patron
Wow!

A little too long for a full reading but I gave it a quality skim.

Twentyfive years in hunh? I can't blame you for washing it all off

How much staff?

Sea org?

How far on grade chart?
 

mockingbird

Silver Meritorious Patron
Wow!

A little too long for a full reading but I gave it a quality skim.

Twentyfive years in hunh? I can't blame you for washing it all off

How much staff?

Sea org?

How far on grade chart?
I was on and off staff for a few years: join for two months or so, not get paid, get a real job in frustration, then try coming back or go part time and work another job. I was in the training program for KTL and LOC at LA. I was in the Sea Org and at the Excalibur building for one summer. I spent a lot of time, hundreds, probably thousands of hours in a Scientology course room.
 

Dotey OT

Cyclops Duck of the North - BEWARE
I know this will seem non-sequitur, but just bear with me.

I once broke up with a person, and it really hurt, hard. In the midst of all the anguish, I thought it wise to develop sort of a "wish list" (for lack of a better term, screw "wanted and needed" lol) for a mate. I wrote this up as sort of a "Don't want this, but want this" sort of list. Rather extensive. My soon to be ex and I got through that period of time, and we both politely, and as friends, went our separate ways. We are still friends to this day. I went on and had several relationships before I settled down with my current partner. Years later I found that list, and oh boy was it funny! Most of it was me just trying like hell to find reasons for the break up. These reasons at the time seemed really good!! But looking back at it, it was me just obviously being quite a bit butt hurt.

Looking at this post, and others like it from before, I see lots of interesting data, and that data does cause me to look at my scn experience. I too was in about 25 years, and have been out now about two or three years. I don't go through a day where I don't trip into some wedged-in thinking disability, either self-installed or surgically implanted by flubtard via auditing, ethics or whatever. And I know there's lots more. I also know I am not the easiest person to get along with, but I do have some good in me!!

But my main point is this: Hell, I need no more reason than me just knowing that there was lots wrong with it to dispense with it. I don't know your particular experience, and I hope for you that you haven't lost a dear one or ones as a result.

Oh, and I do read your posts, I hope this doesn't come across as a criticism.

I would like to hear about your experiences too!
 

Clay Pigeon

Gold Meritorious Patron
Wow

Another over two decades in...

The damn thing has its damn totalitarian outlook and structure

Especially since Ron died.

It was non authoritarian in the early days and still loose enough when I had my measly Three years active

You longtermers seem to be consistent in having what I've found of lasting value so fused with the koolaid you have to shitcan it all
 

Dotey OT

Cyclops Duck of the North - BEWARE
You longtermers seem to be consistent in having what I've found of lasting value so fused with the koolaid you have to shitcan it all

Well, if flubtard said it's better to have more money than less money, I suppose that I'd have to agree with him.
 

mockingbird

Silver Meritorious Patron
The Easiest Person to Fool 2 Hierarchy of Disagreement


"The first principle is that you must not fool yourself, and you are the easiest person to fool."
from lecture "What is and What Should be the Role of Scientific Culture in Modern Society", given at the Galileo Symposium in Italy (1964) Richard Feynman American theoretical physicist

“There are two ways to be fooled. One is to believe what isn't true; the other is to refuse to believe what is true.”

― Soren Kierkegaard


Paul Graham wrote an article about how to disagree. It is in my opinion useful for people who want to use reason in debates or conversation. He lays out a hierarchy of disagreement. I think it is beneficial because we are emotional creatures who occasionally reason (I didn't come up with that idea and do not know who to credit the quote to).

To put it simply: when we descend to name calling, personal insults and similar levels of argument, we are using poor critical thinking. That isn't a matter of my feelings or preference.


Serious examinations of the brain often group its regions into three broad sections. Various theories on this have emerged; one, the triune brain theory, has been discredited as an account of the exact details of human evolution, but it remains a useful classification for a high-school-level understanding. At a higher level of education it is superseded by more nuanced models.

In the most basic details it's accurate enough for our purposes here. There's a reptile brain section that does jobs like regulating breathing, eating, heart rate and primitive emotions regarding fear and aggression. It's the fight or flight associated section. All vertebrate animals have this brain structure.
Second is the limbic system, which is more complex and the source of social perception. It's seen as instrumental in the formation of social emotions. In humans it's often defined as including the ventromedial prefrontal cortex (associated with very emotional decisions), the dorsal anterior cingulate cortex, the amygdala (associated with fear and aggression), the hippocampus, the hypothalamus, components of the basal ganglia and the orbitofrontal cortex. You don't have to memorize all those regions. Some are described in study after study and their association with primitive emotions becomes obvious; others are described less often.

Many structures in the limbic system are often grouped together and called the old mammalian brain. All mammals have these structures on top of the reptile brain.

On top of this is the neocortex, or new cortex, in the new mammalian brain. We as humans even have prefrontal regions no other creatures have. Our social complexity, our ability to recognize things like emotions via facial expressions, and our grasp of Theory Of Mind greatly transcend all other living creatures; our highly developed new mammalian brains and unique prefrontal cortex give us the edge in these aptitudes and others unique to our species. It's debatable whether there's intelligent life out there, but the closest thing we have found yet on Earth is ourselves, immodest as that sounds.

I have to stress that the reptile brain, old mammalian brain and new mammalian brain scheme is something of an oversimplification by the standards of a college-level understanding of the brain, but it is a useful metaphor for our purposes.

There's a portion of the brain that professor Robert Sapolsky associates with very emotional reactions and decisions, the ventromedial prefrontal cortex or VMPC. It has a counterpart, the dorsolateral prefrontal cortex or DLPC. Sapolsky describes the VMPC as very emotional and the DLPC as deliberate, associated with careful and logical decisions.

Here are some quotes from psychologist Nicole Currivan.

She was quoted in an article entitled The Neuroscience of How Personal Attacks Shut Down Critical Thinking. I will use some excerpts.

"First we need to know a bit about two regions of the brain that are fairly at odds with one another.
The prefontal cortex, which…is in the front of the brain if you’re facing forward. And the limbic system, which… is a huge chunk of many regions in the center of the brain. The pre-fontal cortex is our executive function. It helps us plan and decide what actions best meet our needs and is responsible for social inhibition, personality, and processing new information. It’s the part that says “you could have garlic bread tonight but you also don’t want to sit alone in the corner”.

The limbic system…is responsible for emotions and formation of memory. It reminds you that you love garlic bread and you were really embarrassed, too, the last time you ate it and no one sat next to you. So the important point about these two areas is: activation of one region generally results in deactivation or inhibition of the other, so they have an inverse relationship. This is because in situations of low or moderate stress, the prefontal cortex inhibits the amygdala. The amygdala is responsible for emotions that relate to the four F’s: fight, flight, feeding–and mating…
So it makes us feel things like fear, reward, and anger that normally the prefrontal cortex can respond to with a spot of reason and inhibition. In a normal, low stress situation, you want the garlic bread or the cookie, for example, but you can decide whether or not to eat it because your prefrontal cortex is still engaged. And as your stress level may increase it gets harder to make those choices. Your rational thought capacity is there less and less and less to police your emotions when stress increases.

And this is where things can get ugly. If something extremely stressful happens that lights up the amygdala, it has the power to shut down the prefrontal cortex completely. It has this fight or flight or freeze response…and it’s instantaneous. It’s something that evolved for situations in which there is no time for decision making. You can’t think about whether you want garlic bread, you have to drop it and run when you’re confronted with a tiger. And that’s incidentally why people don’t eat when they are stressed, and a lot of other things that happen to our body as part of the stress response.

So there are times, high stress times, when executive decision making processes go completely down to the count and our emotions take over. By threatening somebody, whether it’s real or perceived, you can completely disable people’s ability to think straight. And this isn’t all or nothing, it’s on a continuum. A threat can be anything that causes stress from the tiger to just an uncomfortable thought. The level of stress will influence the amount of rational thought vs. emotion that’s available and it’s totally subjective to the perceived experience of stress.

And, adding to that, increased stress and emotion can influence memory. More emotion leads to stronger memories. And those memories last longer, especially if it’s a negative emotion. We all remember where we were the morning of 9/11. Last Tuesday? Not so much. And it makes sense that our brains do this since emotions fear and anger are about events we really want to be prepared for in case they happen again. At this point you’ve probably figured out that if your goal is to get someone to process new information and think critically about stereotypes (like [that] atheists are criminals or they should die) the absolute last thing we want is for them to feel threatened or attacked. The worst part about this is if you combine the process I just described with the sorts of negative emotional responses triggered by stereotypes and other biases, you can see that someone, if they’re all stressed by their perception of you…you’ve lost them, they’re not going to be able to listen. And you’ve additionally probably just given them a fun bad new memory to hang onto. "

"First, as fun as some of you may think it is to attack and argue and ridicule people, just be aware that that will legitimately slam the door to rational understanding–of any point you have. And if you can’t call the discussion you’re having calm and rational, you are in serious danger of indulging your own emotional satisfaction to the point where you’re reinforcing someone’s distrust in all of us. And starting with the premise that someone needs to change or the inherent assumption that “I know more than you” will definitely create a strong stress response and pushback as well. Something we all inherently know but we do it anyway.

Second, if you want to reduce stigma, it’s essential to reduce limbic system activation as much as possible whenever you’re talking to somebody. Any kind of threat, real or perceived, in the current moment or even just something they remember, something bad that they remember about the stigma that’s on the person they’re talking with, can shut down their ability to take in new information. And shuts down possibility for change. So fear is really the enemy of trust in this case and it’s mistrust that the studies have found people have for atheists.

Third, if you want to change people’s opinion of you, making the conversation rewarding for them will definitely increase the likelihood that will happen. The less stressed they are, the more their brain will receive and process new information.

Fourth, I haven’t even begun to scratch the surface here with applicable brain science, but consider that emotions are highly contagious. And, unfortunately, negative emotions are more contagious than positive ones. So your stress will definitely spread throughout a room. And it also doesn’t work to hide your stress from people because it actually makes their blood pressure go up if you try. So don’t try to change people’s thinking about you if you’re stressed or in a bad mood, just wait until you can be calm and pleasant so it can be rewarding for everybody. " end quote. Nicole Currivan psychologist

I could dig up many more articles on the brain, the limbic system, the amygdala and executive function, but the currently accepted hypothesis (like any hypothesis, it could be falsified in the future) is that when our limbic system and amygdala are triggered, our critical thinking is impaired. We think poorly when we are swept up in strong emotions, particularly fear, anger and, as Jon Atack pointed out to me, disgust.

Anyone very interested in this can read the simple introductory book The Brain by David Eagleman, the absolutely brilliant Subliminal by Leonard Mlodinow, or the truly challenging, in-depth Behave by Robert Sapolsky.

I wrote a long post on Subliminal at Mockingbird's Nest blog on Scientology and want to point out a few things that correspond with points Nicole Currivan made.

In chapter 7 (Sorting People and Things) of his book Subliminal, Leonard Mlodinow took on the human tendency to place people and things in categories. He started with the example of a list of twenty groceries being difficult to remember just from hearing them said aloud. But if they are sorted into categories like vegetables, cereals, meats, snacks etc then it's easier to remember them.


Mlodinow wrote, "categorization is a strategy our brains use to more efficiently store information." (Page 145)


"Every object and person we encounter in the world is unique, but we wouldn't function very well if we perceived them that way. We don't have the time or the mental bandwidth to observe and consider each detail of every item in our environment." (Page 146)

Mlodinow wrote, "If we conclude that a certain set of objects belongs to one group and a second set of objects to another, we may then perceive those in different groups as less similar than they really are. Merely placing objects in groups can affect our judgment of those objects. So while categorization is a natural and crucial shortcut, like our brain's other survival-oriented tricks, it has its drawbacks." (Page 147)


Mlodinow described an experiment in which people were asked to judge the length of lines. Researchers put several lines in a group A and others in a group B. They found that people judged lines within the same group to be closer in length than they actually were, and lines from different groups to be more different in length than they really were. Similar experiments have been done with color differences, and with temperature estimates: a change over thirty days spanning from the middle of one month to the middle of the next is judged as more extreme than the same change within a single month. Same number of days, but just saying it's a different month increases the estimate of change.


The implications are stunning. If people can be placed in categories and thought of as fundamentally defined by those categories we easily can misjudge people.


This reminds me of a terrible quote:
“The leader of genius must have the ability to make different opponents appear as if they belonged to one category. ” ―Adolf Hitler


That's a reminder of a terrible problem with human behavior and categorization.


Mlodinow wrote, "In all these examples, when we categorize, we polarize. Things that for one arbitrary reason or another are identified as belonging to the same category seem more similar to each other than they really are, while those in different categories seem more different than they really are. The unconscious mind transforms fuzzy differences and subtle nuances into clear-cut distinctions. Its goal is to erase irrelevant detail while maintaining information on what is important. When that's done successfully, we simplify our environment and make it easier and faster to navigate. When it's done inappropriately, we distort our perceptions, sometimes with results harmful to ourselves and others. That's especially true when our tendency to categorize affects our view of other humans--when we view the doctors in a given practice, the attorneys in a given law firm, the fans of a certain sports team, or the people in a given race or ethnic group as more alike than they really are." (Page 148)


Mlodinow wrote on how the term "stereotype" was created by French printer Firmin Didot in 1794. It was a printing process that created duplicate plates for printing. With these plates mass production via printing was possible.


It got its modern use by Walter Lippmann in his 1922 book Public Opinion. Lippmann is perhaps best known nowadays as a person frequently quoted by noted intellectual and American dissident Noam Chomsky. Chomsky has criticized the use of propaganda to manage populations by the government, wealthy individuals, corporations and media.


From Subliminal Mlodinow quoted Lippmann, "The real environment is altogether too big, too complex, and too fleeting for direct acquaintance...And although we have to act in that environment, we have to reconstruct it on a simpler model before we can manage with it." (Page 149) Lippmann called that model stereotype.


Lippmann in Mlodinow's estimation correctly recognized the source of stereotypes as cultural exposure. In his time newspapers, magazines and the new medium of film communicated in simplified characters and easily understood concepts for audiences. Lippmann noted stock characters were used to be easily understood and character actors were recruited to fill stereotypes.


Mlodinow wrote, "In each of these cases our subliminal minds take incomplete data, use context or other cues to complete the picture, make educated guesses, and produce a result that is sometimes accurate, sometimes not, but always convincing. Our minds also fill in the blanks when we judge people, and a person's category membership is part of the data we use to do that." (Page 152)


Mlodinow described how psychologist Henri Tajfel was behind the realization that perceptual biases of categorization lie at the root of prejudice. Tajfel was behind the line length studies that support his hypothesis. Tajfel was a Polish Jew captured in France in World War II. He knew a Frenchman would be treated as an enemy by the Nazis while a French Jew would be treated as an animal and a Polish Jew would be killed.


He knew that how he would be treated was entirely determined by the category he was placed in. Being a Polish Jew was a guarantee of death, so he impersonated a French Jew and was liberated in 1945. Mlodinow wrote, "According to the psychologist William Peter Robinson, today's theoretical understanding of those subjects 'can almost without exception be traced back to Tajfel's theorizing and direct research intervention.'" (Page 153)

Mlodinow wrote, "The challenge is not how to stop categorizing but how to become aware of when we do it in ways that prevent us from being able to see individual people for who they really are." (Page 157) Contrast this with:

"At this point you’ve probably figured out that if your goal is to get someone to process new information and think critically about stereotypes (like [that] atheists are criminals or they should die) the absolute last thing we want is for them to feel threatened or attacked. The worst part about this is if you combine the process I just described with the sorts of negative emotional responses triggered by stereotypes and other biases, you can see that someone, if they’re all stressed by their perception of you…you’ve lost them, they’re not going to be able to listen. And you’ve additionally probably just given them a fun bad new memory to hang onto. " Nicole Currivan



Mlodinow wrote, "The stronger the threat to feeling good about yourself, it seems, the greater the tendency to view reality through a distortion lens."
(Page 197) Compare to "First, as fun as some of you may think it is to attack and argue and ridicule people, just be aware that that will legitimately slam the door to rational understanding–of any point you have. And if you can’t call the discussion you’re having calm and rational, you are in serious danger of indulging your own emotional satisfaction to the point where you’re reinforcing someone’s distrust in all of us. And starting with the premise that someone needs to change or the inherent assumption that “I know more than you” will definitely create a strong stress response and pushback as well. Something we all inherently know but we do it anyway." Nicole Currivan


"Second, if you want to reduce stigma, it’s essential to reduce limbic system activation as much as possible whenever you’re talking to somebody. Any kind of threat, real or perceived, in the current moment or even just something they remember, something bad that they remember about the stigma that’s on the person they’re talking with, can shut down their ability to take in new information. And shuts down possibility for change. " Nicole Currivan



"As the psychologist Jonathan Haidt put it, there are two ways to get at the truth: the way of the scientist and the way of the lawyer. Scientists gather evidence, look for regularities, form theories explaining their observations, and test them. Attorneys begin with a conclusion they want to convince others of and then seek evidence that supports it, while also attempting to discredit evidence that doesn't. The human mind is designed to be both a scientist and an attorney, both a conscious seeker of objective truth and an unconscious, impassioned advocate for what we want to believe. Together these approaches vie to create our worldview." (Page 200)


Mlodinow went on, "As it turns out, the brain is a decent scientist but an absolutely outstanding lawyer. The result is that in the struggle to fashion a coherent, convincing view of ourselves and the rest of the world, it is the impassioned advocate that usually wins over the truth seeker." (Page 201)


Mlodinow described how we combine partial perception and filled-in blanks with self-approving illusions. We give ourselves the benefit of the doubt unconsciously, over and over, in hundreds of tiny ways without conscious awareness. Then our conscious mind innocently looks at the distorted final product and sees a seemingly perfect, consistent and logical representation of reality as memories, with no clue it's anything but a pure recording of the past.


Mlodinow described how psychologists call this kind of thought "motivated reasoning." He explained that ambiguity makes it easy. Lots of things that we sense aren't perfectly and absolutely clear. We can acknowledge some degree of reality yet, somewhat reasonably, interpret unclear things in ways that give ourselves every benefit of the doubt. We can do the same for allies, particularly in comparison to our enemies. When the evidence is unclear, we can see in-group members as good and out-group members as bad. We can set standards extremely high for accepting negative evidence against ourselves and our groups, or extremely low for accepting negative evidence against out-groups. We can act reasonable about it, but really we are using how we feel about beliefs to determine our acceptance of those beliefs, substituting comfort with a belief for established proof.


Mlodinow described how ambiguity helps us to understand stereotypes for people we don't know well and be overly positive in looking at ourselves. He described studies and experiments that strongly support the idea we are incorporating bias in our decisions unknowingly.


Crucially Mlodinow added, "Because motivated reasoning is unconscious, people's claims that they are unaffected by bias or self-interest can be sincere, even as they make decisions that are in reality self-serving." (Page 205)


Mlodinow described how recent brain scans show our emotions are tied up in motivated reasoning. The parts of the brain that are active in emotional decisions are used when motivated reasoning occurs, and we can't in any easy way divorce ourselves from that human nature.


Numerous studies have shown we set impossibly high standards for evidence that would disconfirm our beliefs, particularly deeply held emotional beliefs like religious and political ones, and set trivially low standards for evidence that confirms them.


We also find fallacies or weaknesses in arguments, claims and sources of information we disagree with, while dropping those standards when the information supports our positions. It's so natural we often don't see it in ourselves but sharply see it in people with opposite beliefs. They look biased and frankly dimwitted. But they aren't alone in this.


We see ourselves as rational, forming conclusions from patterns of evidence and sound reason like scientists, but really we have more lawyer in us: we start with conclusions that favor us and our current beliefs, feelings, attitudes and behaviors, then work to find a rational and coherent story to support them.

Mlodinow ended his book: "We choose the facts that we want to believe. We also choose our friends, lovers, and spouses not just because of the way we perceive them but because of the way they perceive us. Unlike phenomena in physics, in life, events can often obey one theory or another, and what actually happens can depend largely upon which theory we choose to believe. It is a gift of the human mind to be extraordinarily open to accepting the theory of ourselves that pushes us in the direction of survival, and even happiness." (Page 218)


So, between our prejudices, unseen biases and tendency to shut down critical thinking, is there any hope? Crazy as it sounds, we might actually want to have interactions with other human beings that are more than insults and closed-minded exchanges of motivated reasoning.

In 2008 computer programmer Paul Graham wrote How To Disagree, and I am going to quote an excerpt. I think it is a useful reference and starting point for many discussions. Scientology and other cults encourage poor critical thinking, failure to recognize the individual qualities of people and things within categories, and emotionally charged reactions to information. We all at times can get sidetracked from addressing main or relevant points to go off on tangents.


So, a guideline for staying on topic or on point is useful, if several people read it and understand it. Tactics that trigger stronger emotions often impair good judgement, impair the ability to see individual people, situations and circumstances as they are with their many subtle differences and unique details, and even impair the ability to receive information and remember it accurately.


If the behavior that cults like Scientology encourage, behavior that lowers the level of thinking, can be identified and avoided, thus making debates and discussions more rational, wouldn't that be a good thing?


Now I personally think that name-calling, ad hominem attacks, tu quoque (meeting criticism with criticism), the genetic fallacy (addressing the genesis or source of a claim rather than the claim), appeal to authority (treating a source as so infallible that a claim they support is treated as proven by their support without examination), glittering generalities or hasty generalities or red herring fallacies should be discouraged.


As none of us are perfect, they will pop up. When one does, and someone tells me so, I SHOULD pause and consider the claim that I have used a fallacy. Sometimes I will recognize it and can restate a claim or withdraw it. This doesn't mean a claim by the other person "wins" or is proven true. Thinking your claim is proven by the use of a fallacy in a claim opposing yours is the fallacy fallacy. I didn't make that up.


If I say "the earth is flat" and you say "only an idiot would say that," you just used a fallacy in your claim, but it is irrelevant to mine. By the way, I believe in a roughly globe-shaped earth. So, we can acknowledge the presence of a fallacy in a claim and not treat it as life threatening or world shattering. Just because someone uses a fallacy doesn't mean their overall concept is wrong; sometimes it is and sometimes it isn't. Additionally, sometimes a person will claim a fallacy is being used when it is not. You can point this out.


I have many times posted articles critical of Scientology, for example, and received insults in response. I have asked, many times, that the person address my claims rather than use ad hominem attacks. Curiously, the people defending Scientology often tell me I am the one using ad hominem attacks and they are not. The great thing about the internet is you can re-read comments and SEE what is there.


Often I post something saying Scientology is a harmful fraud, for example. Someone responds "only an idiot who never actually did anything in Scientology would ever say that because they lack the spiritual awareness to even comprehend Scientology." Okay.


I respond by telling them I was in Scientology for twenty five years, so their claim is false. They respond "then you must have not REALLY done Scientology! Because it ALWAYS works when done properly!"


So, they are saying "if Scientology gets the results you want it is genuine, but if it fails it is something else," and they have married their claims to unproven assumptions about me. Those assumptions are what help turn down and hold down their critical thinking. And that is the barrier to beneficial communication.


You can call me any insult you can think up, and people certainly have. Go ahead, knock yourself out. It doesn't make the insults true. Not even a little bit. It also doesn't leave you in great shape for thinking.


Scientology gets people to think and feel in extremes. Extreme hate, extreme arrogance, extreme condescension, extreme disgust. These are not useful for critical thinking.


For a Scientologist or Independent Scientologist who chooses to engage with people who criticize Scientology, I have a question: why can't you respond to criticism without using fallacies? The mind reading of "knowing my thoughts and motives and beliefs" without evidence is a fallacy. Scientology encourages people to assume that critics have certain beliefs, motives and feelings, but deciding without good evidence that a critic or person who disagrees with you thinks or feels something is making an unfalsifiable claim. It cannot be tested; it cannot be proven or disproven. It is a red herring and a waste of time. If you believe it, it is a way to accept stereotypes, to see people who disagree as broad categories, and to reduce your own critical thinking. It may be emotionally comforting, but it is terrible for critical thinking. You shouldn't rely on an assumption of knowing what someone else thinks. It's just not good logic.


We all do it. Human nature includes estimating the beliefs, feelings and behavior of other people. A large part of our thinking is devoted to it. But it is not a scientific approach; it is instinctive, largely unconscious, influenced by biases and highly inaccurate. So it is best set aside in favor of careful examination of the ideas under discussion. Whatever I guess you think is irrelevant to our conversation, unless that is the actual topic.


I dug up all this stuff for Scientologists who are going to keep on defending Scientology and resorting to the fallacy-laden tactics Ron Hubbard required in Scientology doctrine. Everyone who was in Scientology for a long time and is familiar with a few dozen fallacies can see that Hubbard required them and frequently used them himself. Scientology is packed with generalities both glittering and hasty, appeal to authority regarding Hubbard, the genetic fallacy regarding critics and on and on. Scientology is set up to get you to think in those fallacies so you can never escape it.


And by getting members to become so emotional when criticism is detected, or even possible, it is intended to make any evidence against Scientology irrelevant because there is no critical thinking engaged to receive let alone examine it. Thousands and thousands of people have collected good arguments and evidence against Scientology to present to loved ones in Scientology only for it to prove worthless because they had no receptive audience to talk to.


Remember, motivated reasoning is largely unconscious. We don't know when we are doing it. We can know we are more likely to do it if we get in a mode of using personal attacks and other fallacies. We can try to discipline ourselves to avoid insults, deciding the feelings and thoughts of other people, using stereotypes, and getting sidetracked in conversation. If a person starts with a central point, then we can try to stick with that and not pivot off based on our feelings. If someone brings up criticism of someone or something I like or support, for example, and I feel uncomfortable, then solving that discomfort by pivoting to criticism of them or of someone or something else is poor critical thinking. If I say, for example, that Ron Hubbard was convicted of fraud in France, and you say a different court case with different people decades later was overturned, that is completely irrelevant and we should both know it.


Everyone is free to behave however they choose. Plenty of people are going to insult you if you say much at all. Plenty will make up their unwarranted conclusions about your behavior, thoughts and feelings. That is life. It doesn't mean they are right. If they say it over and over and write it over and over it doesn't make it true.


If you want the opportunity to use your most logical, most rational thinking, I recommend you remember that what they are doing is lowering their own rationality. They are diminishing their own thought. I can't pretend anyone can magically transform into a Vulcan like Spock and instantly become a perfectly logical being. That is not the point at all.


The point is we all can try to consider these ideas, and if we believe it is likely true that resorting to personal attacks and similar measures reduces or shuts down critical thinking, then we can strive to notice when we do these things ourselves and try to reduce them.


One last thing about critical thinking. Occasionally I see someone attack me or Chris Shelton because we recommend critical thinking and write about it. It is not a magical transformation or religious practice. It is an approach to thinking in which you seek to improve the quality of your thinking. That is it. It doesn't mean you have achieved an enlightened state or higher awareness. It is just making an effort to fuck up less regarding thinking. So, if you can find any errors or flaws in the thinking, beliefs or conduct of someone who encourages critical thinking and you think "Aha! I caught you! You are not perfect! So I can criticize you any way I want because you are a liar and a hypocrite!" you have missed the whole point of critical thinking. It is recognizing our flawed nature and trying to succeed despite it, not pretending to have overcome it.








March 2008
The web is turning writing into a conversation. Twenty years ago, writers wrote and readers read. The web lets readers respond, and increasingly they do—in comment threads, on forums, and in their own blog posts.


Many who respond to something disagree with it. That's to be expected. Agreeing tends to motivate people less than disagreeing. And when you agree there's less to say. You could expand on something the author said, but he has probably already explored the most interesting implications. When you disagree you're entering territory he may not have explored.
The result is there's a lot more disagreeing going on, especially measured by the word. That doesn't mean people are getting angrier. The structural change in the way we communicate is enough to account for it. But though it's not anger that's driving the increase in disagreement, there's a danger that the increase in disagreement will make people angrier. Particularly online, where it's easy to say things you'd never say face to face.


If we're all going to be disagreeing more, we should be careful to do it well. What does it mean to disagree well? Most readers can tell the difference between mere name-calling and a carefully reasoned refutation, but I think it would help to put names on the intermediate stages.

So here's an attempt at a disagreement hierarchy:

DH0. Name-calling.


This is the lowest form of disagreement, and probably also the most common. We've all seen comments like this:
u r a fag!!!!!!!!!!
But it's important to realize that more articulate name-calling has just as little weight. A comment like
The author is a self-important dilettante.
is really nothing more than a pretentious version of "u r a fag."


DH1. Ad Hominem.


An ad hominem attack is not quite as weak as mere name-calling. It might actually carry some weight. For example, if a senator wrote an article saying senators' salaries should be increased, one could respond:
Of course he would say that. He's a senator.

This wouldn't refute the author's argument, but it may at least be relevant to the case. It's still a very weak form of disagreement, though. If there's something wrong with the senator's argument, you should say what it is; and if there isn't, what difference does it make that he's a senator?


Saying that an author lacks the authority to write about a topic is a variant of ad hominem—and a particularly useless sort, because good ideas often come from outsiders. The question is whether the author is correct or not. If his lack of authority caused him to make mistakes, point those out. And if it didn't, it's not a problem.


DH2. Responding to Tone.


The next level up we start to see responses to the writing, rather than the writer. The lowest form of these is to disagree with the author's tone. E.g.
I can't believe the author dismisses intelligent design in such a cavalier fashion.
Though better than attacking the author, this is still a weak form of disagreement. It matters much more whether the author is wrong or right than what his tone is. Especially since tone is so hard to judge. Someone who has a chip on their shoulder about some topic might be offended by a tone that to other readers seemed neutral.
So if the worst thing you can say about something is to criticize its tone, you're not saying much. Is the author flippant, but correct? Better that than grave and wrong. And if the author is incorrect somewhere, say where.

DH3. Contradiction.


In this stage we finally get responses to what was said, rather than how or by whom. The lowest form of response to an argument is simply to state the opposing case, with little or no supporting evidence.


This is often combined with DH2 statements, as in:
I can't believe the author dismisses intelligent design in such a cavalier fashion. Intelligent design is a legitimate scientific theory.


Contradiction can sometimes have some weight. Sometimes merely seeing the opposing case stated explicitly is enough to see that it's right. But usually evidence will help.


DH4. Counterargument.


At level 4 we reach the first form of convincing disagreement: counterargument. Forms up to this point can usually be ignored as proving nothing. Counterargument might prove something. The problem is, it's hard to say exactly what.


Counterargument is contradiction plus reasoning and/or evidence. When aimed squarely at the original argument, it can be convincing. But unfortunately it's common for counterarguments to be aimed at something slightly different. More often than not, two people arguing passionately about something are actually arguing about two different things. Sometimes they even agree with one another, but are so caught up in their squabble they don't realize it.


There could be a legitimate reason for arguing against something slightly different from what the original author said: when you feel they missed the heart of the matter. But when you do that, you should say explicitly you're doing it.


DH5. Refutation.

The most convincing form of disagreement is refutation. It's also the rarest, because it's the most work. Indeed, the disagreement hierarchy forms a kind of pyramid, in the sense that the higher you go the fewer instances you find.
To refute someone you probably have to quote them. You have to find a "smoking gun," a passage in whatever you disagree with that you feel is mistaken, and then explain why it's mistaken. If you can't find an actual quote to disagree with, you may be arguing with a straw man.


While refutation generally entails quoting, quoting doesn't necessarily imply refutation. Some writers quote parts of things they disagree with to give the appearance of legitimate refutation, then follow with a response as low as DH3 or even DH0.

DH6. Refuting the Central Point.


The force of a refutation depends on what you refute. The most powerful form of disagreement is to refute someone's central point.


Even as high as DH5 we still sometimes see deliberate dishonesty, as when someone picks out minor points of an argument and refutes those. Sometimes the spirit in which this is done makes it more of a sophisticated form of ad hominem than actual refutation. For example, correcting someone's grammar, or harping on minor mistakes in names or numbers. Unless the opposing argument actually depends on such things, the only purpose of correcting them is to discredit one's opponent.
Truly refuting something requires one to refute its central point, or at least one of them. And that means one has to commit explicitly to what the central point is. So a truly effective refutation would look like:
The author's main point seems to be x. As he says:
<quotation>
But this is wrong for the following reasons...
The quotation you point out as mistaken need not be the actual statement of the author's main point. It's enough to refute something it depends upon.


What It Means

Now we have a way of classifying forms of disagreement. What good is it? One thing the disagreement hierarchy doesn't give us is a way of picking a winner. DH levels merely describe the form of a statement, not whether it's correct. A DH6 response could still be completely mistaken.


But while DH levels don't set a lower bound on the convincingness of a reply, they do set an upper bound. A DH6 response might be unconvincing, but a DH2 or lower response is always unconvincing.




The most obvious advantage of classifying the forms of disagreement is that it will help people to evaluate what they read. In particular, it will help them to see through intellectually dishonest arguments. An eloquent speaker or writer can give the impression of vanquishing an opponent merely by using forceful words. In fact that is probably the defining quality of a demagogue. By giving names to the different forms of disagreement, we give critical readers a pin for popping such balloons.


Such labels may help writers too. Most intellectual dishonesty is unintentional. Someone arguing against the tone of something he disagrees with may believe he's really saying something. Zooming out and seeing his current position on the disagreement hierarchy may inspire him to try moving up to counterargument or refutation.



But the greatest benefit of disagreeing well is not just that it will make conversations better, but that it will make the people who have them happier. If you study conversations, you find there is a lot more meanness down in DH1 than up in DH6. You don't have to be mean when you have a real point to make. In fact, you don't want to. If you have something real to say, being mean just gets in the way.

If moving up the disagreement hierarchy makes people less mean, that will make most of them happier. Most people don't really enjoy being mean; they do it because they can't help it.
By Paul Graham


Thanks to Trevor Blackwell and Jessica Livingston for reading drafts of this.
The hierarchy of disagreement is a concept proposed by computer scientist Paul Graham in his 2008 essay How to Disagree.[1]

Graham's hierarchy has seven levels, from name-calling to refuting the central point. According to Graham, most disagreements fall into one of seven levels:
1: Refuting the central point (explicitly refutes the central point).
2: Refutation (finds the mistake and explains why it's mistaken using quotes).
3: Counterargument (contradicts and then backs it up with reasoning and/or supporting evidence).
4: Contradiction (states the opposing case with little or no supporting evidence).
5: Responding to tone (criticizes the tone of the writing without addressing the substance of the argument.)
6: Ad Hominem (attacks the characteristics or authority of the writer without addressing the substance of the argument).
7: Name-calling (sounds something like, "You are an idiot.").
References from RationalWiki
 

I told you I was trouble

Suspended animation
Give it a break MB ... you don't post (here) with true respect. You were asked very nicely many times by quite a few people when you first arrived on ESMB (and had only just left the cult) to join in with the chat and consider posting style. You do tend to post as if you are lecturing to the great unwashed, even our mod at the time told you to stop it ... your response was to fire back, write nasty posts about ESMB elsewhere and add a snotty signature line.




 

mockingbird

Silver Meritorious Patron
Yes, I like technical details BUT a summary is needed. :)
This is tough. In coming out of Scientology I came to believe in a profound difference between having an opinion and having an educated opinion. Scientology used appeal to authority and the alleged expertise of Hubbard to support his claims.

In looking at the psychology of belief and rhetoric I came to believe that most of the beliefs we hold are not really formed by careful analysis of evidence and arguments. Instead we usually believe things if they don't challenge our identity or existing beliefs, if we are indoctrinated with them as children and maintain them through adolescence with our peers, and then, as adults, if people we consider like ourselves believe something, we generally do as well.

In leaving Scientology I ended up reading a lot about the history of experiments in psychology: which were good or bad experiments, and which had variations or follow-up experiments to verify, falsify or clarify their findings. So, I took on the mindset that I had to examine a lot of evidence and get enough education to really understand something before supporting a belief. The beliefs we hold about things too complex for most of us to understand are not the same as beliefs about things we can personally verify.

We can know that two plus two makes four by taking two items, adding two more and counting the four items. But I don't have enough education to evaluate the truth of claims about hundreds of things like the speed of light, the size of the universe, the age of the universe and much more. Now, usually people just believe the ideas that their peers believe, but unless they have a lot of education they are no more qualified to know these things than I am.

So, I try to present enough evidence for most of the ideas, especially scientific ones, for someone to evaluate them. That way you don't have to rely on my opinion; you can look at the evidence and, if you are interested, look into it further.

If I just have my opinions as summaries that would be less helpful in my opinion than presenting supporting evidence.

I have read dozens of books on psychology, neuroscience, cults and related topics.

I have found that books that only give conclusions and opinions as summaries are essentially useless to me.
 

freethinker

Sponsor
It seems like you are picking a scab and hoping that one day when you do, it won't bleed this time. Try dropping the whole subject for a long period of time to let the wound heal. If you still want to pick the scab after that, I don't know what to tell you, but you'll be spending a lifetime doing this.

If you want to know why you fell into Scientology, it's because you believed him. If you want to know why you believed him, it's because you wanted to.

Then one day you didn't want to anymore.
 