Monday, February 25, 2008

Mind Control - The BITE Model

From chapter two of Releasing the Bonds: Empowering People to Think for Themselves*
*© 2000 by Steven Hassan; published by Freedom of Mind Press, Somerville MA
Destructive mind control can be understood in terms of four basic components, which form the acronym BITE:
I. Behavior Control
II. Information Control
III. Thought Control
IV. Emotional Control
It is important to understand that destructive mind control can be determined when the overall effect of these four components promotes dependency and obedience to some leader or cause. It is not necessary for every single item on the list to be present. Mind controlled cult members can live in their own apartments, have nine-to-five jobs, be married with children, and still be unable to think for themselves and act independently.

I. Behavior Control

1) Regulation of individual's physical reality
a) Where, how, and with whom the member lives and associates
b) What clothes, colors, hairstyles the person wears
c) What food the person eats, drinks, adopts, and rejects
d) How much sleep the person is able to have
e) Financial dependence
f) Little or no time spent on leisure, entertainment, vacations

2) Major time commitment required for indoctrination sessions and group rituals
3) Need to ask permission for major decisions
4) Need to report thoughts, feelings and activities to superiors
5) Rewards and punishments (behavior modification techniques: positive and negative)
6) Individualism discouraged; group think prevails
7) Rigid rules and regulations
8) Need for obedience and dependency

II. Information Control

1) Use of deception
a) Deliberately holding back information
b) Distorting information to make it acceptable
c) Outright lying

2) Access to non-cult sources of information minimized or discouraged
a) Books, articles, newspapers, magazines, TV, radio
b) Critical information
c) Former members
d) Keep members so busy they don't have time to think

3) Compartmentalization of information; Outsider vs. Insider doctrines
a) Information is not freely accessible
b) Information varies at different levels and missions within pyramid
c) Leadership decides who "needs to know" what

4) Spying on other members is encouraged

a) Pairing up with "buddy" system to monitor and control
b) Reporting deviant thoughts, feelings, and actions to leadership

5) Extensive use of cult generated information and propaganda
a) Newsletters, magazines, journals, audio tapes, videotapes, etc.
b) Misquotations, statements taken out of context from non-cult sources

6) Unethical use of confession
a) Information about "sins" used to abolish identity boundaries
b) Past "sins" used to manipulate and control; no forgiveness or absolution

III. Thought Control

1) Need to internalize the group's doctrine as "Truth"
a) Map = Reality
b) Black and White thinking
c) Good vs. evil
d) Us vs. them (inside vs. outside)

2) Adopt "loaded" language (characterized by "thought-terminating clichés"). Words are the tools we use to think with. These "special" words constrict rather than expand understanding. They function to reduce complexities of experience into trite, platitudinous "buzz words".
3) Only "good" and "proper" thoughts are encouraged.
4) Thought-stopping techniques (to shut down "reality testing" by stopping "negative" thoughts and allowing only "good" thoughts); rejection of rational analysis, critical thinking, constructive criticism.
a) Denial, rationalization, justification, wishful thinking
b) Chanting
c) Meditating
d) Praying
e) Speaking in "tongues"
f) Singing or humming

5) No critical questions about leader, doctrine, or policy seen as legitimate
6) No alternative belief systems viewed as legitimate, good, or useful

IV. Emotional Control

1) Manipulate and narrow the range of a person's feelings.
2) Make the person feel like if there are ever any problems it is always their fault, never the leader's or the group's.
3) Excessive use of guilt

a) Identity guilt

i) Who you are (not living up to your potential)
ii) Your family
iii) Your past
iv) Your affiliations
v) Your thoughts, feelings, actions

b) Social guilt
c) Historical guilt

4) Excessive use of fear
a) Fear of thinking independently
b) Fear of the "outside" world
c) Fear of enemies
d) Fear of losing one's "salvation"
e) Fear of leaving the group or being shunned by group
f) Fear of disapproval

5) Extremes of emotional highs and lows.
6) Ritual and often public confession of "sins".
7) Phobia indoctrination: programming of irrational fears of ever leaving the group or even questioning the leader's authority. The person under mind control cannot visualize a positive, fulfilled future without being in the group.
a) No happiness or fulfillment "outside" of the group
b) Terrible consequences will take place if you leave: "hell"; "demon possession"; "incurable diseases"; "accidents"; "suicide"; "insanity"; "10,000 reincarnations"; etc.
c) Shunning of leave takers. Fear of being rejected by friends, peers, and family.
d) Never a legitimate reason to leave. From the group's perspective, people who leave are "weak," "undisciplined," "unspiritual," "worldly," "brainwashed by family or counselors," or seduced by money, sex, and rock and roll.

Sunday, February 24, 2008

Definition and Explanation of the Word “Cult”

A cult can be either a sharply-bounded social group or a diffusely-bounded social movement held together through shared commitment to a charismatic leader. It upholds a transcendent belief system (often but not always religious in nature) that includes a call for a personal transformation. It also requires a high level of personal commitment from its members in words and deeds.

This definition is not meant to be evaluative in the sense of implying that a group is good, bad, benign, or harmful. Rather it is meant to convey a systemic view of such a group, which is comprised of a charismatic relationship, a promise of fulfillment, and a methodology by which to achieve it.

Cults differ in their specific ruling ideologies and in their specific requirements, practices, and behaviors; a single group may even differ over its lifetime or across different locations. These groups exist on a continuum of influence (regarding a particular group’s effect on its members and on society, and vice versa) and a continuum of control (from less invasive to all-encompassing).

Cults can be distinguished from other non-mainstream groups—for example, religious or political sects, fringe or alternative groups or movements, communes and intentional communities—because of their intense ideologies and their demand for total commitment from at least some of the members. Each group must be observed and judged on its own merits and its own practices and behaviors as to whether it falls within this category type, which is not meant to be dismissive or one-sided.

Cults are frequently totalistic and separatist. Some cults are totalistic when they are exclusive in their ideology (sacred, the only way) and impose upon their members systems of social control that are confining and all-inclusive (encompassing all aspects of life). Some cults are separatist when they promote withdrawal from the larger society.

People in such cults tend to

  1. Espouse an all-encompassing belief system
  2. Exhibit excessive devotion to and dependency on their “perfect” leader
  3. Avoid criticism of the group, its leader(s), and its practices
  4. Have an attitude of disdain for non-members

Frequently, the totalistic and separatist features of some cults make them appear alien and threatening, and those features have attracted great attention in the mass media.

This is drawn from Bounded Choice: True Believers and Charismatic Cults by Janja Lalich (University of California Press, 2004). Copyright 2004.

How Cults Manipulate a Person's Thinking and Behavior

Many people who have been subjected to psychological manipulation and control selectively deny aspects of their experience. Some become angry and resistant at the mention of mind control, thought reform, or brainwashing, thinking these things could not possibly have been done to them. It is very threatening to a person's sense of self to contemplate having been controlled or taken over. The terms themselves sound harsh and unreal. Yet only by confronting the reality of psychological manipulation can someone who has had such an experience overcome its effects.

Deceptive psychological and social manipulation are part and parcel of the totalist experience. Over the years various labels have been used to describe this systematic process. Psychiatrist Robert Jay Lifton first used the term thought reform in the 1950s to describe the behavioral change processes he observed and studied at the revolutionary universities in Communist China and in prisoners of war during the Korean War. Lifton outlined the psychological techniques used to impose what he calls a state of "ideological totalism": the process of the coming together of the individual self and certain ideas, or the melding of the individual with a particular set of beliefs. Through his research, Lifton came to the conclusion that within each person there is the potential for an all-or-nothing emotional alignment. The process of combining this human tendency with an all-or-nothing ideology (usually about one's relationship to the world and sometimes to the spiritual realm) results in totalism. It's a rather surefire formula: immoderate individual character traits plus an immoderate ideology equals totalism, a world of extremes. And, says Lifton, "where totalism exists, a religion, a political movement, or even a scientific organization becomes little more than an exclusive cult."

Lifton identified eight psychological themes, now widely used as the criteria for evaluating whether or not a particular group uses thought reform. The more these themes are present, the more restrictive the group or system and the more effective the thought-reform program. Each theme requires central control and sets off a predictable cycle:

  1. The theme sets the stage.
  2. The rationale for the theme is based on an absolute belief or philosophy.
  3. Because of the extreme belief system, a person within this setting has a conflicting and polarized reaction, and is forced to make a choice.
  4. Enveloped in such a totalistic environment, most individuals will make totalistic choices.

The outcome of this psychological interplay is thought reform - that is, the person is changed.

Eight Psychological Criteria for a Thought-Reform System

This list is based on the work of Robert Jay Lifton, M.D., author of Thought Reform and the Psychology of Totalism (W.W. Norton, 1961).

  • MILIEU CONTROL: The group controls all communication and information, which includes the individual’s communication with himself. This sets up what Lifton calls "personal closure," meaning that the person no longer has to carry on inner struggles about what is true or real. Essentially, this prevents any time being spent on doubts.
  • MYSTICAL MANIPULATION: There is a claim of authority (divine, supernatural, or otherwise), which allows for the rationale that the end justifies the means, since the end is directed by a higher purpose. Certain experiences are orchestrated to make it seem as though they occur spontaneously. The person is required to subordinate herself or himself to the group or cause, and stops all questioning, for who can question a "higher purpose"? Self-expression and independent action wither away.
  • DEMAND FOR PURITY: The system puts forth a black-and-white world view with the leader as the ultimate moral arbiter. This creates a world of guilt and shame, where punishment and humiliation are expected. It also sets up an environment of spying and reporting on one another. Through submission to the powerful lever of guilt, the individual loses his or her personal sense of morality.
  • CULT OF CONFESSION: First one, then many acts of surrender, of total exposure are necessary. The individual is now owned by the group. The person no longer has a sense of balance between worth and humility, and there is a loss of boundaries between what is secret (known only to the inner self) and what is known by the group.
  • SACRED SCIENCE: The group’s doctrine is seen as the Ultimate Truth. Questions or challenges are not allowed. This reinforces personal closure and inhibits individual thought, creative self-expression, and personal development. Experience can be perceived only through the filter of the dogmatic belief system or ideological trappings.
  • LOADING THE LANGUAGE: There is jargon internal to and understood by only the group. Constricting language constricts the person. Capacities for thinking and feeling are significantly reduced. Imagination is no longer a part of the person’s actual life experiences; the mind atrophies from disuse.
  • DOCTRINE OVER PERSON: Denial of self and of any perception other than the group’s is required. There is no longer such a thing as personal reality, or a self separate from the group. The past, both society’s and the individual’s, is altered to fit the needs of the doctrine. Thus, the individual is remolded, the cult personality emerges, and the person’s sense of integrity is lost.
  • DISPENSING OF EXISTENCE: The group is the ultimate arbiter, and all nonbelievers are considered evil, or non-people. If non-people cannot be recruited, they can be punished, even killed. This creates an us-versus-them mentality and breeds fear in the individual, who sees that his or her own life depends on a willingness to obey. Here is found the merger of the individual with the belief.

Adapted from Take Back Your Life by Janja Lalich & Madeleine Tobias (Bay Tree Publishing, 2006).

Saturday, February 23, 2008

Thought Reform Exists: Organized, Programmatic Influence

("Thought Reform" throughout this article can be read as synonymous with "Brainwashing" & "Coercive Persuasion".)

Margaret Thaler Singer, Ph.D.

Recently, cult apologists have attempted to create the impression that the concept of thought reform has been rejected by the scientific community. This is untrue.

As recently as May of this year, the new Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) published by the American Psychiatric Association cites thought reform as a contributing factor to "Dissociative Disorder Not Otherwise Specified" (a diagnosis frequently given to former cult members). Thought reform (notes 1, 2, 3 below) and its synonyms brainwashing and coercive persuasion (4, 5) were also noted in DSM-III (1980) and in DSM-III-R (1987), as well as in widely recognized medical texts (6, 7).

Thought reform is not mysterious. It is the systematic application of psychological and social influence techniques in an organized, programmatic way within a constructed and managed environment (5,7,8,9,10). The goal is to produce specific attitudinal and behavioral changes. The changes occur incrementally, without it being patently visible to those undergoing the process that their attitudes and behavior are being changed a step at a time according to the plan of those directing the program.

In society there are numerous elaborate attempts to influence attitudes and modify behavior. However, thought reform programs can be distinguished from other social influence efforts because of their totalistic scope and their sequenced phases aimed at destabilizing participants' sense of self, sense of reality, and values. Thought reform programs rely on organized peer pressure, the development of bonds between the leader or trainer and the followers, the control of communication, and the use of a variety of influence techniques. The aim of all this is to promote conformity, compliance, and the adoption of specific attitudes and behaviors desired by the group. Such a program is further characterized by the manipulation of the person's total social environment to stabilize and reinforce the modified behavior and attitude changes. (8,9,10)

Thought reform is accomplished through the use of psychological and environmental control processes that do not depend on physical coercion. Today's thought reform programs are sophisticated, subtle, and insidious, creating a psychological bond that in many ways is far more powerful than gun-at-the-head methods of influence. The effects generally lose their potency when the control processes are lifted or neutralized in some way. That is why most Korean War POWs gave up the content of their prison camp indoctrination programs when they came home, and why many cultists leave their groups if they spend a substantial amount of time away from the group or have an opportunity to discuss their doubts with an intimate (11).

Contrary to popular misconceptions (some intentional on the part of naysayers), a thought reform program does not require physical confinement and does not produce robots. Nor does it permanently capture the allegiance of all those exposed to it. In fact, some persons do not respond at all to the programs, while others retain the contents for varied periods of time. In sum, thought reform should be regarded as "situationally adaptive belief change that is not subtle and is environment-dependent". (8,10)

The current effort by cult apologists to deny that thought reform exists is linked to earlier protective stances toward cults, in which apologists attempted to deny the cults' active and deceptive recruitment practices, deny the massive social, psychological, financial, spiritual, and other controls wielded by cult leaders, and thus dismiss their often destructive consequences.

These earlier efforts to shield cults from criticism rest on a seeker theory of how people get into cults, which overlooks the active and deceptive tactics that most cults use to recruit and retain members. When bad things happened to followers of Jim Jones or David Koresh, the twisted logic of some apologists implied that these "seekers" found what they wanted, thus absolving the cult leader and his conduct.

Finally, to promulgate the myth that thought reform has been rejected by the scientific community, cult apologists doggedly stick to a faulty understanding of the process, insisting, contrary to findings in the literature, that physical coercion and debilitation are necessary for thought reform to occur, and that the effects of thought reform must be instant, massive, uniform, universally responded to, and enduring.

The recent upholding of thought reform in DSM-IV is but one more piece of evidence that this orchestrated process of exploitative psychological manipulation is real and recognized within the professional psychiatric field. To say, then, that the concept of thought reform is rejected by the scientific community is false and irresponsible. The phenomenon has been studied and discussed since 1951, and continuing studies by social psychologists and other behavioral scientists have solidified our understanding of its components and overall impact.

© 1994 M.T. Singer {The Cult Observer, Vol.11, No.6 (1994): 3-4.}

This table is from Cults In Our Midst

Table 3.2. Continuum of Influence and Persuasion

The five columns of the original table run from least to most controlling: Education, Advertising, Propaganda, Indoctrination, and Thought Reform. Each row below gives the five corresponding cells.

Focus of body of knowledge
  Education: Many bodies of knowledge, based on scientific findings in various fields.
  Advertising: Body of knowledge concerns product, competitors; how to sell and influence via legal persuasion.
  Propaganda: Body of knowledge centers on political persuasion of masses of people.
  Indoctrination: Body of knowledge is explicitly designed to inculcate organizational values.
  Thought Reform: Body of knowledge centers on changing people without their knowledge.

Direction & degree of exchange
  Education: Two-way pupil-teacher exchange encouraged.
  Advertising: Exchange can occur, but communication is generally one-sided.
  Propaganda: Some exchange occurs, but communication is generally one-sided.
  Indoctrination: Limited exchange occurs; communication is one-sided.
  Thought Reform: No exchange occurs; communication is one-sided.

Ability to change
  Education: Change occurs as science advances; as students and other scholars offer criticisms; as students and citizens evaluate programs.
  Advertising: Change made by those who pay for it, based on the success of ad programs, by consumer law, and in response to consumer complaints.
  Propaganda: Change based on changing tides in world politics and on political need to promote the group, nation, or international organization.
  Indoctrination: Change made through formal channels, via written suggestions to higher-ups.
  Thought Reform: Change occurs rarely; organization remains fairly rigid; change occurs primarily to improve thought-reform effectiveness.

Structure of persuasion
  Education: Uses teacher-pupil structure; logical thinking encouraged.
  Advertising: Uses an instructional mode to persuade consumer/buyer.
  Propaganda: Takes authoritarian stance to persuade masses.
  Indoctrination: Takes authoritarian & hierarchical stance.
  Thought Reform: Takes authoritarian & hierarchical stance; no full awareness on part of learner.

Type of relationship
  Education: Instruction is time-limited; consensual.
  Advertising: Consumer/buyer can accept or ignore communication.
  Propaganda: Learner support & engrossment expected.
  Indoctrination: Instruction is contractual; consensual.
  Thought Reform: Group attempts to retain people forever.

Deceptiveness
  Education: Is not deceptive.
  Advertising: Can be deceptive, selecting only positive views.
  Propaganda: Can be deceptive, often exaggerated.
  Indoctrination: Is not deceptive.
  Thought Reform: Is deceptive.

Breadth of learning
  Education: Focuses on learning to learn & learning about reality; broad goal is rounded knowledge for development of the individual.
  Advertising: Has a narrow goal of swaying opinion to promote and sell an idea, object, or program; another goal is to enhance seller & possibly buyer.
  Propaganda: Targets large political masses to make them believe a specific view or circumstance is good.
  Indoctrination: Stresses narrow learning for a specific goal; to become something or to train for performance of duties.
  Thought Reform: Individualized target; hidden agenda (you will be changed one step at a time to become deployable to serve leaders).

Tolerance of differences
  Education: Respects differences.
  Advertising: Puts down competition.
  Propaganda: Wants to lessen opposition.
  Indoctrination: Aware of differences.
  Thought Reform: No respect for differences.

Methods
  Education: Instructional techniques.
  Advertising: Mild to heavy persuasion.
  Propaganda: Overt persuasion, sometimes unethical.
  Indoctrination: Disciplinary techniques.
  Thought Reform: Improper and unethical techniques.

  1. Lifton, R.J. (1961). Thought Reform and the Psychology of Totalism. New York: W.W. Norton. (Also: 1993, University of North Carolina Press.)

  2. Lifton, R.J. (1987). Cults: Totalism and civil liberties. In R.J. Lifton, The Future of Immortality and Other Essays for a Nuclear Age. New York: Basic Books.

  3. Lifton, R.J. (1991, February). Cult formation. Harvard Mental Health Letter.

  4. Hunter, E. (1951). Brainwashing in China. New York: Vanguard.

  5. Schein, E.H. (1961). Coercive Persuasion. New York: W. W. Norton.

  6. Singer, M.T. (1987). Group psychodynamics. In R. Berkow (Ed.). Merck Manual, 15th ed. Rahway, NJ: Merck, Sharp, & Dohme.

  7. West, L.J., & Singer, M.T. (1980). Cults, quacks, and nonprofessional psychotherapies. In H.I. Kaplan, A.M. Freedman, & B.J. Sadock (Eds.), Comprehensive Textbook of Psychiatry III, 3245-3258. Baltimore: Williams & Wilkins.

  8. Ofshe, R., & Singer, M.T. (1986). Attacks on peripheral versus central elements of self and the impact of thought reforming techniques. Cultic Studies Journal. 3, 3-24.

  9. Singer, M.T., & Ofshe, R. (1990). Thought reform programs and the production of psychiatric casualties. Psychiatric Annals, 20, 188-193.

  10. Ofshe, R. (1992). Coercive persuasion and attitude change. Encyclopedia of Sociology, Vol. 1, 212-224. New York: Macmillan.

  11. Wright, S. (1987). Leaving Cults: The Dynamics of Defection. Society for the Scientific Study of Religion, Monograph no. 7, Washington, DC.
