Razor’s Edge Indeed: A Deprogrammer’s View of Harmful Cult Activity
This paper is an effort to describe a model of cult behavior that helps the author, a cult interventionist or deprogrammer, to assess a cult as well as to help a client assess what he means by harmful cult activity. The author based his observations on more than two decades of working with ex-members and helping with many hundreds of interventions that addressed a wide variety of cult behaviors. Four facets of cult formation are at the core of the model: Transpersonal attraction, Exclusive leadership, Circular tension, and Exit perils. The paper also offers a model of a closed cult system with more potential for harm contrasted with a more open cult system that incorporates more reasonable and democratic elements and practically eliminates totalist control of leaders and doctrines. In conclusion, the paper argues that a deprogrammer’s goal is to assist the cult member to arrive at eye-level with reality and the leader, thus creating a rational possibility for criticism and choice.
Never use that term to me again.... I am an exit counselor for victims of mind control...
A character named Diamond (actor James Earl Jones) spoke those words in a British film called Signs and Wonders, released in 1994. The film was significant because it was the first time, to my knowledge, that a major film production about cults and deprogramming made the distinction between deprogrammer and exit counselor. Deprogramming, as it appeared in the English language in the 1970s, referred to actions taken to persuade a person to abandon allegiance to a controversial group or cult. The neologism exit counseling appeared in the early 1980s to distinguish a noncoercive, educational approach to cult intervention from the coercive, oppositional deprogramming model.
In the film, a middle-aged housewife and mother, Elizabeth (actor Prunella Scales), comes to America from England to meet Diamond. Elizabeth hired Diamond to convince her estranged daughter to leave a controversial group led by an Asian messiah. While Diamond drives Elizabeth to a hotel, he questions her perception of what he does and asks her who is likely to join a cult. The perplexed mother says that Diamond is a deprogrammer and she blames herself and her husband for not raising their daughter well enough.
The car screeches to a halt as Diamond pulls over to park it by a curb. He orders his client to get out of the vehicle. He stands up to the bewildered woman, inches from her face, to ask her what her people told her that he does. She says that he is a “deprogrammer who gets people out of cults.” Not so clear to her is whether he will resort to kidnapping. Diamond berates her for implying that he is a deprogrammer: “Never use that term to me. I am an exit counselor,” he insists. Then he aggressively explains how he educates people using highly ethical standards. Utter irony prevails here because Diamond’s team has already kidnapped Elizabeth’s daughter, whom they have in a secret location, and he knows it.
You will have to see the film to appreciate the intricate subplots that intertwine Elizabeth’s other concerns for her frustrated Anglican minister husband who drinks too much and her adult son who is infatuated with an egotistical philosopher of deconstruction. My concern here is with Diamond’s dilemma. After one of Diamond’s assistants infiltrates the cult, the team determines that there is no other way but kidnapping to help Elizabeth’s daughter, who is already on her way from the United Kingdom. Diamond’s team abducts the young woman by forcing her into a van from a sidewalk where she was fundraising for the cult.
Once again, we see the old deprogramming stereotype that has dominated films about cult intervention since 1980. Noncoercive interventions are too boring for the film industry and could hurt ratings. In any case, the “ethical” exit counselor Diamond is in an old dilemma: There is no happy alternative to the stereotype if he and his team are to spare the daughter from coming under the direct control of a leader known to have sex with vulnerable female members. And who can resist a wonderful mother who borrowed money to save her daughter? I can sympathize with Diamond because I’ve been in that very situation many times. But Diamond’s chiding of Elizabeth has merit. Most deprogramming interventions, and nearly all since 1992 in America, have been open meetings that the cult member can choose to end and leave at any time.
The confusion regarding deprogramming and exit counseling among the general public extends to a definition of cult. I often hear people use the “c” word as if they know what it means. Upon questioning, the average person entertains a basic notion of cult as a weird, possibly dangerous group with weird rituals and weird people. Less common yet persistent perceptions are associations with Satanism, witchcraft, and demon possession. Dictionaries are of some help, but most indicate that cult is an intense devotional system directed toward a person, idea, or object. That definition covers a wide, somewhat innocuous spectrum of religious activity. The definition also includes “spurious group,” which is how most folks understand the term. A lesser definition alludes to cult as a healing system based on someone’s dogma, or to an alternative treatment, as in shamanic healing. The International Cultic Studies Association adopted a definition that fits its purposes, to describe manipulative groups that harm some or most members:
Cult (totalist type): A group or movement exhibiting a great or excessive devotion or dedication to some person, idea, or thing and employing unethically manipulative techniques of persuasion and control (e.g., isolation from former friends and family, debilitation, use of special methods to heighten suggestibility and subservience, powerful group pressures, information management, suspension of individuality or critical judgment, promotion of total dependency on the group and fear of leaving it, etc.), designed to advance the goals of the group’s leaders, to the actual or possible detriment of members, their families, or the community.
Note that in this paper I may use cult in its lesser definition as adopted by ICSA, although that definition violates the primarily neutral, academic application of the word.
Following, I propose a working model of the totalist cult as the term applies to what I do as a deprogrammer or exit counselor. I often use deprogramming to cover all types of intervention with cult members, including the noncoercive, open sessions that I now use exclusively, and have used especially since 1992. I realize that use of this term might irritate my exit-counselor colleagues who have fought long and hard to set a standard name for noncoercive approaches. Like cult, the term deprogrammer has popular pejorative implications. As the lady who hired Diamond in the film I mentioned above illustrates, most people unfortunately continue to relate to deprogrammer and not to exit counselor, and even much less so to the more obscure thought reform consultant term that a few of my colleagues have adopted.
Unlike established psychotherapies that enjoy the support of university degrees and professional licenses, exit counseling by any name has no specific support or professional validation. Anyone can call himself an exit counselor or deprogrammer because the field has no enforceable professional standards. Nevertheless, anyone serious about exit counseling as a vocation should have an educated interest in social psychology, comparative religion, and family counseling. Perhaps it is ironic that controversial maverick movements attract maverick remedies. Based on my extensive experience in the field since 1980, my hope with this paper is to add to a professional and general understanding of what has developed as the purpose and process of deprogramming. To put this in context, I wish to introduce a four-point model of what I recognize as harmful enough cult activity for me to agree, as a deprogrammer, to intervene.
Facets of potentially harmful cult activity as a closed system include:

Transpersonal attraction
Exclusive leadership
Circular tension
Exit perils

Before I describe what I mean by an unhealthy cult, note that I believe that being in a cult can involve healthy devotional activity. Healthier cults remain open to a wider frame of social moderation and reference. By any definition, a typical Catholic monastery or cloistered order of nuns qualifies as cultic; but as a result of the wider moderation of a less radical authority, these devotional organizations tend toward stability and have checks and balances. Moderation for monasteries flows from a presiding bishop and, in the case of nuns, “outsider” priests as spiritual directors and confessors under the even broader authorities of the Vatican and the surrounding secular government. Healthy monasteries abide by local laws. One could argue the same for healthier Buddhist, Sufi, and Hindu orders based on more intense devotional systems than expressed by the overall religious culture in which these orders occur.
In healthy cults, the members can appeal to authorities outside of the leader to moderate the leader’s power and to dismiss the leader, if necessary. The group’s transcendental purpose or organizing principle is not identified with the leader or totally controlled by the leader for life. For example, in my embattled tradition of Catholicism, the pope, especially since the Vatican II clarifications four decades ago, is no more essentially holy or “saved” than any other Christian. The tradition reveres many “saints in heaven” who were in life persecuted by the very Church they served. I view this as a healthy feature of the Catholic “cult of the pope” because common folks share equally with popes in the transcendent principle as held in the Catholic “cult of the Eucharist.”
In secular society, devotional cults form around sports teams, and these cults have some radical fans. The moderating influence of the team and the surrounding society prevents a radical fan from controlling the team and most of the other fans. In harmful cults that operate within self-sealing or closed systems, the moderating influences fade as effective social and psychological controls over the power and often malignant narcissism of the leaders [see Figure 5 later in this article for the “Cult (healthy type)” description].
The model I propose borrows heavily from many earlier approaches. For references, I recommend Bounded Choice (Lalich, 2004), Them and Us (Deikman, 2003), Thought Reform and the Psychology of Totalism (Lifton, 1989), Cults in Our Midst (Singer with Lalich, 2003), Releasing the Bonds (Hassan, 2000), and Brainwashing (Taylor, 2004). Also, Zablocki’s work on “exit costs” and brainwashing in Misunderstanding Cults (Zablocki and Robbins, 2001) is worthwhile reading. In borrowing from the named authors, I lay no claim to representing their ideas with mine. Each approach stands or falls on its own merits, although all interrelate. My perspective comes from practice and direct observation more than from any model proposed by the social science community.
With my model, I do not impugn every group that has a closed milieu; but history and experience tell me that the more closed a social system becomes, the greater the potential for deceit and abuse of power. In concert, the four elements, or facets, noted above create a matrix or process for some degree of potentially harmful cult activity. Each element is a red flag, so to speak. If all four appear as described, then the red flags should be waving. If there is harm, the degree of harm can be subjective, objective, or both. Subjective harm includes how much an ex-member has lost in perception, perspective, and self-esteem. Objective harm concerns loss in investments, health, relationships, education, and employment. While some former cult members have to start over alone and broke, others have careers and families intact. In every case, what I look for as a deprogrammer before I deign to cut short a true believer’s cult membership is reflected in the following model. Although I pared the facets or elements down to four, they could easily extend to eight or sixteen; but experience with audiences has taught me to economize any definition of cult and to elaborate from there.
A person might be happy or sad, anxious or mellow. He might be seeking the truth or seeking nothing at all. He encounters a philosophy, group, or teacher that suggests or even confirms that his performance, self-concept, or ego needs improvement, purging, or perhaps extinction so that he can better serve a grand purpose. The encounter unnerves him yet makes him curious. He might experience, feel, or dream something mystical that magnifies the transcendent encounter. He feels taken beyond his sense of identity as a person into a transpersonal awareness.
The new path promises a way to eternal salvation, permanent health, great fortune, or ultimate freedom. Possibilities emerge—to control sacred territory, to reduce global warming, or to gain political protection for migrant workers. He becomes intrigued with saving the planet from sin or karmic retribution, tapping latent powers of the mind to heal self and others, or simply to help a personal partner to make his or her dream of any of the above come true.
The seeker soon learns that his self-concept and achievements are not good enough. He must change. He must transcend the limits imposed by society, religion, and family. A call for change is always a risk, but perhaps no more a risk than a refusal to change. We all know that, but what we know to do about it is another matter. Totalist cults seek to exploit our confusion and ambivalence about “what to do” with core insights, intriguing answers, and promising techniques.
To achieve this new and interesting purpose, the seeker realizes that he will have to make changes. The new system or authority figure will help him transcend limitations with a variety of techniques that can include confession and intimate self-revelations, chanting of prayers or mantras, transformative workshops or intensives, submission or contract requirements, outdoor survival games, initiation rituals, ritualistic breathing, trance dancing, fire walking, sweat lodges, secret ordeals, changes in appearance and name, and so on. The idea here is akin to tapping the potential of the caterpillar to accomplish its transformation into a butterfly—the process of changing from an earthbound slug to a free and heavenly angel. But first the seeker must enter the cult’s cocoon.
The cult member in this model is proverbially stuck in the chrysalis stage, seeking to transcend the normal, boring, or limited self. The cult doctrine and leader will tell him that without the effort to transmute the self, his service to the purpose cannot gain power and reach perfection. But who knows how much effort is sufficient? By what evidence is the devotee free, saved, or enlightened? Does anyone in the group ever get to fly? Who really benefits? Is the member a chrysalis, or merely a bug wrapped in a spider’s cocoon? Is he merely fodder for a predatory leader and a parasitic system?
The group may use archetypal models to encourage him. “Jesus and Buddha sacrificed all, even family, to serve, did they not? Jesus gave his very life. What are you doing with yours?” Stories about St. Francis of Assisi, Mother Teresa, and Milarepa underscore the sacrificial life of heroes. Jesus fasted for forty days and Buddha practically starved himself to death before they each acted on a purpose. St. Francis gave up a fortune to better serve his divine inspiration. Milarepa worked his butt off building and rebuilding a stone house many times, for seemingly no reason, to obey his Buddhist guru, who promised to teach him “when the student was ready.”
The group will suggest that members can access the transpersonal and true self, perhaps more quickly, through altered states of consciousness. Drugs or entheogens are useful for some groups, but most never use mind-altering substances. There are better ways to induce a sense of ecstasy. As a rule, cult members function best as deployable agents without drug influence, whether or not the leaders find personal exceptions to this rule. The techniques already mentioned have the secondary effect of creating intimate bonds with the group. Ingeniously, this seemingly spontaneous satori or euphoria evokes appreciation for and devotion to the group and leader that made it possible. Robert J. Lifton called this mystical manipulation.
After an ecstasy (to stand outside oneself or seem out of place), the new devotee looks to the group for interpretation and support. Most cults realize this, so they will link him with a contact person or to the leader’s explanation. He becomes a potential cult recruit if he has an unsettling ecstasy independent of a group or leader. The spiritual experience may seem to come from nowhere, or occur during an individual crisis caused by illness, environmental catastrophe, or even incarceration. A weird epiphany can cause him to seek a system or group for context, to help explain, contain, and sustain the experience. Without context, the ecstasy might feel like a psychotic break.
A cult leader might suggest that the member is not insane but merely having a spiritual emergency, a karmic cleansing, or kundalini experience. However, as with hypnotic suggestions, any resulting insight must be reinforced or it will fade. Reinforcement requires bonds to a sympathetic group. The group recognizes and approves the transpersonal experience, and it offers an opportunity for a purposeful life with that experience. The new member cannot see that he may have just followed a white rabbit down the hole into Wonderland. Once he enters that new, mysterious world, the bonds are reinforced and even tightened through the next three elements.
Charisma attends leaders whether they have one follower or millions. No person intrinsically has more or less charisma; rather, charisma depends on a given relationship. One neurotic woman might experience a high level of charisma with a narcissistic man whom everyone else finds obnoxious. Ten thousand fans at a music concert might feel a charismatic attraction to one lead singer, even as a few fans walk out in disgust. For example, Jim Morrison of The Doors had a dark charisma when I first saw the band appear on a Philadelphia stage in 1967—or was it 1966? The entire audience, as I recall, was transfixed, even mesmerized during the concert. I was, too. The talented band was at the top of its game in those years.
A year or two later, I slept through an entire Doors concert. Admittedly, I was tired from lack of sleep and no lack of wine and cannabis, but I was among thousands of cheering fans, again in Philadelphia. The pounding drums had no effect on me as I slumped in my stadium seat. One of my friends woke me when it was over, and I became the butt of their jokes on the way home. In any case, for me, the band’s charisma had worn off. When I first encountered him, Jim Morrison intrigued me as the dark shaman on stage with his mystical lyrics; but later, he was merely another good entertainer whose stage antics turned stale. His was a cult of personality driven by the ephemeral spirit of the sixties.
The charisma of a guru or transformational group leader is different from that of a music idol because the individual’s purpose is different. For example, gurus with an elitist cult following often require an intense transformation of self. The purpose is to acquire eternal freedom or salvation, and not to provide mere temporal entertainment. However, charisma exists in a relationship of perceived qualities. Although the group or devotee invests the lion’s share of power in the leader, the leader nevertheless depends on responses and cues from his audience to manage that power. A clever narcissist will feed his admirers with what they want to perceive and feel.
If the guru wants to maintain power, he must manage the demands of his cult, and he must perform convincingly. His identity depends on how well he evokes devotion from his cult following. It is a job with intense duties. All living rivals must be devalued. In totalist cults, the demands can be overwhelming, even on the most narcissistic of gurus. Like rock music idols and entertainers, charismatic gurus need breaks from performing for devotees. It is not unusual for well-established gurus to create devotional tension by remaining inaccessible to their cults for long periods.
Once established, charisma acts like a psychic leash on a devotee’s emotional and spiritual life. This powerful link with the leader is one-sided because only one can control the leash. Some in my business call this mind control. A system of managed beliefs, rituals, and regulations sustains the personal connection with the central figure, idea, or object of devotion. Ironically, in cults the devotee often agrees to the leash arrangement because he is convinced that he needs one.
The leader and group will supply the new recruit with an interesting foundation myth that supports the leader’s claims to authority. The foundation myth generally reflects a profound spiritual experience and private journey the leader has taken. As a full-fledged guru, the leader often tells a story of initial reluctance or confusion about taking on his mission. This hesitancy indicates that the leader was not merely naive—he actually applied some skepticism and tested the spirits!
Once enlightened about the calling as a new messiah, prophet, or avatar, the leader exhibits courage against all odds to proclaim his mission. Like Christ or Joan of Arc, the leader experiences barbs of criticism and social persecution. And the leader is often quick to point to that similarity: “They persecuted Jesus, too!” Both the popular press and rival groups can be especially cynical and demeaning about the grandiose claims of cult leaders, thus fulfilling the persecuted god complex.
As to a leader’s magical self-story of enlightenment or entitlement, listeners can take or leave a collection of extraordinary tales that no one can readily prove. If there is a counter history to the leader’s claims, then the group’s job is to deflect any and all disconfirmation. Doing this is not easy if the basis of the irrational claim or the evidence for it comes solely from the leader’s testimony. Thus, the circular ideation of most cult leaders: “If you do not believe me, just ask me.” (Similar justifications based on solipsism come from the devotee: “It is my experience that it is true because I resonate with the leader and the path works for me.”) Whether you can judge a cult leader by his fruits is a loaded question, but it remains the primary avenue for critics. Devotees, in defense, will argue that outsiders cannot grasp the true value of the leader or group experience by what that leader or group appears to do. In the devotee’s mind, the grand purpose is possible, no matter what appearances indicate to outsiders.
We often use phrases like “circle of friends,” “family circle,” and “circle of influence” to describe social contracts and relationships. A gang can be a “ring of thieves.” The implication is that the bond surrounds something or someone. The bond is an organizing principle. We as outsiders sense that the system is closed up within the circle. To enter it, one must join by enduring some kind of initiation. We prepare to climb through levels of improvement and awareness.
I suggest here that participating in a harmful totalist cult is not so much being inside a circle as moving on one. Circular tension is a perversion of that proverbial “razor’s edge” known to seekers familiar with Eastern religion and Somerset Maugham’s 1944 novel by that name. Maugham quotes Hinduism’s Katha-Upanishad (III:14) for the epigraph of The Razor’s Edge: “The sharp edge of a razor is difficult to pass over; thus the wise say the path to Salvation is hard.” Another translation by Sri Aurobindo says it this way: “Arise, awake, find out the great ones and learn of them: for sharp as a razor’s edge, hard to traverse, difficult of going is that path, say the sages.” 
Cults that are of concern to a deprogrammer manipulate dynamic relationships to keep members focused on a dubious purpose. Members do “the work” constantly, sustain rituals of self-improvement, maintain doctrinal thoughts, and strive to serve a transcendent ideal. They never completely understand that ideal (at least not as well as the leader or managers apparently do), but they can certainly approach that goal by doing certain things. In fact, most cults rely on a leader’s progressive exposition of the foundation myth, which has many irrational, and therefore questionable, features.
The initiate’s apparent upward climb to enlightenment or deeper relationship with God soon reaches a plateau. This plateau may have more to do with common human limitations than anything else—there is no evidence that psychic powers are repeatable under test conditions, for example. Veteran cult members and even the leader seem to be no more enlightened at this “advanced” stage of membership, yet the newer member persists because she made a commitment and continues to hope for new gains and revelations. Her brain may have the capacity to question or doubt her spiritual attainment, but a powerful cult suggestion tells her, “Do not go there” or she will fall prey to the exit perils.
The leader is the only one who represents a fully enlightened being, so only the leader can act as a guide to her enlightenment. The catch here is that unless she is enlightened like the leader, she cannot grasp and achieve the full mystery of the revelation, a revelation that the leader changes from time to time to suit his needs anyway. So she continues meditating, chanting, taking workshops, performing seva or services, adjusting her attitude, praying, and adapting to new revelations.
Fear and doubt are the devils that members suppress daily. As my hippie-era colleagues used to say, “You just keep on truckin’!” But the illusion in deceptive cults is that participants are on a path to climb the highest mountain, to move up a ladder to total freedom, or that the path as journey is the truth—they are already there if they are on the way! Just keep going. From the deprogrammer’s perspective, harmful cults tie people to a circular path that seems never ending and yet never quite fulfills the promises.
One might argue that most of life is repetition anyway, with rounds of workdays and social functions, so why should cults be any different? Native American lore says that we live in a hoop, as indicated by the sun, moon, and stars going around us, with seasons coming and going regularly. Major religions all have cycles of devotion and repeat rituals. And New Agers like to point to the “nonlinear” circle of life. The circle of cult life by that standard seems normal and natural. Indeed, cults as devotional circles are natural to human social function, and this aspect may indeed be necessary in the larger scheme of human development. That is my point. Deceptive cults imply that one can break free of this humdrum hoop of a world, but that to do so, one must join an even tighter, more extreme form of hoop as defined by that cult. One cannot be normally or naturally circular. Being intelligently if moderately Jewish, Muslim, Hindu, or Christian is not good enough. Did not Jesus say, “The lukewarm I will vomit from my mouth?” Of course, being moderate has nothing to do with being lukewarm—which is an equivocation—and that is the point of this third element of tight circular regulation or ideation.
Over the decades, I have engaged in thousands of intense discussions with members of hundreds of different cultic groups. Many hundreds of these discussions ranged over a period of days during interventions, with family members present. Most cult members I meet are educated, have good reading abilities, and can function rather well as citizens in the surrounding community. However, they follow a conditioned urge to use their considerable mental skill to justify being on a controversial path in the face of criticism. The cult member talks in circles, so to speak, or loops of jargon, to deflect troubling evidence. Lifton identifies this as the “thought-terminating cliché.” If you dare to challenge fragile but stubborn beliefs, you may have to deal with equivocation (they persecuted Jesus too), solipsism (this is my truth and it is my experience that it is true), and tautology (those who know, know) followed by scientific and historical nonsense (e.g., Tesla’s invention of free electricity was eradicated by big oil, and the Catholic Church destroyed the manuscripts that contain the true teachings of Jesus).
An elderly doctor who was also a college professor called me to express his frustration that he could not talk with his thirty-seven-year-old son any longer about his son’s cult. He had stopped trying to reason his son out of the group years before because it merely led to ugly argument. The father used his considerable knowledge of science, religion, and history to no avail. The son, also a medical doctor, was involved with a New Age cult for six years and was about to leave his clinic to personally serve the leader, at her suggestion. This was a much deeper leap of commitment for him on his spiritual journey. He was finally becoming “serious” about his spiritual life.
The family engaged me to try to repair open discussion between father and son. When the son came home for a short visit, I walked in with one of his sisters for breakfast. The father introduced me as the surprise guest. I introduced myself and my purpose for being there. The son agreed to meet for a few hours. After two or three days, with me guiding a family discussion, and after getting him to talk to an ex-member on the phone, the son left the group.
When it was all over, the father called me “a magician” because he watched his son change from a true believer to a former member right before his eyes. But what I did had nothing to do with magic. I merely drew the young doctor into a wider frame of reference than his cult allowed. I tapped his suppressed ambivalence about the group—deep inside he struggled with many of the irrational demands that he could never acknowledge before he met with me. After accepting the group, he became committed to his commitment; thus, the circular ideation. After six years, he still had a difficult time explaining to his father why he was loyal to the leader—he just was. The week before I met him, he had proclaimed to his girlfriend that he would do anything the guru asked. In his mind, he had achieved a state of total submission, and he stopped questioning why.
During the first five to ten hours of our meeting, it appeared we were getting nowhere with this young, sensitive doctor. The father had heard all of his defenses before, but this time there was a difference. I could sympathize with the son’s philosophy and could add to it by explaining origins of the foundation myth. I entered rather than attacked his circle of thought. I offered insights into the leader’s life without criticizing the system. Criticism came much later. Those first hours were dominated by the circular ideation that can frustrate a critic who engages a cult member. The young doctor could have left the discussion at any time but did not, out of respect for his father. That was the only leverage we needed in this case initially. The rest came when I piqued his interest with new and nonthreatening information.
Many a skeptic who confronts a hard-core cult member will tell you, “I tried to reason with him but got nowhere. We just went round and round about the same things, it seemed.” Circular arguments end when one party says, “Well, you have your reality and I have mine. We will just have to agree to disagree. Unless you have my experience you cannot know if this is true or not. You are coming from your head, and to know my truth you have to come from your heart. If you doubt it, you cannot know the truth. If you are not enlightened, you will not understand.” These statements are thought-stopping clichés that end discursive thinking as well as dialogue. Self-expression becomes a monologue. At this stage, the frustrated skeptic has images of hitting his opponent over the head with a hammer to knock some sense into him. The cult member feels that he or she has just won and continues truckin’ merrily along. All so-called seeds of doubt the skeptic tried to sow soon burn up in the heat of ritual or blow away in the winds of dogma.
If anything defines mind control, it is circular ideation or a fixed mindset. Tethered ideologically to a leader’s revelation, the devotee will adjust his or her thoughts and impulses to sustain the least resistance and to stay in the flow. Leadership or cult management can jerk the chain of influence to pull the member back in line if he drifts too far, or snap a whip if the devotee comes too close. The leader’s domain of authority is inside the circle; no one else is allowed there without permission. Think of a dressage trainer with a young or untamed horse in a ring (I was introduced to training horses this way, so I have some idea). In contrast, a well-trained or experienced horse will hardly tug on the lead and needs only a subtle movement of the whip to guide it.
Of course, this is a metaphor. Human beings are not horses, but ex-members of controversial cults with totalist features can identify with the metaphor. To carry it further, from the ex-members’ perspective, every member felt the all-seeing eye of the leader on their backs whenever they ventured forth to work in society, fundraise, or recruit new members. Internalizing the guru or leader who acts in the place of God or as God is a common cult experience. “I live now, not I, but Christ lives in me,” said St. Paul and say members of Bible cults who believe that their “anointed” preacher speaks for Christ. “The guru is greater than God,” say Sant Mat-related cult devotees. “The guru arranges everything.”
It is a mistake to think that cult members are stupid. Most of them will argue circles around you, and unless you are well prepared and very patient, you might end up feeling stupid after a session with one. Muslim fanatics, for example, include doctors and other professionals who know very well how to justify a Muslim Brotherhood version of Sharia, or Muslim law. The question is, why? Part of the answer lies in my first point, transpersonal purpose. The devotee comes to believe that a grand purpose is not only necessary but attainable; therefore, he or she will act in faith to further the process through self-sacrifice and attempted ego-extinction. The other half of this bargain regards what lies outside of the circle: that perilous area of social and psychological interaction that continually tests the devotee.
“To leave this path is like a dog returning to eat its vomit.”
“Now that I’ve found God, Satan is everywhere trying to take me back.”
“Satan acts through the ones you love.”
“Now that you are on the Path, dark forces will attempt to dissuade and harm you. Your own mind will rebel with doubts and counterarguments.”
“It would be better for you to have never found the truth than to abandon it. Traitors will suffer ten-thousand lifetimes, without another chance for redemption.”
“If you reject the protection of the master, your karma will descend and you will be vulnerable to accident, illness, and insanity. The dark forces will roost on you like ravens on a corpse.”
“The greatest tests appear when you are on the verge of victory.”
Ah, the adventure of it all. Imagine life without all that drama. Totalist cult devotees cannot. They are on a razor’s edge of self-discovery and service. It is a thin line that sustains a loyal circle around a leader’s high demands. Like the tightrope walker, cult members fear falling to personal harm without the group safety net. If they fail to serve well or if they express doubts, the leader just might pull the net and sever their leash. They will spin out into the social quagmire that only wants to swallow them into rounds of fast food, popular television, mundane jobs, unbridled sex—and worst of all, they will become part of that ignorant mass of clueless humans on their way to perdition.
Their immediate instinct will be to find a way back into the graces of the group and guru. If they do return, they will be treated not like a prodigal son but more like a neophyte, with suspicion and higher demands on their loyalty. Yes, every sane religion or ethical society also warns members about these same materialistic distractions from a good life. And, yes, some ex-members do fall into a self-destructive lifestyle by living it up after they leave a restrictive cult life. But most do not, and most become productive citizens after making some difficult adjustments. In all cases, however, the exit perils are very real, and not just threats.
Many members of totalist cults decide to leave the group years before they make a physical break and announce it to the group. Why is this? Can they not just walk away? No, they cannot because the exit costs are usually very high. Could you just walk away from your promises, job, property, investments, closest friends, marriage, children, or God, even if you had a good reason? What would it take? How would you prepare? When would you be ready? During the process of deciding, how many times would your ambivalence kick in to change your resolve? What are you prepared to lose? How would you know that the alternative is any better? What if the guru is right? The very thought of going through with such a decision could drive you temporarily insane. The guru warned you about that, did he not?
Who can these questioning members trust for information about spiritual matters, career, identity issues, or medical needs? For years they absorbed the cult’s spin on all of that, and now everything is in doubt. Who are they now, and what is their purpose in life? Those same questions got them into this mess to begin with. Someone gave them ready answers, or at least a way to find them—now what? Dare they ask those same questions again?
The reality of the closed system shows its power over members when they dare to defect. Benjamin Zablocki suggests that brainwashing is most apparent when cult members consider leaving the group or belief system. Their very life and mind are in peril when they choose to leave a totalist cult. In some cases, death threats are literal, but most are metaphorical. Cults that arbitrarily, without due process, decide who is saved and who is not, or who has rights and who does not, “dispense existence.” The dispensing of existence is the last of Robert J. Lifton’s eight themes that comprise a thought-reform milieu or process (see the Appendix of his book Thought Reform and the Psychology of Totalism). In his final analysis, this one theme practically defines a cult. It is the “them versus us” mentality that Arthur Deikman examines in his book Them and Us.
In choosing to defect, the group member must come to grips with what happened. Spiritual rape is a common description. For some, rape may be too strong an image; but it is nevertheless very unsettling to realize that they have shared their most intimate selves for years with a deceitful, perhaps nutcase guru who is incapable of truly guiding them. Then there is the peril of facing their own functional integrity. A body of believers can participate wholeheartedly in the delusions of the lead person—cults, like persons, can have personality disorders with delusional features (for example, grandiosity), albeit shared ones, as in folie de groupe. Thus, doubting devotees may feel imperiled by the very possibility of recognizing that their behavior in the cult was madness in action.
Deprogramming works when it reduces the perils of the exit process. Deprogrammers do this by reality-testing questionable beliefs and perceptions with the client. Insights from the lives of ex-members of the same and other cults help: “If they survived and thrived under worse perils, then so can you.” Undermining the authority of the leader with solid scholarship and accurate history not only brings the ex-member to eye level with the leader but also takes the leader out of the center of the circle, thus removing the illusion that she is transcendent.
The living leader is no more transcendent than a group member is. At eye level, the member can assume authority and control over personal choice. Life feels perilous when he is not in control, and dependency increases. With self-reliance restored, he can take the reins of his horse, so to speak. The deprogrammer is like a coach, or a “horse whisperer” who convinces the wary animal that crossing a creek to leave an enclosed area is not so dangerous. The creek is a metaphor for all the phobias and imagined threats that cult members avoid. Some of these phobias remain even after the exiting member has crossed the “most dangerous” waters. This is not an easy process if he is not used to riding alone [taking back his life], especially on a skittish horse, but it is a clearer one with known possibilities after a successful intervention.
An ex-member may notice that the wide variety of average folks he used to look down upon from his elite cult position, those “lesser beings,” are doing quite well. At eye level, he finds it easier to identify with group outsiders again, or maybe for the first time if he was raised in an elitist cult. Normal life and plain religion now appear very exciting when he realizes that salvation and enlightenment are still possible, and probably even more so than before. He no longer has to wear a social and intellectual straitjacket, converse with angels, or be able to read auras for God to love him. He no longer needs to pursue a grandiose future to feel good in this normal life.
If the leader and cultic group were unhealthy, then, by contrast, there can be healthy groups and leaders out there. The deprogrammer should offer some guidelines to get the ex-member started but not direct specific choices of healthier life affiliations. The deprogrammer’s invitation is for the cult member to get off the tiny merry-go-round of a haunted and badly managed theme park. The deprogrammer encourages the client to exit into a greater circle of life and to apply thinking skills held back by cult constriction.
The illustrations that follow present, as I see it, the unhealthy and healthy models of cult activity. In the figures and discussion, by “surround” I mean the social environment, assuming that it is reasonably democratic and not a totalist political system or abusive dictatorship (I discuss this term more fully following the illustrations). Ironically, cults tend to appear more often in open political climates that tolerate social experimentation than under dictatorships that do not, but they tend to thrive in chaotic cultures. In my view, cult formation is an essential human tendency, not an aberration. Unfortunately for scholars, the popular mind, fueled by media reports and anti-cult literature, has identified only aberrant social groups as cults.
First we will glance at a few symbols that illustrate the structure of a cult.
Starting from the left side of Figure 1, one common symbolic structure for cult formation is the triangle. It indicates a dominant management force or leader at the top overseeing or lording over a sequence of social layers, with a mass of subservient devotees on the bottom. Next is the square that symbolizes being “boxed in” by facets that experts in the field have variously labeled. Arthur Deikman posited four sides of this box: compliance with a group, dependence on a leader, avoiding dissent, and devaluing outsiders. Janja Lalich, with her “bounded choice” theory, defines a cult with four attributes that form a self-sealing system: charismatic authority, transcendent belief system, systems of control, and systems of influence. Steven Hassan offers four attributes of “cult mind control” in his BITE model: behavior control, information control, thought control, and emotional control.
The circle is perhaps the most obvious and elegant illustration, serving as both symbol and metaphor for cult formation: circle of friends, inner circle, sphere of influence, encircled, and so on. The model I propose expands on the circle to help me explain the reality of harmful cult experience. In Figure 2, a conical shape illustrates the “ideal path” that seekers are led to imagine when they enter a transcendent belief system that promises total freedom, enlightenment, or a way out of mundane or sinful earthly life. The ideal path appears to spiral up, up, and away into heaven, infinity, or nirvana. The guru is already “there.” The devotee strives to make his way to salvation guided by the guru. In harmful systems, the devotee feels progress in the beginning but soon gets stuck between the perilous “fall” back to where he started and the impossible or inaccessible space ostensibly occupied by the leader. The devotee remains on a narrow ledge (mimicking the razor’s edge of Buddhism), feeling the tension and excitement of being on a “high” path.
Looking at the illustration of the seeker and the leader in Figure 2 from above, we see something like the models in Figures 3, 4, and 5. Figure 3 offers the actual view an outsider or critic will have of someone in a harmful cult. The circular path that the group member believes is a spiral upward is actually a pit or rut, wherein a restrictive lifestyle keeps the member sealed off from both the social surround and the sacred domain of the leader.
In Figure 4, the unhealthy cult devotee perceives the path as progressing toward enlightenment and perfection while rising “up” to spiritual freedom, ascension, and immortality. Circular movement gives the illusion of progress.
Moving to a healthier cult system in Figure 5, we see an expansion into the surround with less restriction. We find that devotees recognize a more democratic relationship with leadership, with mechanisms to replace leaders when necessary. The transcendent reality remains as transcendent to a living leader as it does to devotees: all can fall, all can rise equally, all can find inspiration, none are “God.” After writing an earlier draft, I came upon this same concept in the work of David Johnson and Jeff Van Vonderen (1991), in their analysis of abuse in Christian churches. Their illustration shows the same democratic relationship that leaders and members should have with the transcendent (Jesus in their illustration) and with one another.
Figure 4 shows the harmful cult system as closed around the membership if members are to sustain a transpersonal purpose and avoid peril. Rituals, transformational sessions, recruiting, and fundraising always trump the discursive activity of examining internal doubts and entertaining surrounding criticism. In Figure 5, the expansive, second model of a healthy cult (yes, there is such a thing), we clearly see an enclosed arena of activity that nevertheless sustains easy access, both socially and intellectually, with the surround (the social and intellectual environment). I borrow the term surround as applied by self-psychologists who follow the work of Heinz Kohut (1913–1981), who significantly advanced Freud’s analytic approach to psychotherapy: “Narrowly conceived, self-psychology consists of ideas of Heinz Kohut, ideas that apply to the understanding and treatment of narcissistic disorders.”
Narcissism as both a behavior trait and a disorder appears in discussions about cults and cult leaders; thus, my interest in Kohut and how he used the surround to augment assessments of self. In Kohut’s psychology, some measure of narcissism may be healthy, just as in my discussion here, certain cult formations can be healthy. Malignant narcissism appears in totalist systems that harm participants and society. Cults as closed authoritarian systems create perceptions about the social environment and greatly influence interactions with that environment. In that process, a manipulative cult will tap and feed the narcissistic tendencies of recruits with grandiose transpersonal causes and infect the recruits with flawed perceptions of peril projected onto the surround. I believe Kohut’s insights regarding the self as part of an interactive social structure can add value to this discussion. Here I only wish to alert the reader to why I use surround in my illustrations.
In the healthier version, the group member has ease of contact with the surround, as well as reasonable entry and exit, with no hidden agendas either way. The transpersonal purpose is not confused with the person of the living leader or guru. In other words, until the leader is dead and gone, he is just as human as his followers, albeit with a special role. He must serve the purpose, not have the purpose serve him as if he were God or a god. There is no such thing as a living god. Gods are spirits, if they exist at all. Even in Christianity, a religion that claims a living deity in the historical Jesus, we read of the struggle among the Apostles to recognize “God” as Jesus until after his death and reported resurrection. Similarly, the avatars of Hinduism exist as divine creatures only in Hindu scripture and on some devotional levels. Any claim by a living guru to be the tenth or Kalki Avatar, for example, is bogus until he dies and “earns” that designation through a living testament to the fruits of his labor. The quality of the tradition is what we can criticize when the “divine” person is gone.
For example, the Self-Realization Fellowship (SRF) founded by Swami Yogananda in 1925 posits the mysterious, ahistorical Babaji as the divine root inspiration for the lineage of SRF gurus. Babaji can function as the traditional embodiment of the transpersonal, much as Jesus does in Christianity, but the embodied or living leader cannot, in my view. The extent to which any devotee sees the living guru as having achieved a transpersonal state is the extent to which the devotee risks living in a closed system controlled by the guru. The only humans I know who handle divine power well are those who can hold molten steel in their bare hands indefinitely.
I remind the reader that these are my models, which assist me in helping my clients assess their group experience. I make no exceptions regarding the God confusion: No living leader is God or a god. Many traditions have deified a living leader, such as Caesar; but during Roman triumphs, a slave stood behind the triumphant general and chanted, “Memento mori [remember, you are mortal; remember, you will die].” In a similar vein, during papal coronations a plain Catholic monk holds a pole on which burns a common piece of flax. Once the flax stops burning, the monk thrice repeats, “Pater sancte, sic transit gloria mundi [Holy Father, thus passes the glory of the world].” Cult leaders and dictators who take center stage as objects of devotion tend to avoid this admonition.
As self-object to his adoring throngs within a cult circle, any leader can be caught up in a divinization mood. If that leader already has unfulfilled needs for adulation, a disorder of malignant narcissism, then the totalist cult emerges readily; but the fan or devotee is just as responsible for the deification. This is always a two-way process. Robert Lifton called this “ideological totalism,” wherein the immoderate desires of a group meet the grandiose ideas of a leader. The leader presents a convincing possibility that he has attained transcendence or embodies the transcendental purpose, and the group says, “We want that, too. Show us the way.” This meeting ground takes on a “momentum of its own,” says Lifton, beyond the initial goals envisioned by the leader or the followers.
Feeding his narcissism, the leader accommodates the devotion and finds ways to control the play of forces that surround his position. The more cult members get “caught up” in the irrational nets of devotion, the less likely it is that they will have their rational feet on the ground. This is an unstable position because humans are not gods. To sustain the god illusion, the group and guru must devise strategies to frame perception. Phobias and paranoid responses inevitably arise due to conflicts with reality, thus creating the circle of peril. Critical responses from the surround inadvertently feed the peril by fulfilling cult-induced perceptions of an enemy ready to destroy the seeker’s soul by creating doubt and inviting defection from the divine path.
Acknowledging the transcendental goal does not mean it is achievable, necessary, or even desirable. How we live with God may be more valid and viable than becoming God. To use another metaphor, we can acknowledge that the sun is necessary for our existence, but that does not mean that being closer to the sun increases our existence. Narcissistic leaders would have us believe that their techniques can hurtle us toward that sun of transcendence, while skeptics ridicule their antics and opponents curse their lies. One image of pseudo-transcendence comes from Transcendental Meditation devotees (TMers) who claim to “fly” as they hop around while holding a seated lotus position. The group members call this the first stage of “yogic flying” and will produce many pseudo-scientific studies to support their sacred claims. Manipulative cults have come up with a wide variety of Towers of Babel for millennia.
To expose the false beliefs, the deprogrammer must not only convince the cult member that his perceptions are not defective but also point out how the group managers and leader have manipulated those very perceptions and behaviors. Moreover, a cult member will choose to defect only with the realization that a less restricted mind offers better options for a better life. My job, the deprogrammer’s job, is to reinforce healthy ways of using information. Brain science indicates that a healthy brain is one that continues to stop and think. As Kathleen Taylor indicates in her book Brainwashing, healthy brain function is not stuck in rigid thought patterns or bound to a flawed organizing principle, like an addiction to a drug, a false belief, or a highly constrained social system.
In effect, the deprogrammer’s job is to raise the seeker’s awareness back to eye level with reality, thereby both reducing the perception of exit peril and exposing the false authority of a leader. He does this, to some extent, by repeating the process that got the cult member into the closed system. After gaining rapport, the deprogrammer unveils new ways of seeing cult experience and behavior. New information may surprise, intrigue, and attract the cult member to want to hear more. The result is a wider frame of reference, with clearer options for choice. With access to reliable, reasonable evidence and insight into better options, the member can navigate safely through an exit and beyond, as Figure 6 illustrates.
Interventions vary in intent and intensity based on need and the current status of the cult member. I am not about to describe the process of exit counseling in depth here. For that, the reader can turn to other sources (Giambalvo, Hassan, Langone). My purpose is merely to suggest that, to better advise a client regarding intervention approach, an exit counselor must determine which stage a cult member is in.
The seeker expresses curiosity after he has read literature, attended one meeting, or tried a new technique for the first time. He has an attraction to, but does not yet express any identification with, the group or movement. Locus of control remains in the self, which continues to make choices with a wide frame of reference to the environment, family, and friends. Intervention at this stage is relatively less intense. A good Internet exposé of the cult, a critical book about it, or a conversation on the phone with an ex-member or exit counselor can all work to curtail that attractive “leap” of faith or entry by the individual into a manipulated experience at a workshop or service.
The seeker has gone to a weekend or week-long intensive and comes back glowing with affirmation. The seeker-turned-member engages in positive talk about the group and makes an effort to recruit. The member deflects any negative information and may not engage in argument. At this “honeymoon” stage, the exit counselor will advise the concerned persons against argument or sharing negative information with the new member. A formal intervention will require significant preparation of the concerned persons prior to any meeting between the exit counselor and cult member. “Preps” vary according to an exit counselor’s style and approach. Some counselors may require several days of therapeutic sessions, and even months of effort to regain rapport with the cult member, before intervention. Of course, noncoercive access to a meeting with the cult member must be possible to arrange. Typically, if it is to succeed, the actual exit session lasts two or more days, sometimes even a week.
At this stage, the cult member has been in the group long enough (usually years) to have reached a saturation point regarding what it actually offers. The member may even be one of the sub-leaders or elders; she has seen and experienced much of the inside conflict and abuse but will not define it as such. She may even be aware of many ex-member stories, yet she lacks the emotional empathy and intellectual integrity to see any criticism as significant. As above, the concerned persons must refrain from argument with the member about the group for intervention to be possible. Preparation with the exit counselor and intervention team is crucial.
Many long-time cult members grow weary of core group life, but they have become so identified with it and accustomed to ritual that the alternatives still feel worse and perhaps even dangerous. Sometimes these well-seasoned members exist on the fringes of the movement, attending only the necessary functions; they have jobs and live outside the group. Eileen Barker, Ph.D., calls this group “the marginals,” who live as believers, yet closer to the extended environment in a nonexclusive style. Interventions with such marginals may be easier to arrange, but sessions to exit them will prove to be exacting and tedious unless one is well-prepared and broadly educated about the group and its historical context. The exit counselor will have to be prepared for deep discussions about the “meaning of life” and how to assess “truth” in any religion or philosophy. Nevertheless, I have found that the same tools (videos describing cult behavior and influence techniques) still prove equally effective with marginals who may have never considered the information.
Deikman, Arthur (2003). Them and Us: Cult Thinking and the Terrorist Threat (Berkeley, CA: Bay Tree).
Giambalvo, Carol. Family Interventions for Cult-Affected Loved Ones (originally published as Exit Counseling: A Family Intervention). Available as a PDF document at http://store.icsahome.com/merchant.mvc?
Hassan, Steven (1988). Combatting Cult Mind Control (Rochester, VT: Park Street Press).
Johnson, David and Van Vonderen, Jeff (1991). The Subtle Power of Spiritual Abuse (Minneapolis, MN: Bethany House).
Lalich, Janja (2007). Bounded Choice: True Believers and Charismatic Cults (Berkeley, CA: University of California Press).
Langone, Michael, editor (1995). Recovery from Cults: Help for Victims of Psychological and Spiritual Abuse (New York, NY: W. W. Norton & Company).
Lifton, Robert (1989). Thought Reform and the Psychology of Totalism: A Study of Brainwashing in China (Chapel Hill, NC: University of North Carolina Press).
Singer, Margaret with Lalich, Janja (2003). Cults in Our Midst: The Continuing Fight Against Their Hidden Menace (San Francisco, CA: Jossey-Bass).
Taylor, Kathleen (2004). Brainwashing: The Science of Thought Control (Oxford, UK: Oxford University Press).
Zablocki, Benjamin and Robbins, Thomas (2001). Misunderstanding Cults: Searching for Objectivity in a Controversial Field (Toronto, Canada: University of Toronto Press).
Joseph Szimhart initiated his work as a cult specialist in 1980 after ending his two-year devotion to a large New Age sect. He began to work professionally as an intervention specialist, on an international scale, after 1985. From 1985 through 1992 he was chairman of an interdenominational cult-information organization in New Mexico and lectured throughout the state. He has written reviews and articles about cultic issues for Skeptical Inquirer, Cultic Studies Journal, Cultic Studies Review, and other publications. He continues to consult for the media and maintains a website for information about cults. For family reasons, he has reduced his exit counseling work since 1998, when he began a job at a psychiatric emergency hospital. Mr. Szimhart continues to pursue his fine art career. (email@example.com)
Cultic Studies Review, Vol. 8, No. 3, 2009
http://en.wikipedia.org/wiki/Deprogramming. There are no formal, agreed-upon definitions of deprogramming, but this Wikipedia entry offers much of what is available.
 “Persistence of ‘Deprogramming’ Stereotypes in Film,” by Joseph Szimhart, 2004. Cultic Studies Journal. http://www.icsahome.com/infoserv_articles/szimhart_joseph_persistenceofdeprogrammingstereotypes_abs.htm
Benjamin Zablocki and Thomas Robbins, editors (2001). Misunderstanding Cults: Searching for Objectivity in a Controversial Field. Toronto, Canada: University of Toronto Press. See especially Part 2 and Chapter 5 for Zablocki’s clarification of “exit costs.”
 Johnson and Van Vonderen, The Subtle Power of Spiritual Abuse, p. 230.
 The “surround” as defined by Heinz Kohut and his followers includes environmental and social influences on personal history. See: Ronald E. Lee and J. Colby Martin (1991), Psychotherapy After Kohut: A textbook of Self Psychology, Hillsdale, NJ: The Analytic Press.
http://www.icsahome.com/infoserv_articles/shaw_daniel_traumaticabuseincults_abs.htm (“Traumatic Abuse in Cults: A Psychoanalytic Perspective,” by Daniel Shaw, C.S.W.)
Robert J. Lifton (1961). Thought Reform and the Psychology of Totalism: A Study of “Brainwashing” in China. New York, NY: W. W. Norton. See Chapter 22.
 “The Marginals: People on the Boundary of a New Religious Movement,” by Eileen Barker, Ph.D. Paper presented at ICSA Annual International Conference, Philadelphia, PA, June 26–28, 2008. http://www.icsahome.com/infoserv_conferences/2008phily/2008_handbook.htm#Abstracts