
On Resisting Social Influence


Cultic Studies Journal, 1984, Volume 1, Number 2, pages 196-219

Susan M. Andersen, Ph.D.

Philip G. Zimbardo, Ph.D.


Abstract


The thesis of this essay is that “mind control” exists not in exotic gimmicks, but rather in the most mundane aspects of human experience. If this is true, it implies that people can learn to resist untoward influences, which are defined here as influences in which intentions are hidden and the subtle constraints of individual behavior are profound. When information is systematically hidden, withheld, or distorted, people may end up making biased decisions, even though they believe that they are freely “choosing” to act. These contexts may thus involve “mind control.” Although resisting cleverly crafted social influences is not easy, it is argued here that it is possible to reduce susceptibility to unwanted interpersonal controls by increasing vigilance and by utilizing certain basic strategies of analysis. In this paper, resistance strategies are presented which are broadly applicable to a wide array of mind-manipulation contexts. Relevant social psychological research, manuals for police interrogators, and interviews with one-time cult members form the basis for the present arguments, which blend pragmatic advice with a conceptual analysis of the basic issues on which vulnerability to persuasion rests.

Resisting social influences becomes important when such influences can be appropriately thought of as “mind control.” When information is systematically hidden, withheld, or distorted, people may make biased decisions, even though they believe that they are freely “choosing” to act. Such decisions, in fact, are likely to persist over time because people come to believe in attitudes and actions for which they have generated their own personal justifications. The thesis of this essay is that “mind control” exists not in exotic gimmicks, but in the most mundane aspects of human experience. Thus, it is possible to reduce susceptibility to unwanted social controls by increasing vigilance and utilizing basic strategies of analysis and resistance.

Within our framework, the goal of mind control is to manipulate thoughts, feelings, and behavior within some context over time. We recognize that most people agree with and adhere to at least certain social control efforts because these efforts reflect ideals that are intimately a part of their lives. These tend to be described as socialization rather than programming, education rather than propaganda, and personal development rather than brainwashing. Thus, those who convert to my church are “saved,” and those who fail to convert or who defect are doomed. Fanaticism and self-righteousness, however, are of particular concern here because they may justify and provoke the use of psychological coercion, one of the fundamental characteristics of mind control. Take a father’s concern about inculcating a sense of patriotism in his son:

I am very pro-American. I have a small son and have hopes that when he grows up he will join one of the armed forces. To ensure this, I have thought of talking to him while he is sleeping – no great speech, but a little patriotism and the suggestions that an army career would be good. (Caplan, 1969, p. 65)

One might argue that this is merely socialization, that it is natural and normal. But this argument is difficult to accept because of the deception involved. Nevertheless, it is clear that we are continually being influenced and controlled in our lives, although in less dramatic ways. Politicians influence our votes; teachers our thinking; religious leaders our morality. Advertisers emphasize our ability to make “rational” decisions between products they have apparently compared, and then urge us to buy the one of their choosing whether we need it, want it, or can afford it. Our tastes in food, dress, art, music, friends, and so on, are all acquired through subtle processes of social influence. The quality of our social interactions with other human beings fundamentally shapes our social and cultural preferences.

Not all influence methods, however, should be construed as normal and justifiable. In particular, deliberate attempts to manipulate someone else’s behavior seem exploitative when they are covert. One can always imagine that the “victim” might have resisted had the attempt been more overt or had “informed consent” been solicited. But attempts at manipulation are actually most effective when someone is led to believe that he or she is freely “choosing” to act. Once we have made a commitment, we tend to generate our own justifications for it, even when we were truly unaware of the important factors that influenced our decision at the time. Our “choice” of actions, then, is only as reasonable as the information we have available to us; and reliable information can be methodically hidden, withheld, or distorted.

Take, for example, the case of government officials refusing to warn the public about the risks of radiation fallout during the atomic bomb tests in Nevada in the 1950s. Residents chose to stay in the area. In Oklahoma, the Kerr-McGee plutonium plant was found guilty of misleading employees about the hazards of its operation – after a long struggle to expose flagrant safety violations. On a broader level, while the Western press was bombarded with information about the United States’ restraint in Iran during the hostage crisis, there was little coverage of the war being waged by American-supplied Indonesian troops on the island of Timor which left as many as 100,000 people dead. Making decisions about both public and personal issues is enormously complex when those “in power” in our social and political worlds define reality for the rest of us. By controlling the information to which we are exposed, they restrict the range of alternatives from which we are able to freely “choose.”
The Exotic and the Mundane

Formidable quests to gain control over the human mind have often employed exotic technology. Exquisite torture devices, electroshock therapy, mind-altering drugs, hypnosis, and sensory deprivation have all been used to get targeted persons to do the bidding of various agents and agencies of control. Indeed, these methods carry enough wallop to distort and sometimes destroy the mind’s normal functioning. But they are not adequate for the task of reliably directing behavior through specific scenarios as designated by would-be manipulators.

John Marks’ exposé of the CIA’s secret mind control program (see The Search for the “Manchurian Candidate”) suggests that no foolproof way of “brainwashing” another person has ever been found. After a decade of intensive, costly research into the technology of such control, the CIA’s MKULTRA program was deemed a failure. Covert operations could claim little more than being capable of turning unsuspecting victims into “vegetables.”

Effective mind control exists in most mundane aspects of human existence: the inner pressures to be bonded to other people, the power of group norms to influence behavior, the force of social rewards (such as smiles, praise, a gentle touch). We influence one another, intentionally or unintentionally, using the most basic principles of social psychology, motivation, and social learning. It is people in convincing social situations and not gadgets or gimmicks that control the minds of other people. The more worried we are about being seen as ignorant, uncultured, untalented or boring, and the more ambiguous the events are that are to be evaluated, the more likely we are to take on the beliefs of those around us in order to avoid being rejected by them.
Basic Training in Compliance

What ensures the success of undesirable social influences, whether they involve buying new products, entering new relationships, or simply maintaining the status quo in a contrary environment, is our blindness to the potency that situations possess. Etiquette and protocol are powerful inhibitors of unconventional action. When people around us behave alike and as they are expected to, it becomes difficult for us to evaluate their actions critically or to deviate from what is expected of us in the situation. The kinds of social programming we are all subjected to in childhood circumscribe our perception of such behavioral possibilities with a neat cleave. The “good child” learns his place in all social settings, stays put in her seat, is polite, speaks only when spoken to, is cooperative, does not make trouble, and never makes a scene. As children we are rewarded for going along with the group and for not insisting on getting our way. It is the wiser course of action, we are taught, to go with (or around) power, not to challenge it.

By taking social roles for granted in a context, we can be unwittingly led to take on companion roles in the various scenarios being enacted. If she wants to play “guest”, we become “host”; if he is quick to assume responsibility, we passively surrender some of our own; if they are a couple in conflict, we become mediator. And once ensconced in some social role, our behavioral freedom is compromised in subtle ways. Interviewees answer but don’t ask questions, guests don’t demand better food, prisoners don’t give commands, audiences listen, “true believers” believe, rescuers sacrifice, tough guys intimidate, others recoil, and so on. Expectations about what behaviors are appropriate and permissible within the structure of a role can come to control us more completely than the most charismatic of persuaders. As a nation, we saw in the Watergate cover-up how the “best and brightest” caved in to the pressures that required “team players” to win this one for the President. Unquestioned protocol persuaded them to betray their public offices.

Those who occupy social roles that carry prestige and credibility in our eyes can work wonders. The most potent influences are eased around to us by our buddies or by reputable “experts,” rather than by those whom we think of as “enemies.” A neighbor tells us to stop by for a chat with some interesting people, our doctor prescribes a new antibiotic, a businessman offers us exciting financial prospects, a brother says he’s impressed with a new pastor. Such testimonials encourage us to take the first step along most of the paths we’ve chosen for ourselves, good and bad, because such influences are basic to being engaged in social life.
Saturation and Detachment

Unlike our response to “overtly” persuasive communicators who may beseech us to buy the latest gourmet cookware, to jog daily, to elect particular politicians, or to give to certain charities, situations with “normal appearances” (see Goffman, Relations in Public) don’t seem to require skepticism, resistance, or even our conscious attention. We often move through them “on automatic” and are thus prone to being influenced without the slightest awareness.

To counteract this possibility, people could refuse to play social roles, to seek social rewards, to join organized groups, or to notice modeled behaviors – but only if they are also prepared to withdraw from society entirely. Alternatively, people might choose to detach themselves emotionally from certain aspects of social life, but this has the probable drawback of leaving them without social support, friends, lovers, or anything in which to believe. Thus, while being detached enough to observe and analyze is intimately related to resisting social influences, utter detachment can lead to isolation and to paranoia. A prisoner at a federal penitentiary we know of, for example, was held in solitary confinement for several years, and then “beat the system” by turning off his emotions before the system could get to him. Now he feels nothing at all, a victory that proved hollow.

Passionate involvement, serious commitment, and emotional investment are some of the richest forms of human experience. We want to be passionate, playful, and spontaneous, because these things help us to feel that life is worth living. Yet such emotional “saturation” can be problematic. People can lose themselves in their emotions, and they can become so enthralled with an idea or situation that they miss the “cues” that suggest they ought to exit or refuse to participate. People in “cults,” for example, are trained to think positively and programmatically about what they do. Viewing one’s own and others’ actions from a variety of perspectives is simply not done. Orders are followed and much information is systematically withheld. To take some specific examples, prospective Peoples Temple and Unification Church members have been asked to “open their minds” to exciting new identities, to saturate themselves with new meanings and a sense of belonging, and to refrain from being judgmental. Guru Maharaj Ji suggests liberation from one’s own mind in these terms:

So mind really gets to you, mind really affects you, in very, very subtle ways, in very, very subtle manners. And what is the reason, that I come out and I scream and yell, “Don’t listen to your mind”? There is something within, inside of you which is much more beautiful than that crazy mind.

Perhaps we don’t want to be wholly critical and alert at all times, but mindlessness is often promoted as a way of encouraging passive acceptance at the expense of individual discretion. The hook is that when we are faced with complex problems we often yearn for simple answers and rules of thumb for how best to proceed. Immersing ourselves in the teachings of a powerful leader, in the say-so of the dominant partner in a relationship, or in the total ideology of any highly cohesive group can be comforting. But when we lose our desire to formulate unique, creative ideas in any situation we begin to lose our sense of self there. Thorough, unquestioned saturation can hinder our ability to evaluate our actions critically when it is in our best interests to do so.

The problem is paradoxical. Although detaching ourselves from social life to avoid “being taken” is obviously absurd, the more we open up to other people’s thoughts and ideas the more likely we are to be swayed by them. We want to feel connected to others in the community. We want to be “saturated” with living and to feel we can suspend, for periods of time, our evaluative faculties, our cautiousness. Yet we must be able to pull back and monitor our experiences, reflect upon the choices we have made, and assess the “goodness” of our involvements. Oscillating between these poles, immersing and distancing again at “appropriate” intervals would appear to be the best compromise and perhaps, then, a reasonable solution.

Is it possible to recognize those social influences that can distort our integrity and freedom of choice amid the many benign pressures that surround us daily? Can we act to avoid or counter these influences? These questions are complex in the sense that the most skillfully contrived situations effectively prevent us from recognizing that we are about to be “taken” or that any amount of persuasion is intended.

Nevertheless, at the prevention stage, it is important to recognize the operation of effective persuasion tactics, so that it is possible to know when one is in a particularly powerful situation. When the targeted individual feels hesitance or reluctance about a persuasive communication at the outset, of course, he or she should avoid taking even the first step of involvement, e.g., agreeing to hear more of the sales pitch, looking through the brochure, accepting the free gift. Once a commitment has been made, recognizing and resisting control tactics at the system level is critical to retaining individual integrity within the system or to arranging one’s escape, if necessary.

Because most research on persuasion and influence in social psychology has been undertaken from the point of view of the powerholder, not the consumer, there is virtually no experimental literature on the business of resistance, i.e., on what the consumer might do to be able to resist persuasive messages and influence strategies. Nevertheless, it is possible to examine the literatures that do exist for clues as to useful means of resistance. The strategies that follow have thus been drawn from a diverse body of information, including: (a) psychological research on persuasion and attitude change, the situational control of behavior, and social learning principles of behavior modification and self-control; (b) training manuals for police interrogators (see Psychology Today, June, 1967) and sales personnel; and (c) interviews and personal experiences with one-time “cult” members.
Developing a Critical Eye

To assert the freedom to choose options that are not apparent in any situation, we must be simultaneously committed to our social worlds and sufficiently disengaged from them to maintain a critical analysis. For this reason, developing a critical eye is central to counteracting compelling social pressures, whether they occur one-on-one or within a social system. To acquire the kind of sensitive skepticism needed to detect undesirable influences when they arise, people must learn to be vigilant to discontinuities between the ideals people espouse and their concrete actions. Separating the preacher from the practice, the promise from the outcome, the perceived intention from the consequence is at the crux of resistance because it is too easy to mistake the label for the thing labeled, to deal in symbols and concepts instead of people and their behavior.

Many notable politicians, for example, gave their support to pastor Jim Jones without questioning why he was surrounded by a half dozen guards, why his church had locked doors, and why newcomers were searched before being approved by the Welcoming Committee. Peoples Temple members admired “Dad” because he cared for them and because he said he cared most of all about the children. But they failed to critically appraise or to even acknowledge the reality that he punished them severely (at times with electric shock) and subjected them to public ridicule for minor transgressions.

The biggest lies are often hidden by a compelling context and are discovered later on the basis of discontinuities that in hindsight are obvious. The unanticipated nightmare of the slave labor camp Jim Jones created in Guyana thrived on his systematic distortion of every detail of the reality of Jonestown: there was mild weather, he said, an abundance of food, no mosquitoes, easy work days, no sickness, no death. The discontinuities were there to be perceived. “The moment I got off that plane I knew something was wrong,” said Richard Clark, who led an escape party out of Jonestown through the jungle the morning of the massacre. It was the opposite of what had been promised – a jungle hell where people worked long hours on menial jobs in sweltering heat, often hungry and sick. But denial en masse of these obvious discrepancies kept Jones’ system of total mind control going until the very end. For most non-defectors, however, the situation had deteriorated slowly but surely from a tolerable to an unbearable one, amid strict social norms about “positive thinking.” According to Margaret Singer’s extensive studies of former cult members (see Psychology Today, January, 1979), those who left cults without the aid of deprogrammers did so because they had “grown bitter about discrepancies between cult words and practices.”

Comparing the concealed purpose of a communication to its manifest content is, in fact, one of the central tasks in analyzing all propaganda. It is not unlike decoding what we think of as “Freudian slips” in which the “error” conveys the speaker’s intention. Too often we overlook blatant discrepancies by automatically supplying semantic corrections that render statements or situations into “good form,” thus allowing contexts to cover over discontinuities.

Because effective manipulators provide as coherent a situation as possible in which to gain our compliance, detecting discrepant or ulterior motives is difficult. Although becoming obsessively critical or suspicious would be dysfunctional, carefully appraising the credibility of the source of a message and the quality and intent of the appeal makes sense. On the other hand, most persuaders recognize the importance of standard operating procedures, that is, of form and style in their persuasion efforts; these appearances undercut our ability even to recognize that a persuasion effort is taking place. According to sociologist Erving Goffman, persuasive individuals typically conceal their intent amid “normal appearances.”

We are more likely to go along with whatever is happening and, perhaps, to be “taken” when the situation we are in appears normal. Suppose we’re just “having fun” with friends, being “entertained” or “educated,” or are simply engaged in a common social interaction. We usually feel no need to attend to the details of what is going on, of who is influencing whom, and of what is affecting our behavior. But any variety of social pressures can prey upon the unquestioning attitude that we may don in these situations, and on our adherence to standard protocol. Being able to disobey simple situational rules when we feel we should is important – both for men and for women – and requires, at a minimum, an ability and a willingness to critically evaluate situations. Information from Rape Prevention Centers, for example, suggests that entering dangerous situations with potential rapists may seem “natural,” tantamount to being polite or helpful, for a person who has been trained to be obedient and ladylike. Answering all questions with a friendly, gracious smile, refusing to make a scene, or unwaveringly deferring to the protection and judgment of men, even when they are strangers, is not the best idea. Nor is being courteous and open with service personnel at the expense of requesting proper identification.
- Actively monitor social interactions. Establish a critical distance periodically to examine situations from other perspectives. Search for situational pressures in your physical and social surroundings, for the small details as well as the big picture. Practice thinking ahead, anticipating what will come next, checking for discrepancies, and noting how you feel about them.

- Be willing to disobey simple situational rules when you feel you should, to sound false alarms occasionally, or to cause a scene. Be careful about doing things you don’t believe in just to appear normal or to get someone off your back.

- Be able to recognize the conditions under which you are most vulnerable to accepting persuasive appeals (the conditions we will describe in the next section). Should a potent persuasion tactic be present in a situation, postpone making a decision on the matter, if possible, or allow yourself to say “no.”

- At the very least, try to get more information so that you can carefully consider the consequences of saying “no” to something that could turn out essentially “good” (could you return in a week or a year and say “yes”?) or of saying “yes” to something that could turn out essentially “bad” (could you lose your money, pride, or life?). Obtain and utilize all available information and search for new, reliable sources, if possible.
Resisting Persuasion: Confidence, Clarity, and Persistence

Effective persuaders not only influence people, they win friends “in the bargain.” After intensive interrogation for the murder of two socialites, George Whitmore, Jr. “broke” and gave a 61-page confession of guilt. He went on to express his admiration for his interrogator, a detective, whom he now claimed to respect more than his own father. Subsequent events established that Whitmore had actually been persuaded to confess to a capital crime he did not commit.

The best persuaders always appear to be just like us. They understand our problems, empathize with our predicament; in fact, they were there once themselves. They speak our language, share our needs, and know the inside jokes. When someone appears to share our concerns, he or she becomes a cohort, an ally, someone we can trust and give the benefit of the doubt. The tactic is powerful because attitude change, like all socialization, is most effective when it goes unnoticed. The conversation is slowly led through small, continuous approximations. In the end, we perceive that we have brought it about on our own.
- Check for signs of ingratiation, for an overemphasis on mutual interests, and for requests for just one small commitment now – with an open-ended contract for later. How deep do the stated similarities go? How well does the persuader really know the common friend you supposedly share?

As trivial as it may seem, a major persuasive device is the expression of confidence in the beliefs espoused and courses of action recommended. Research shows that powerful people express confidence and self-assurance across all channels of communication – through body language, through words, and paralinguistically. Regardless of someone’s “real” credibility, what we end up responding to is how competent, confident, honest, and stable he or she appears to be. Someone who looks us straight in the eye, stands reasonably close, and speaks articulately is not intimidated by us, and is perfectly in control of the encounter. In response, those who get persuaded express doubt; they do so as much by what they say as by what they don’t say. Minor hesitations like “uh,” “ah,” “er,” or a pause can be capitalized upon and manipulated because they convey momentary lapses of thought, momentary vulnerabilities. Similarly, an unwillingness to stand up for oneself, to contradict the persuader, or to ask for clarification, can become the worst indictment – an open door for influence.

In fact, training manuals for sales personnel are filled with tactics for skillfully manipulating the choices people come to make in bargaining situations. And desired results are obtained. Millions of Americans are subjected to stress and intimidation in the presence of those whom society has termed “expert.” Automobile mechanics, for example, often make thousands of dollars each year for labor and supplies they don’t deliver. In 1978, over two million Americans underwent surgical operations that they did not need (at a cost of over four billion dollars). Because it is difficult to feel efficacious around people who ostensibly have more knowledge than we do, we are often inhibited from asking the appropriate questions, from thinking critically and carefully about decisions that may affect our lives.
- Practice “seeing through” programmed responses to authority. Pay attention to the social roles you and others occupy in a setting and the subtle indicators of those roles that you may be responding to (business suit, repairman’s uniform, doctor’s white coat, etc.).

- Be aware of who is controlling whom in social situations, to what end, and at what cost.

- To the extent that it seems possible, refuse to accept the initial premise from someone that he or she is more powerful, more competent, more in control than you are. Accepting this premise may be, in part, what makes it so.

- State your arguments with conviction if the other person does so.

- Learn to retain a sense of self-worth in the face of intimidating circumstances by creating an “appearance of competence” equal to that which an effective persuader conveys through his or her voice and actions. Carry with you a powerful, concrete image, replete with tactile sensations, sights and sounds, that reminds you of your own competence. Remember a time when some person or group of people thought you were the very best; think of a violin if you are a virtuoso, a photograph, a person, or a place, anything that makes you feel exhilarated and alive, that you can retain as an inner core that cannot be violated. Apparent competence can reduce feelings of helplessness in stressful situations.

Mind control typically involves coming to accept a new reality. The errors of our old ways of looking at the world are exposed as such, and a new reality is embedded in their place. Elaborate but inadequate justifications for recommended actions can be very confusing. Once confused, we can easily be persuaded by false analogies, semantic distortion, and convenient rhetorical labels because we will tend not to question them and to think about them creatively, but to accept them at face value. John Dean reminds us that the entire Watergate cover-up was shrouded in cute euphemisms, jargon, and rhetoric. Instead of referring explicitly to the money involved in the scandal, they spoke only of the “bites of the apple.” At the extreme, it is easier to “waste an enemy” or to engage in “revolutionary protest” than to murder other human beings.

Inconsistent or ambiguous descriptions with confusing terminology can lead us to accept invalid conclusions that we would otherwise resist. Research on metacomprehension reveals that this is precisely what many children do. They are able to understand the simpler component parts of a complex message, so they overestimate their comprehension of it as a whole and accept it as adequate. This can also take place among adults.
- Never accept vague generalities when a message is actually confused or ambiguous (and perhaps intentionally so); and avoid attributing your confusion to your “inherent” inability to think about the matter clearly, especially if someone suggests that “you’re just too stupid to understand” or “women get too emotional to think logically.” Interrogate yourself about the meaning of a communication to see if the conclusions follow from the arguments, and if the expectations you form while listening are confirmed or disconfirmed.

- Paraphrase other people’s thoughts both aloud and to yourself to see if you are understanding clearly.

- Practice generating creative arguments and counterarguments as you listen to persuasive messages to avoid slipping into “automatic” processing.

- Tentatively assess the meaning of an ambiguous situation or communication once you have some reliable information, but don’t forget that the assessment is tentative. Be willing to take new information into account.

- Seek outside information and criticisms – especially from family and friends – before joining a group or making a commitment to invest time, energy, or money in some endeavor.

- Train yourself and your children to notice the “tricks” in deceptive information packaging, such as those utilized in television commercials. Knowledge of make-believe constructions, of audio-visual distortion techniques, the use of celebrities, experts, and overgeneralizations can build the kind of skepticism in children which is the front-line of all resistance efforts.

Susceptibility to control becomes greater with increased self-consciousness. When people are induced to focus attention on themselves by being made to feel awkward, deviant, or silly, and to worry excessively about what others think, they can be led to resolve opinion disparities with others in favor of the other person’s opinion.

At the extreme, Manson family member Leslie Van Houten described Charles Manson as controlling his followers through unrelenting intimidation and strict isolation. “I was always frightened of not being accepted even in school,” Leslie reported. “But Charlie played on that; he saw a danger in my humor and outgoingness . . . He’d try to make me feel I was missing something. He said I didn’t know what was happening and that I was really stupid.”
- Be sensitive to (and avoid) situations and people that put you on the spot, make you feel different, awkward, or inadequate.

- Try to focus attention on what you are doing rather than on thoughts about yourself. Guard especially against generating negative dialogues about yourself, and never accept a chronically negative view from someone else.

- Maintain some non-social interests that you can satisfy while you are alone, such as painting, carpentry, working on cars, reading, or writing. If you can develop a concrete sense of self-worth, a sense of who you are, what you are interested in, and where your competencies lie, quite apart from the values, interests, and judgments of others, you may feel better about yourself in their presence, as well as in their absence.

- Be willing to look foolish now and then, and to accept being “different” as being “special” rather than inferior.

Effective persuasive appeals get their oomph by reaching beyond reason to emotions, beyond awareness to unspoken desires and fears, beyond trivial attitudes to basic concerns about self-integrity and survival. Clever persuaders are adept at detecting what we want from a situation, what our fears and anxieties are, and what areas of supposed mutual interest will best gain our attention. Once someone has our trust, he or she can change our attitudes by inducing an emotion-laden conflict that requires immediate resolution. By making us feel fearful or anxious, the manipulator is in a position to ease our discomfort by providing reasonable explanations and soothing solutions. Much advertising is based on this principle. So are many social interactions.

A 60 Minutes documentary (1/28/79) reported that sellers of industrial insurance have their working-class clients nearly paralyzed with fear over spiraling medical and burial costs. But relief is at hand as the salesperson unfolds the insurance policies that will resolve any uncertainties the future may hold. If the client owns other policies, they go unmentioned or are dismissed as inadequate. All that is clear is the imminence of death and an eight-inch replica of a satin-lined mahogany coffin in the hands of a credible-looking businessman who adds in a deep, clear voice, “Wouldn’t you prefer your loved one to rest in a beautiful casket like this than to be buried in an old pine box?”
A crucial issue concerning our needs and vulnerabilities is whether, when, and how to reveal them. No matter what the relationship, avoid getting sucked into unwanted confessions that may later be used against you. Many cults and mind control systems utilize public confessions, self-exposure “games,” and the like to catalogue the weaknesses of their followers, for later exploitation.
Avoid making decisions when under stress, particularly in the presence of the person who has triggered the emotional reaction. Tell him you will decide tomorrow.
As you feel yourself becoming uncomfortably anxious or guilty, begin taking slower, deeper breaths to help your body relax. Imagine the air flowing through your muscles and loosening the tension in your shoulders, the back of your neck, your upper arms, and down through your chest, abdomen, and lower back. Relax. Interrupt the natural processes the persuader has set in motion.

Gnawing feelings of guilt can also provide a powerful impetus for personal change. Feelings of self-disgust, a desire to confess, to do penance, or perhaps even to experience suffering, are all potent persuaders in their own right. Simply being in the presence of those less fortunate can often be influential, particularly if we are somehow made to feel responsible for their plight. Professional beggars make it their business to make passersby feel guilty for being well dressed and well fed. Organizations that support themselves through donations often thrive upon the proceeds collected by mildly handicapped solicitors. More broadly, the pivotal contingency in Patty Hearst’s psychological transformation at the hands of the Symbionese Liberation Army appears to have been the guilt she was led to feel over her family’s privileged position, the disparity between their wealth and the poverty of so many, and her life of noninvolvement in the struggle of oppressed peoples. All conflicts were slowly relieved with each step she took in the direction of accepting her captors’ definition of reality.

Letting someone do favors for you can also make you feel indebted and guilty. Diane Louie, who escaped Jonestown with Richard Clark the morning of the massacre, recounted for us her experience in the hospital there. She was suffering from a severe intestinal virus, feeling duped and dissatisfied when Jim Jones came to her bedside. “How are your living conditions?” he asked. She shifted uncomfortably in her cot, trying not to raise her eyes to him. “Is there any special food you would like?” She thought of her stifling, crowded bungalow, the maggots in her rice, her exhaustion, the broken promises. “No,” she said, “everything is fine; I’m quite comfortable.” To us she said, “I knew once he gave me those privileges he’d have me. I didn’t want to owe him nothin’.” She was one of a handful of people able to escape the mass murder and suicide.
Be aware of the situations that provoke guilt or anxiety in you so that you can circumvent their illicit use by skillful manipulators. Learning to confront your frustrations and fears is the most potent way to prevent their being exploited unbeknownst to you. Start by thinking about the least provoking aspects of problematic situations while in a state of relaxation, and work up to more difficult ones.
Don’t let people make you feel indebted to them by accepting a definition of a situation that suggests sacrifices are being made on your behalf. Reciprocal exploitation and need fulfillment are part of every social contract; but when you feel justified in doing so, be prepared to acknowledge the sacrifices of others with sincere thanks instead of the expected repayment in kind.

When the opposition is about to yield, successful persuaders employ tactics of ingratiation to build the bonds of liking and respect that will extend past the initial sale. Once aware that their prey is bagged, they emphasize the victim’s freedom of choice – after tactfully constraining the alternatives. The newly persuaded person chooses “freely” while the context the influencer provides bolsters his or her decision. Properly executed persuasion never appears to be “designed” to induce change, but rather ends in a natural resolution of mutually generated concerns. New attitudes and behaviors that are accompanied by the feeling that they have been chosen without extrinsic justification are particularly enduring and resistant to change.

Skillful persuaders may also deny us our freedom in order to control our behavior with the help of the reactance principle. Studies have shown that when we perceive severe limitations on our behavioral freedom we sometimes move to reassert this freedom by advocating the opposite position, which may be exactly what the opposition wants. “Excuse me for saying so, sir, but this is quite an exclusive line; you probably cannot afford it.” “So, you’re gonna let that guy (or nation) get away with treating you in that shameful way!” “No salesman could possibly sell more of this product in such hard times!”
Remember that reacting against someone’s dogmatic assertions about what you should do is not your sole avenue to freedom of action. Sometimes it is best to test their intentions by giving the impression you will comply so that you can observe their reactions. If they start pushing in the opposite direction or simply look befuddled, you may have uncovered a hidden agenda.
Be wary of people who overemphasize how free you are to choose among the options they have prescribed. Electing Anacin over Bayer is not the same as deciding whether you want an aspirin. Nor is the question, “How many bombs should we drop?” the same as “Should we drop any bombs?” Test the limits of your options by selecting “none of the above” or by proposing unexpected alternatives, at least tentatively, especially when you create them yourself and believe they are better.
Resisting Systems: Voice, Exit, or Rebellion

When social persuasion moves into the big time, one-on-one confidence games are not economical. The behavior of large numbers of people must be managed efficiently. For this reason, persuaders develop “systems of control” that rely on basic rules and roles of socialization and that impart a sense of belonging. When interaction among people is restricted to interchange between their social roles, however, it becomes easier for ethical, moral, and human concerns to take a back seat. Because people may be ostracized from organizations (e.g., fired from their jobs) for not complying with the requests of superiors, resisting any of these pressures may be difficult. When John Dean refused to participate in the Watergate cover-up after he himself, along with his cronies, had worked to initiate it, he had to part ways with some of the most cherished assumptions of society: He questioned the morality of Presidential orders, and he lost his paycheck. According to Dean, “. . . this would never have been done had it not been done to protect a president. And for a long time I had trouble separating the man from the office.” Nazi war criminal Adolf Eichmann accounted for his actions during World War II by saying he was just doing his job, following orders. Similarly, in Milgram’s obedience experiments, normal people apparently inflicted painful, potentially lethal doses of electric shock on a stranger at the insistence of a credible “scientist” in a learning experiment. They did their job.

Tightly structured situations are dangerous when we lose sight of who we are, and forget that we have feelings and histories other than those programmed by the immediate setting and the roles we are led to play in it. In order to avoid slipping into acts that violate our integrity, we must be “present” in our societal and institutional roles as distinctive individuals. Knowing when to escape and rebel with others requires questioning the rules that are laid out and being alert to role-based constraints on our actions. Noting frames of reference other than those prescribed by the setting facilitates thoughtful decisions in situations that don’t encourage independent thinking.
Test for the presence of stated and unstated rules that unnecessarily restrict freedom of speech, action, and association. By subtly violating some rules and roles and then observing the consequences, we may discover how much latitude is allowed for idiosyncrasy in the system, for eccentric or creative self-expression.
Resist the lure of uniforms and other disguises that make you look like one of the bunch.
Develop a sense of humor about yourself to minimize utter saturation in your role in the system, to retain a creative view of your situation, and to gain some experience dealing with your apparent weaknesses without undue anxiety.
Listen to criticisms of your most cherished beliefs and institutions. Know them, but don’t accept them uncritically. Allow yourself to confront the issues so you can carefully gauge their merit, and perhaps see events not only as the system you are in expects you to see them, but “as they are.”
Retain your sense of individual integrity in the system by calling others by name and referring to yourself by name. If people are typically referred to by title, try adding their first or last name to the conventional address, abbreviating it casually, or somehow reformulating the typical approach so that it draws upon them as human beings instead of as objects that merely serve instrumental ends.
Make an effort to discover the person behind the role, to respond to someone’s uniqueness, rather than to a stereotyped role impression.
Disclose personal observations about your surroundings and about experiences you’ve had elsewhere to those you feel might share your views. Elicit feelings and ideas from them so that together you can disengage the “scripts” that specify the basic, unquestioned rules of the setting.
Remember that ignoring social roles is not easy and is sometimes met with censure. Thus, it is important to be careful. The more rigidly structured our social role enactments, the less ambiguity people face in the social world. Accepting a certain amount of ambiguity, however, is the crux of spontaneity and flexibility.

When a group of people becomes more preoccupied with seeking and maintaining unanimity of thought than with carefully weighing the pros and cons of alternative actions, raising moral issues, and critically appraising decisions, unanimous resolutions are often reached prematurely. As part of the package, members may be led to support these decisions for better or for worse. When tightly-knit groups are insulated from outside sources of information and expertise and their leaders endorse prospective policies before members have a chance to air their views, decision-making processes deteriorate. Studies of the dynamics of Presidential cabinet meetings during the Johnson and Kennedy administrations revealed just this pattern. The Bay of Pigs fiasco was but one of the blundering outcomes. Psychologist Irving Janis termed the process “groupthink.”

Furthermore, when people are isolated from those they care about, and from their sense of self-continuity, they begin to feel amorphous and uprooted, and the process leaves them more susceptible to the hands of would-be makers-over. Isolating feelings from intellectual concerns serves a similar function. Persuaders bring us to their place of power, separate the good or aware “us” from the evil or ignorant “them,” and then proceed to limit our access to ideas that they find heretical, traitorous, or not in their best interests. This can be true of interpersonal relationships just as it can be true of memberships in social institutions, groups, or organizations.

When we are isolated from outside information, it is impossible to make unbiased decisions. Police interrogators question suspects at the station, not at their homes. Synanon rehabilitates alcoholics and drug addicts (and keeps its other members in line) by removing them from their usual haunts and restricting their liberty. Jim Jones isolated People's Temple members in the jungle of a strange land. When we come to believe so thoroughly in our favorite concepts that we begin to hate those who don’t share our views, to develop rehearsed programmatic responses to discrediting arguments, and to acknowledge only ideas stated within our own terminology, it may be time to start making our belief systems a little more permeable. Nothing is so simple as the labels “good” and “evil” suggest. Moreover, they foster utter vulnerability to the system that is termed “good.”
Try to establish whether you can actually have an impact upon decision-making processes in a relationship or group, or whether you are simply part of the clean-up crew for decisions that have already been made. Watch for premature closure or initial consensus while discussing an issue. What arbitrary constraints are placed on the alternatives to be considered? Do rigid procedural devices limit discussion and suppress unusual suggestions?
Refuse to accept the “we-they” dichotomy that cuts you off from outsiders and suggests you should think of them with dehumanizing labels like animals, sinners, queers, rednecks, women’s libbers, teeming masses, and so on.
Suspect appeals that encourage you to detach your feelings from the rest of your being; assert the harmony of mind-body, intellect and emotion, past and present.
Try to encourage independent thinking among group members (as suggested by the strategies in the previous section). Solidify channels of feedback between members, between members and leaders, and from outside evaluators to the group.
Remember that the minority may at times have the only accurate view of the issues. Any worthwhile group should tolerate dissent or be abandoned.
Allow yourself to question commitments if they are no longer appropriate for you. Consistency in the face of contrary evidence is usually not a virtue but a sign of rigidity, delusion, or prejudice. Make an effort to admit past errors and to acknowledge old beliefs and commitments that proved limiting for you.
Continually seek outside information, reality checks, and critical appraisals of what you are doing.
Maintain outside interests and sources of social support and reject the appeal that devotion to the cause requires severing ties to outsiders. Battered wives, religious converts, undercover agents, mafia informants, and inmates of prisons and mental hospitals all suffer from impoverished connections to outside systems.
Family and friends should leave the path back home open. Your unconditional accessibility to those who have strayed, no matter what they’ve done or said, may be their only hope. Disowning children, friends, or relatives when you disapprove of their decisions is much less effective in the long run than a gentle hand and some warm words. “Love-bombing” is the favorite tactic of most cults, because it works best among the love-deprived, those to whom we have not given love.

The tighter a system is, the more likely that minor challenges will be met with retaliation. In prisons, mental hospitals, religious or political cults, military establishments, concentration camps, and so on, some people have virtually total control over the existence of others, and minor deviations or threats to that power are intolerable. If we come to threaten the structure of a coercive system, it is likely to retaliate by pursuing the tactic, “divide and conquer,” or perhaps, even the opposite, i.e., “promote.” By giving us status and responsibility, the system arranges that our needs no longer run at cross-purposes to those of the system. But when maintaining the status quo is not palatable, the main question is whether changing the system is feasible. Those who survived Jonestown did so by escaping its grasp. And some systems have time on their side; they can wait out the opposition and have their officers paid for doing so. Supporters of the status quo are employed while those who oppose it do so as outsiders, part-time and struggling to make ends meet. In any case, it is often more practical to challenge systems from without – especially by forming other systems.
Don’t let your silence pass for agreement with the system. While talking to others, subtly imply your discontent in areas where you think they might agree. Avoid incriminating yourself completely in the face of their utter resolve by intuiting their responses as you speak and overstepping only those rules that are of least concern to the system.
Once you establish a group of allies and decide that you cannot escape the system or that you are committed to changing it, band together in opposition so that yours will be a position to be acknowledged rather than a disposition to be “treated.” A consistent minority, firm in its conviction, can often undo a majority. Of course, one must then become vigilant to the potential development of mind control tactics within the new minority system.
Begin by assessing the power base of those who hold the reins. Find substitutes for the resources power holders threaten to withhold from you. Do you really need the attention, respect, security, approval, money, or whatever these particular people have to offer? Then, by determining what contributions you make to the system that are important to its functioning, you and your allies can collect a significant repository of such resources to withhold from the system when bargaining time arrives. Citizens’ action, organized labor, the women’s movement, and so on, base much of their strategies on such decisions.
Appeal to the same human needs that the powerholders in the system manipulate in others. If they are to reconsider their position, they must be led to do so on their own terms, or effective coercion must prove that their terms are no longer tenable. Learn to negotiate with powerholders using your resources. Collective resistance by a group that states its problems concisely and specifies clear and concrete goals, resources, and strategies is infinitely more likely to be successful than are disorganized revolts and hit-and-run tactics.
Exit those situations in which disobedience is likely to be futile and punishable, if you can. Escape plans must be carefully thought through in concrete terms, not wished about vaguely, and it is best not to go alone. Furthermore, public exposés are essential if the veil of secrecy that conceals mind control practices in all of their varied forms is to be lifted. Jeannie Mills, defector from the People's Temple and co-founder of the Human Freedom Center in Berkeley, was unable to get people to believe her horrendous tales of Jim Jones’ brutality and deceit until she persuaded several reporters to check out the discontinuities between his preaching and his practice. It takes a firm sense of social commitment to escape a system of mind control, and then to persist in challenging it from without. Nonetheless, buyers do well to beware: “Every exit is an entry somewhere else” (Tom Stoppard, Rosencrantz and Guildenstern Are Dead).

Aware citizens must be diligent in seeking out and utilizing all sources of information relevant to their decisions and in helping to make this information available to others. The prescription is not that one ought to become obnoxious and belligerent, but that one must exercise one’s capacity for careful thought and analysis. It is because we can exercise our ability to critically evaluate ideas, institutions, and our own behavior that we can perceive options beyond those provided by convenient dogma and ostensibly inescapable circumstance. In this way are we “free” to make meaningful choices and to not be controlled.
References

Adler, W. (June 6, 1978). Rescuing David from the Moonies. Esquire, 23-30.

Anson, R. S. (November 27, 1978). The Synanon horrors. New Times, 28.

Bandler, R., & Grinder, J. (1975). The structure of magic. 2 vols. Palo Alto, Calif.: Science and Behavior Books.

Becker, E. (1975). Escape from evil. New York: The Free Press.

Bem, D. J. (1970). Beliefs, attitudes, and human affairs. Belmont, Calif.: Brooks/Cole.

Brehm, J. W. (1966). A theory of psychological reactance. New York: Academic Press.

Caplan, G. A. (November, 1969). A psychiatrist’s casebook. McCall’s, 65.

Conway, F., & Siegelman, J. (1978). Snapping: America’s epidemic of sudden personality change. Philadelphia: Lippincott.

Festinger, L., & Maccoby, N. (1964). On resistance to persuasive communication. Journal of Abnormal and Social Psychology, 68, 359-367.

Flacks, R. (1973). Conformity, resistance, and self-determination: The individual and authority. Boston: Little, Brown.

Frank, J. D. (1961). Persuasion and healing. Baltimore: Johns Hopkins Press.

Freedman, J. L., & Sears, D. O. (1970). Warning, distraction, and resistance to influence. Journal of Personality and Social Psychology, 1, 262-265.

Gusfield, J. R. (Ed.) (1970). Protest, reform, and revolt. New York: Wiley.

Goffman, E. (1971). Relations in public: Microstudies of the public order. New York: Harper and Row.

Haney, C., & Zimbardo, P. G. (1977). The socialization into criminality: On becoming a prisoner and a guard. In J. L. Tapp & F. J. Levine (Eds.), Law, justice, and the individual in society: Psychological and legal issues.

Hartman, J. J. (1979). Small group methods of personal change. In M. R. Rosenzweig & L. W. Porter (Eds.), Annual Review of Psychology (Vol. 30). Palo Alto, Calif.: Annual Reviews, Inc.

Hinkle, L. E., & Wolff, H. G. (1956). Communist interrogation and indoctrination of “enemies of the state.” American Medical Association Archives of Neurology and Psychiatry, 76, 115-174.

Hirschman, A. O. (1970). Exit, voice, and loyalty: Responses to decline in firms, organizations, and states. Cambridge: Harvard University Press.

Janis, I. L. (1972). Victims of groupthink. Boston: Houghton-Mifflin.

Kadish, M. R., & Kadish, S. H. (1973). Discretion to disobey: A study of lawful departures from legal rules. Stanford, Calif.: Stanford University Press.

Lifton, R. J. (1961). Thought reform and the psychology of totalism. New York: W. W. Norton.

London, H. (1973). Psychology of the persuader. Morristown, N.J.: General Learning Press.

Markman, E. (October, 1978). Comprehension monitoring. Paper given at the Conference on Children’s Oral Communication Skills, University of Wisconsin.

Marks, J. (1979). The search for the “Manchurian Candidate”: The CIA and mind control. New York: Times Books.

McGuire, W. J. (1968). The nature of attitudes and attitude change. In G. Lindzey & E. Aronson (Eds.), Handbook of social psychology (Vol. 3). Reading, Mass.: Addison-Wesley.

Milgram, S. (1974). Obedience to authority. New York: Harper and Row.

Mills, J. (1979). Six years with God: Life inside Reverend Jones’ Peoples Temple. New York: A & W Publishers.

Patrick, T. (March, 1979). Playboy interview. Playboy, 53ff.

Polsky, N. (1967). Hustlers, beats and others. Chicago: Aldine.

Professional Salesman’s Desk Manual. (1976). Waterford, Conn.: National Sales Development Institute, Division of BBP.

Roberts, D. F., Gibson, W. A., Christenson, P., Moser, L., & Goldberg, M. E. (August, 1978). Inoculating children against commercial appeals. Paper presented at the annual meeting of the American Psychological Association, Toronto.

Roberts, D. F., & Maccoby, N. (1973). Information processing and persuasion: Counterarguing behavior. In P. Clark (Ed.), New models for mass communication research. Sage Annual Reviews of Communication Research (Vol. 2). Beverly Hills, Calif.: Sage.

Schein, E. H., Schneier, I., & Barker, C. H. (1961). Coercive persuasion. New York: Norton.

Schiffer, I. (1973). Charisma: A psychoanalytic look at mass society. Toronto: University of Toronto Press.

Schrag, P. (1978). Mind Control. New York: Pantheon Books.

Schwitzgebel, R. L., & Schwitzgebel, R. K. (Eds.). (1973). Psychotechnology: Electronic control of mind and behavior. New York: Holt.

Singer, M. T. (January, 1979). Coming out of the cults. Psychology Today.

Stoner, C., & Parke, J. (1979). All God’s children. New York: Penguin.

Szasz, T. (1976). Patty Hearst’s conversion: Some call it brainwashing. The New Republic, 174, 10-12.

Tedeschi, J. T. (Ed.). (1972). The social influence processes. Chicago: Aldine.

Varela, J. (1971). Psychological solutions to social problems: An introduction to social technology. New York: Academic Press.

Vohs, J. L., & Garrett, R. L. (1968). Resistance to persuasion: An integrative framework. Public Opinion Quarterly, 32, 445-452.

Wicklund, R. A. (March-April, 1979). The influence of self-awareness on human behavior. Scientific American, 187-193.

Zimbardo, P. G. (June, 1967). The psychology of police confessions. Psychology Today, 1, 17-27.

Zimbardo, P. G., Ebbesen, E. B., & Maslach, C. (1977). Influencing attitudes and changing behavior (2nd ed.). Reading, Mass.: Addison-Wesley.

* * * * * * * * * * * * *
This paper is an abridged version of a report prepared for the Office of Naval Research (Z-79-01). Address reprint requests to Dr. Andersen, Department of Psychology, University of California, Santa Barbara, CA 93106.

Susan M. Andersen, Ph. D. is an Assistant Professor of Social-Personality Psychology at the University of California, Santa Barbara. Dr. Andersen received her Ph. D. from Stanford University in 1981, and her vita includes publications on such topics as self-definition, sex-role behaviors, psychopathology, and the nature and functioning of religious cults.

Philip G. Zimbardo, Ph. D. is a Professor of Psychology at Stanford University. Dr. Zimbardo, holder of many academic awards, has published widely in the field of social psychology, including books and articles on attitude change, cults, and shyness.
Cultic Studies Journal, Vol. 1, No. 2, Fall/Winter 1984