
Arousal, Capacity, and Intense Indoctrination

Cultic Studies Journal, Vol. 18, 2001. 
Robert S. Baron

This article considers the process of intense indoctrination, specifying procedural conditions, internal states, mechanisms of social influence, and key output behaviors associated with extremely manipulative and coercive programs of attitude and value change. Most descriptions of intense indoctrination point out that emotional arousal and stress are integral features of such programs of systematic persuasion. This article focuses on the hypothesis that this arousal, coupled with other features of the indoctrination process, compromises the attentional capacity of indoctrinees and that this impairment of attentional capacity increases the impact of several social influence mechanisms in such settings. The research evidence relevant to this hypothesis is reviewed.

Changes came over me subtly … in time although I was not aware of it, they had turned me around completely. I had thought I was humoring them by parroting their clichés and buzz words without personally believing in them. Then … a sort of numbed shock set in. To maintain my own sanity and equilibrium while living and functioning day by day in this new environment, I had learned to act by rote … suspending disbelief. (Hearst, 1982, p.185)

In the last 20 years, the public has grown increasingly aware that certain religious and philosophical groups have developed indoctrination procedures that have extraordinary impact. These groups have persuaded young adults to cut off all contact with family; to accept vows of poverty; to devote extremely long hours to prayer, meditation, fundraising and recruitment; and to forsake promising careers and educational opportunities (e.g., Galanter, 1989; Hassan, 1988; Singer, 1995). The most dramatic examples of the power of such indoctrination undoubtedly are cases of group suicide that have punctuated the news from time to time. Thus, one can point to the tragedy at the Jonestown settlement of the People’s Temple, which claimed 914 lives in 1978; the suicidal resistance at the Branch Davidian compound in Waco, Texas, in 1993; and group suicides among members of the Order of the Solar Temple and Heaven’s Gate sects in the 1990s as indications of the persuasive power of group indoctrination. The dramatic transformation of Patty Hearst after being captured by the Symbionese Liberation Army (SLA) in February 1974 represents yet another vivid example of effective indoctrination.

In this article I review classic instances of intense indoctrination, outlining procedural events, intervening states, social influence processes, and output variables. The analysis focuses particularly on the debilitating impact that indoctrination procedures have on attentional capacity and how this, in turn, affects several basic social psychological and cognitive processes integral to persuasion and behavior change. This approach complements and updates early conceptualizations of intense indoctrination and provides the framework for a systematic and in-depth discussion of research findings in the areas of attitude change, group process, stereotyping, and human cognition. Moreover, in this discussion I expand on earlier treatments of this process (e.g., Hassan, 1988; Pratkanis & Aronson, 1992; Singer, 1995) by carefully examining the extent to which the experimental evidence supports the view that attentional depletion and other related internal states (e.g., emotion) exacerbate specific persuasive processes.

Early Analyses of Intense Indoctrination

Almost all early descriptions of intense indoctrination acknowledged that such indoctrination involves an initial period of psychological and physical stress (e.g., Lifton, 1961; Sargant, 1957; Schein, Schneier, & Barker, 1961). However, these explanations varied regarding why this stress alters values, behaviors, and senses of self. Some writers simply assumed that a stress-induced state of hypnotic-like confusion was responsible for the resulting changes, whereas other writers took a psychoanalytic perspective, arguing that stress and exhaustion weaken ego strength and elevate dependency needs and guilt, thereby leading the indoctrinee to identify with the indoctrinating agent (e.g., Moloney, 1955). Other writers assumed that conditioning principles were responsible for changes wrought by indoctrination. Sargant (1957), for example, referring to Pavlov’s (1927) canine stress research, argued that the stress of indoctrination led to a state of ultraparadoxical inhibition in which a given evaluative response was replaced with its opposite reaction. In Pavlov’s research, this form of inhibition was inferred when well-conditioned dogs salivated “inappropriately” to negative discriminatory stimuli following extreme stress. Sargant suggested that an analogous process explained the dramatic changes in beliefs and attitudes observed following stressful indoctrination.

Lifton (1961) focused on changes in sense of self in his classic discussion of Chinese communist intense indoctrination techniques. Lifton emphasized how the various highly coercive procedures used by the Chinese effected changes in identity and self-definition. Thus, Lifton conceptualized indoctrination as a process that manipulated guilt, shame, and anxiety to produce a “death” of the original self and a “rebirth” of a reeducated self. Lifton focused on the act of confession as a key procedural element of indoctrination, viewing it as a direct assault on the adequacy of the self-concept and a major source of anxiety and guilt (see also Ofshe & Singer, 1986). Although these models are interesting historically, a model outlined by Schein et al. (1961) is more relevant to the analysis I present below. Whereas early models tended to emphasize a specific mechanism of change, Schein et al. (1961) suggested that an eclectic variety of mediating mechanisms produced changes in beliefs, values, and self-conceptions. In addition, Schein et al. (1961) explicitly acknowledged that although indoctrination altered self-conception, this change, in turn, depended on initial changes in beliefs, values, attitudes, and behaviors (e.g., p.195). In this respect, this account took on a decidedly social psychological perspective. Accordingly, Schein et al.’s (1961) model was unique among early approaches in referring to mechanisms such as cognitive dissonance, conformity, interpersonal communication, and cognitive and semantic organization.

Schein et al. (1961) argued that the indoctrination process involved three stages: unfreezing, changing, and refreezing. Unfreezing was envisioned as a weakening of the stable equilibria that supported the individual’s beliefs and attitudes, especially those concerning the self and influencing agents. Emotional stressors such as fear, guilt, social ostracism, and inner conflict as well as physical stressors such as inadequate diet, sleep deprivation, and the use of physical restraints (handcuffs) were thought to contribute to unfreezing. Changing involved altering the cognitive structure of the indoctrinee. This stage referred to the “mental operations” involved in changing attitudes, beliefs, and semantic systems as well as self-conception. Controlled information, cognitive dissonance, social influence from peers and authority figures, and social identification processes were deemed crucial here, leading to a willingness to uncritically consider and understand the doctrine and interpretations proffered by the group. Finally, refreezing involved stabilizing the changes wrought by the indoctrination process. Schein et al. (1961) believed that this involved integrating new beliefs and values into the overall personality. They argued that interpersonal confirmation and social reinforcement by others played a crucial role in this process. Schein et al.’s (1961) emphasis on social confirmation strongly echoed Sargant’s (1957) thoughts regarding the factors affecting persistence of indoctrination-produced changes, a process Sargant referred to as consolidation.

Schein et al.’s (1961) framework serves as a historical precursor to the analysis presented below in several respects. Schein et al.’s (1961) model considered a variety of conceptual mechanisms as agents of change rather than focusing on a single process. It emphasized that changes in belief, attitude, values, and behavior are key outcomes of indoctrination that are necessary to produce the changes in self-conception that characterize successful instances of thought reform. The model viewed indoctrination as a sequential process moving through several stages and, finally, examined the relevant experimental evidence in social and cognitive psychology. In the analysis presented in this article, I extend this approach by paying particular attention to interactive dynamics between emotional states, attentional capacity, and social psychological phenomena.

Process of Indoctrination

This analysis began with extensive library research on classic instances of intense indoctrination introduced by the communist regimes in the Soviet Union and China in the 1930s and 1940s (e.g., Hinkle & Wolff, 1956; Lifton, 1961; Sargant, 1957; Schein et al., 1961). The analysis was also influenced by an informal series of conversations with approximately 40 former members of cult organizations conducted between 1979 and 1990[1] as well as by written reports by psychologists and others specializing in the treatment of ex-cult members (e.g., Galanter, 1989; Hassan, 1988; Singer, 1979). Many components of the framework outlined here have been tested in a series of laboratory experiments focusing on both group persuasion and the impact of stress on persuasion and social perception. In this article, I review that research as well as other relevant findings.

As several others have noted (e.g., Lifton, 1961; Schein et al., 1961), the indoctrination process can be envisioned as a series of stages. Although such distinctions are somewhat arbitrary, in this analysis I characterize the indoctrination process as having four stages as opposed to the three stages specified by Schein et al. (1961). These are labeled as the softening-up stage, compliance stage, internalization stage, and consolidation stage. These stages are discussed in more detail in the next section. Following that, the analysis lists procedural events commonly found in classic cases of intense indoctrination, various emotional and cognitive intervening states likely to be caused by these procedures, and a number of social psychological processes likely to affect attitudes and values given these procedures and reactions. The various procedural events and process features specified here are largely a synthesis of those outlined in prior accounts (e.g., Lifton, 1961; Schein et al., 1961). However, I expand on prior work by considering the possibility that the emotional and cognitive states alluded to previously interact with (i.e., exacerbate) the social psychological processes thought to underlie the belief and attitude changes produced by intense indoctrination. Finally, I specify a number of output behaviors that are typical of successful indoctrination. These output behaviors have the following features: (a) they reflect a radical departure from previous values, attitudes, and behaviors; (b) they seem to be emitted freely at the moment (i.e., they are not the result of any current physical or material threat); and (c) they involve substantial costs for the indoctrinee. This cost can involve such things as negative publicity, violation of prior commitment, time, money, lost opportunity, or injury to self or loved ones.

Stages of Intense Indoctrination

A variety of tactics are used to recruit individuals to totalist groups. For cases in which new recruits are volunteers (e.g., most cases of cult recruitment), these tactics can include such things as repeated personal contact, group meals, lectures, and weekend retreats (Hassan, 1988). During these initial contacts, the recruits may be showered with attention and praise and are likely to be carefully “squired” by enthusiastic group members. After these initial contacts, strong efforts are made to physically separate the recruits from their normal environment. This is often accomplished by moving recruits to a secluded setting. In cases involving coercion, of course, abduction and arrest serve to locate the indoctrinee in an indoctrinating context. Following Hassan’s suggestion, I reserve the term coercive persuasion for such forceful indoctrination contexts. The major distinguishing feature of such coercive settings is that initial stress levels tend to be higher and more salient to indoctrinees given the greater threats to their person and freedom. A surprising number of other features, however, are common to both voluntary and coercive forms of intense indoctrination, such as physical and social separation, changes in diet, sleep deprivation, peer pressure, and emotional manipulations. Once recruits are separated from their normal social context, the indoctrination process unfolds.

Stage 1. In the softening-up stage, the indoctrinee is typically isolated from friends and family. Efforts are made to keep the indoctrinee confused, excited, tired, disoriented, and, sometimes, abused and frightened. This period can be as brief as a few hours, although it usually is longer. Stress levels are most pronounced in the more coercive instances of intense indoctrination. Patty Hearst (1982), as an example, was held handcuffed in a dark closet for more than four weeks; she was graphically and repeatedly threatened with death and exposed to various other forms of severe psychological and physical abuse during this period of confinement.

Ms. Hearst’s experience corresponds in a disturbing way to isolation procedures used originally by Stalin’s NKVD in some of the first cases of systematic indoctrination ever reported. According to an excellent description by Hinkle and Wolff (1956), the Soviet secret police used particularly intense pressure to “break” new prisoners. Soviet prisoners served months in solitary confinement, during which they were not allowed contact with anyone except the interrogator. Isolation was so complete that when prisoners passed each other while being escorted through the halls, they had to turn their heads from each other and look at the corridor wall. Lights were lit continually, prisoners were deprived of sleep, cells had no windows, and interrogation, if it occurred at all at this point, took place at irregular intervals. All of this made it difficult for the prisoners to maintain time orientation. If prisoners asked what they were charged with, they either were ignored or were told that the State made no mistakes, knew their crimes, and expected the prisoners to show their sincerity by confessing to what the State already knew. In short, prisoners were given the oftentimes impossible chore of figuring out why they were being “rehabilitated.” In addition, prisoners’ early attempts at confessing and giving information were generally rejected as inadequate or insincere. As a variation on this technique, Chinese communist procedures of the 1940s and 1950s typically involved abusive social pressure from more “advanced” prisoners (coupled with the use of manacles) in lieu of solitary confinement as a key means of elevating stress.

These police procedures represent an extreme instance of softening up an indoctrinee. More subtle procedures, however, can also be effective at generating the stress necessary to prepare individuals for effective indoctrination. For example, Singer (1995), describing cult indoctrination, referred to changes in diet, appearance, sleep, arousal, and social context as effective stressors (see also Galanter, 1989). The general point is that some period of initial stress apparently increases the effectiveness of an intense assault on an individual’s attitudes, beliefs, and values.

Stage 2. In the compliance stage, the recruit tentatively “tries out” some of the behaviors requested by the group, more or less going through the motions or paying lip service to many of the demands made by the group. Often, the recruit views this as a period of exploration to see what the group is like or what such compliant behavior nets him or her. In other cases, compliance occurs in response to social pressure. Although curiosity and politeness account for some acts of compliance, other instances are induced by well-known compliance techniques, such as reciprocal concessions procedures, appeals to authority, and group pressure (Cialdini, 1993). Finally, in coercive settings, individuals often comply in an attempt to reduce threatening or aversive aspects of the situation.

Stage 3. In the internalization stage, the recruit starts to consider aspects of the group belief system. This can be triggered by various mechanisms including curiosity, persistent social pressure, and the need to justify prior compliance. This stage, analogous to Schein et al.’s (1961) changing stage, can be completed within a week in some cases. In this stage, standard theories of social influence and persuasion (e.g., Cialdini, 1993; Petty & Cacioppo, 1986) become applicable in that the individual reevaluates old beliefs and considers new ones.

Stage 4. In the consolidation stage, the recruit solidifies his or her newly acquired allegiance to the group. This may entail making various costly behavioral commitments that are hard to undo (e.g., donating one’s personal possessions to the group or recruiting new members), isolating oneself from nongroup members, or selective exposure to information. This final stage of indoctrination is marked by the recruit’s total acceptance of group doctrine and policy with a minimum of close examination. In this stage, the primary reaction of the recruit to negative information about the group is denial and rationalization. Thus, events and information are selectively interpreted and attended to. As this implies, cognitive dissonance mechanisms appear to be highly relevant in this stage. Individuals at this stage are dominated by what Chaiken, Liberman, and Eagly (1989) referred to as defense motivation (i.e., the desire to hold or defend a specific attitudinal position) when they process attitude-relevant information. For this reason, the indoctrinee who reaches the consolidation stage will be highly resistant to persuasion from those outside the group.

Comments. The four stages outlined here obviously draw heavily from prior analyses. For example, Sargant (1957) discussed the consolidation notion as early as 1957. Similarly, Schein et al.’s (1961) three stages of unfreezing, changing, and refreezing are very similar to the four-stage framework. The major difference between the two is that Schein et al.’s (1961) unfreezing stage is subdivided by the four-stage framework into the softening-up and compliance stages to more explicitly reflect the activities and processes present in the early phases of indoctrination. An important caveat in this discussion of stages is that they are intended to serve as guidelines rather than inflexible and mutually exclusive partitions. Stages will vary in duration from case to case, and each prior stage is thought to blend into the next. Moreover, some rare individuals remain with their groups while never making the transition into the consolidation stage or, alternatively, they may slip in and out of this stage. This may lead such individuals to experience conflict and doubt akin to “crises of faith” seen in various religions.

Procedural Features of Intense Indoctrination

Standard reference works regarding indoctrination (e.g., Schein et al., 1961; Singer, 1995) indicate that a series of procedural features are commonly observed. One subset of these procedures appears to contribute directly to the softening-up stage described previously. These include social disruption (i.e., separation from friends and family), physical stress (especially sleep loss and altered diet), fear or guilt manipulations, regimented daily activity schedules, alteration in appearance (clothing, posture, hairstyle), and carefully orchestrated social pressure. Additional procedural features include public self-criticism or confession, repetitive mental activity (e.g., meditation, memorizing doctrine), the presence of strong authority figures, a “messianic” group purpose (from which group members can derive a sense of importance), stereotypical depiction of nonmembers as evil or misguided, escalation of commitment in which the recruit is asked to engage in increasingly costly behavior over time, and censorship of information. Although I have identified certain procedures as contributing to the softening-up stage, this is not meant to imply that they only have effects at this stage. As shown subsequently, it seems likely that certain of these procedural events (e.g., physical stress) may contribute to reactions at various stages.

Indoctrination and Internal States

The procedures outlined in the previous paragraph are commonly thought to produce a variety of strong internal reactions during intense indoctrination attempts (e.g., Hassan, 1988; Lifton, 1961; Singer, 1995). These states include guilt, fear (or anxiety), confusion, dependency, depleted attentional capacity (attributable to sleep loss, malnutrition, emotionality, and high rates of activity), dissociative states provoked by chanting or meditation, and low self-esteem attributable to imposed self-criticism and requirements to learn the often inscrutable doctrine of the group.

Mechanisms of Internalization

The reactions of indoctrinees during the softening-up and compliance stages of indoctrination do not require elaborate commentary. Individuals react to stress, curiosity, or social pressure in fairly predictable ways. Similarly, a number of well-known persuasion processes can account for attitude and value change observed in the internalization stage. Thus, various recent accounts of intense indoctrination identify mechanisms such as conformity processes, desires for group acceptance, heuristic message processing, group polarization, groupthink, stereotyping (of outgroup members), foot-in-the-door processes, and cognitive dissonance mechanisms as important mediators of attitude and value change in these manipulative settings (e.g., Pratkanis & Aronson, 1992; Singer, 1995). Although a number of writers allude to such social psychological processes in their discussion of intense indoctrination, a number of other issues remain intriguing or controversial. One set of issues concerns the processes involved in the attitude consolidation phase. A second set of issues concerns the likelihood that standard social influence processes have greater impact on recruits because of the internal states typically generated during intense indoctrination. This “enhanced impact” is a major focus of the current treatment. A key assumption here is that the internal states produced by intense indoctrination impair attentional capacity, thereby dramatically enhancing the effectiveness of various social psychological processes. It is my contention that this interactive process is a major reason why intense indoctrination can be viewed as a qualitatively unique form of social influence. By explicitly considering how such interactive dynamics affect compliance, internalization, and consolidation, this analysis complements and extends prior treatments of intense indoctrination. These various issues are covered in the following sections.

Mechanisms of Consolidation

During the consolidation stage, the recruit comes to uncritically accept the various aspects of group policy and doctrine. In this stage, attitudes are held with such tenacity that contradictory evidence is generally explained away. Cognitive dissonance theory provides one compelling explanation for such reactions. According to this view, the disciple’s need to justify the costly and often irrevocable behavioral commitments that escalate over the course of intensive indoctrination leads to the development of extreme and resistant attitudes. For example, several classic case studies of doomsday groups (Festinger, Riecken, & Schachter, 1956; Hardyck & Braden, 1962) provide cogent analyses of how cognitive dissonance mechanisms can contribute to group loyalty even in the face of extremely dramatic disconfirmations of group doctrine and prophecy. A second (cognitive miser) hypothesis I suggest regarding the development of attitude consolidation is that eventually, group members find it quite effortful to continually agonize over whether the group’s ideology is correct or justified. Both the individual’s limited attentional capacity (Kahneman, 1973) and attentional fatigue (Cohen, 1978) should discourage the individual from prolonged consideration of group doctrine. Consider the experience of Patricia Hearst (1982):

Once I came to accept in my own mind the stark reality of my new life—that I was now a part of the SLA…the racking turmoil within me subsided. My everyday life became somewhat easier. All I had to do was to go along with them and that became easier day by day. (p.164)

In addition to this cognitive miser view and the dissonance interpretation of attitude consolidation, I add Sargant’s (1957) argument that social confirmation contributes to the intense attitude persistence created by indoctrination (see also Schein et al., 1961). Sargant argued that confirmation of one’s views by others increased the probability that the new attitudes would become well integrated into one’s sense of self and the other attitudes and values in one’s cognitive system. As a result, these attitudes were expected to be costly to change. A number of well-established research findings support these original speculations. First, research on various forms of attitude involvement indicates that ego-involving attitudes are indeed relatively resistant to change (Johnson & Eagly, 1989). The research on social identification makes a related point. This research assumes that group membership often will be a key element in self-definition and self-esteem (e.g., Mackie & Cooper, 1984; Wood, Pool, Leck & Purvis, 1996). As predicted, this research indicates that attitudes linked to valued groups are resistant to attack especially from outgroup members (David & Turner, 1996). In addition, Pool, Wood, and Leck (1998) reported that individuals experience lowered self-esteem when they learn that their opinions are contradicted by valued in-group members. This, of course, supports the view that there are self-related reasons for individuals to adhere to (attitudinal) group norms in cases in which they strongly identify with their groups.

Also in accord with Sargant’s (1957) social confirmation view, the group interaction literature indicates that discussion between like-minded individuals (one form of social confirmation) generally provokes polarization of attitude (Isenberg, 1986) as well as heightened confidence (Janis, 1972; Stasser, Taylor, & Hanna, 1989). Indeed, full discussion may not even be necessary to provoke such changes. Baron, Hoppe, Linneweh, and Rogers (1996) recently reported a series of studies indicating that when individuals learned only that others agreed with their judgments and opinions (without knowing why), they increased the extremity of these responses and felt greater confidence regarding these more extreme views (see also Luus & Wells, 1994).

Interactive Processes in Indoctrination

In short, a variety of mechanisms are capable of producing the attitude consolidation outlined in this model. A second set of interesting issues concerns the extent to which the internal states described here interact with these processes and those listed as causes of compliance and attitude internalization. This is discussed in the following sections.

Interactive Dynamics and Compliance

Although there are not abundant data addressing how compliance is affected by internal states such as fear, self-esteem, and attentional capacity, there are reasons to expect strong relations. For example, feelings of guilt and low self-esteem should heighten the effectiveness of requests to publicly criticize (and confess) one’s past actions that so often characterize intense indoctrination. In the same vein, attentional depletion should leave one more susceptible to the compliance manipulations generally used to elicit behavioral commitment. Cialdini (1993) outlined a number of creative strategies for resisting compliance procedures. These strategies all entail mindful and sophisticated awareness of subtle social pressures integral to such techniques. Such awareness is likely to be impaired when attentional capacity is limited. Dolinski and Nawrat (1998) recently reported that individuals were far more likely to comply with various requests (for money, experimental participation, or charity work) soon after a brief period of fear. In accord with this argument, they attributed the effectiveness of this “fear-then-relief” compliance procedure to the depletion of attentional capacity that is thought by many to occur soon after a person experiences strong emotion (e.g., Cohen, 1978).

Interactive Dynamics During Internalization

Arousal, capacity, and superficial message processing. The change in private beliefs that occurs during the internalization stage is hypothesized to be affected by the amount of attentional capacity available to individuals during intense indoctrination. Persuasion researchers agree that one way of resisting a flawed persuasive attempt is to carefully evaluate the merits of the message. In cases of indoctrination, individuals are often cajoled into violating their values, engaging in costly behaviors, or committing themselves to irrevocable decisions in service of fanciful, paranoid, and unverifiable doctrines. For example, the members of the SLA committed a variety of very public criminal actions, convinced that they were at the brink of triggering and leading a mass “people’s revolution” across the United States. One would think that careful processing of such teachings would draw attention to faulty logic, incomplete verification, undiscussed consequences, or erroneous information. However, careful, systematic processing requires a good deal of effort and concentration (Chaiken, 1987; Petty & Cacioppo, 1986). People who are debilitated because of malnourishment, sleep deprivation, or overwork should be less capable of carefully processing message characteristics. This, in turn, should heighten the impact of peripheral cues such as audience response, speaker confidence, or emotional manipulations. In short, the fact that a variety of indoctrination procedures deplete attentional capacity may explain why individuals in such settings so often appear to accept even fanciful aspects of group doctrine with such little critical objection.[2]

A variety of findings indicate that this attentional capacity prediction has validity. It is congruent with research that documents that distraction and time pressure produce less careful message processing (e.g., Baron, Baron, & Miller, 1973; Kruglanski, 1989). The strong arousal generated during intense indoctrination represents another factor that may increase the likelihood of heuristic processing. A number of writers have reviewed evidence showing that emotional arousal depletes attentional capacity (e.g., Eysenck, 1977). Given that careful message processing is presumed to occur only when one has the necessary capacity and motivation, Baron (1986) argued that if arousal lowers such capacity, it should decrease the likelihood of elaborate message processing (see Bodenhausen, Sheppard, & Kramer, 1994, for a related view).

This argument, in fact, echoes early theorizing by Sargant (1957), who argued, on the basis of anthropological and historical observations, that emotional excitement somehow disrupted critical thinking and caused the ultraparadoxical reversal of previously conditioned preferences referred to earlier. More interestingly, Sargant suspected these effects were caused by some disturbance of cerebral function. He presciently suggested that a form of cortical reciprocal inhibition (pp. 43, 55) may be involved, a position espoused some 16 years later by Walley and Weiden (1973).

Until recently there was not a great deal of data relevant to the idea that strong emotion would heighten superficial message processing, but in the last few years a number of studies have reported support for this view. Sanbonmatsu and Kardes (1988) reported that a step exercise (arousal) manipulation increased responsiveness to peripheral cues while decreasing audience responsiveness to message quality differences. This pattern, of course, is typically associated with superficial (i.e., peripheral) processing of message content. Although Sanbonmatsu and Kardes did not use an emotion manipulation, Gleicher and Petty (1992) varied moderate fear by warning students about either a new illness on campus or campus crime. They found that when peripheral cues were easily available to participants, these fear manipulations decreased the audience’s sensitivity to message quality differences, thereby again showing evidence of superficial message processing. Baron, Inman, Kao, and Logan (1992) reported a similar finding using a more naturalistic emotional manipulation.

In laboratory tests, it is hard to provoke strong levels of fear given that participants are free to terminate their participation and, moreover, are generally under the (correct) impression that strong stressors are unlikely to be used in modern psychological research. To develop a nonlaboratory alternative, Baron, Inman, et al. (1992) manipulated emotional arousal in a student dental clinic. In this setting, patients regularly received a complete description of their upcoming dental treatment. In this study, some dental patients were randomly assigned to receive this graphic description just minutes before hearing a message. These “high-fear” patients were more persuaded by a purposely flawed message (accompanied by an enthusiastic audience response) than were patients who did not receive this stressful description until after reacting to the message. Thus, fear led to less careful message processing in this study.

Moreover, fear is not the only emotion to provoke “low-effort” message processing. Bodenhausen et al. (1994) found that an anger manipulation heightened an audience’s responsiveness to peripheral message cues. In addition, this effect did not occur when a nonarousing emotion (sadness) was manipulated. In short, a number of studies indicate that arousing emotions tend to decrease the effort people employ when processing persuasive content. Given the strong emotional arousal frequently present during the typical instance of intense indoctrination, these results imply that indoctrinees are not likely to engage in very careful processing of the persuasive manipulations they are exposed to. Moreover, any resulting attitude change may be enhanced if the message or doctrine suggests a means to control the threat or danger used to trigger the emotion. This is particularly true in cases in which the persuasive messages used by the indoctrinating group suggest simple (i.e., attainable) avenues of threat reduction. As Rogers (1975) pointed out, under these conditions fear-based persuasive appeals are particularly effective at inducing attitude change.[3] Such appeals are indeed common in charismatic groups in which adherence to doctrine and loyalty to the group are proffered as means of avoiding the various threats and dangers made salient (e.g., Sargant, 1957; Singer, 1995).

One objection to this discussion of overload and superficial message processing is that, in theory, attitudes formed on the basis of superficial processing are less stable and less resistant to counterpropaganda than are attitudes formed or changed through more effortful processing (Petty & Cacioppo, 1986). Although there is emerging laboratory support for these predictions (Eagly & Chaiken, 1993), these laboratory studies do not reproduce the situation created by intense indoctrination, in which the target of persuasion remains in a controlled social setting after message exposure, surrounded by others who confirm and reinforce the beliefs in question, and in which social pressure is used to elicit a series of behaviors congruent with the new beliefs. Moreover, in many indoctrination situations, the target of persuasion remains socially dependent, sleep deprived, or otherwise debilitated well into his or her indoctrination experience, thereby making a careful reevaluation of doctrines and beliefs extremely difficult. Under these circumstances, it is expected that attitudes and beliefs changed initially as a result of heuristic message processing will become solidified, more extreme over time, integrated into other aspects of self and, as a result, relatively impervious to change (see Maass & Clark, 1984, for very similar reasoning).

Arousal, capacity, and conformity. Almost all accounts of intense indoctrination acknowledge that conformity pressure is carefully applied in the typical case of intense indoctrination (e.g., Galanter, 1989; Lifton, 1961; Singer, 1995). More important, however, the procedures and states previously discussed are known to potentiate classic social influence effects. Social influence effects are known to be more powerful in times of confusion, ambiguity, and low personal confidence (e.g., Deutsch & Gerard, 1955; cf. Baron, Kerr, & Miller, 1992). The complex nature of group doctrine coupled with the debilitating procedures used in the early stages of indoctrination should produce just such confusion and low confidence. Conformity effects also are strengthened greatly if the individual is faced with a unanimous group consensus (Wilder & Allen, 1977). Indoctrinating groups go to some lengths to provide at least the illusion of such consensus by carefully orchestrating the social surroundings of each recruit and removing or isolating those recruits who express doubts. Moreover, this consensus is usually expressed with confidence and enthusiasm. Although few studies have examined the impact of confederate confidence in majority influence studies, Baron, Vandello, and Brunsman (1996) reported that high (manipulated) confederate confidence increased both the rate of conformity and individuals’ confidence in their (incorrect) conforming views. These data complement Nemeth and Wachtler’s (1974) report that a manipulation of nonverbal confederate confidence increased social influence in a minority influence paradigm.

The fact that conformity effects are enhanced by manipulations such as judgment difficulty and (low) individual confidence (e.g., Deutsch & Gerard, 1955) is congruent with this focus on attentional capacity as a key process leading to successful indoctrination. Various theorists have suggested that individuals will be more likely to rely on social information when their individual capacity is challenged or overwhelmed by a judgmental task (e.g., Festinger, 1954). One extension of this logic is that a unified group consensus serves as a heuristic cue (i.e., “all those people can’t be wrong”) that is more likely to be relied on by individuals when their capacity is inadequate (or they perceive it to be inadequate) for meeting the demands of a judgmental task (Chaiken & Maheswaran, 1994; Eagly & Chaiken, 1993). As noted, procedural features of intense indoctrination create just such conditions.

In addition to the results discussed in the preceding paragraph, some data indicate that conformity effects are more pronounced when fear levels are high. Darley (1966) reported that conformity was increased when individuals were threatened with the prospect of electric shock. Other forms of stress and arousal appear to have similar effects. Kruglanski and Webster (1991) examined groups of Israeli scouts who were attempting to agree on a camp location. Kruglanski and Webster found greater rejection of (confederate) deviates when time pressure or aversive noise was present. Similarly, numerous studies testing terror management theory indicate that individuals exhibit less tolerance for nonnormative behavior from others following a manipulation increasing the salience of their own death (e.g., Florian & Mikulincer, 1997). Thus, a variety of manipulations that should impair attentional capacity (i.e., emotion, time pressure, task ambiguity) enhance the effects of conformity pressure. These results have obvious implications for social influence applied in the stressful milieu of intense indoctrination.

Another feature of intense indoctrination is that the decisions and judgments in question often are ones of substantial cost and importance for indoctrinees. On initial reflection, one may argue that such importance may weaken social influence effects. If the decision is crucial, will individuals feel enough involvement to buck group pressure and make up their minds for themselves? The answer to this question appears to be “no” in cases of even moderate ambiguity. Baron, Vandello, and Brunsman (1996) varied decision importance through a combination of financial inducements and ego involvement (all increasing the importance of judgment accuracy). They found that for a moderately difficult judgment task (i.e., having a 28% error rate), conforming to the inaccurate confederate norm increased as task importance increased (see Figure 1). Moreover, as noted previously, this conformity was accompanied by increases in confidence in those conditions in which the confederates acted highly confident in their opinions. In short, Baron, Vandello, et al.’s (1996) data suggested that the conditions generally found during intense indoctrination (i.e., ambiguous and important judgments made in the presence of unanimous and highly confident peers) can produce particularly high rates of confident conformity.

Of course, group factors on occasion can undermine indoctrination procedures (Schein, Hill, Williams, & Lubin, 1957; Schein et al., 1961). Despite common belief, the indoctrination of several thousand U.S. prisoners of war during the Korean War was generally unsuccessful at producing lasting, internalized attitude change, with only a handful (n = 21) of these individuals actually refusing repatriation (Myers, 1998). Most accounts (e.g., Schein et al., 1957) attribute the resistance of the American prisoners of war in Korea to the mutual group support the troops managed to provide each other during indoctrination (e.g., crossing fingers during public confession). Such support, however, often can be eliminated simply by keeping recruits separate from each other in the early stages of indoctrination and surrounding them instead with dedicated members of the indoctrinating group.

Figure 1. The percentage of critical trials in which conformity occurred as a function of task difficulty and judgment importance. Note. From “The forgotten variable in conformity research: The impact of task importance on social influence,” by R. S. Baron, J. Vandello, and B. Brunsman, 1996, Journal of Personality and Social Psychology, 71, p. 919. Copyright 1996 by the American Psychological Association. Adapted with permission.

Arousal, attentional capacity, and the stereotyping of outgroups. A key feature of most groups that use intense indoctrination is that the group doctrine derogates outgroups (i.e., nonmembers) as unworthy, inferior, or dangerous. In short, indoctrination involves the development and strengthening of stereotypical thinking about outgroups in the internalization and consolidation stages. Given that stereotypical thinking is assumed to be a superficial (i.e., effort-saving) form of cognition (e.g., Allport, 1954), the same logic discussed in the preceding section suggests that the emotional arousal present in intense indoctrination should heighten such stereotypical tendencies (cf. Baron, 1986; Wilder & Shapiro, 1988). That is, if intense indoctrination depletes attentional capacity through arousal, lack of sleep, and high rates of regimented activity, one should expect individuals to use more intellectual shortcuts in perception and interpersonal judgment during such indoctrination (Baron, 1986).

Since the late 1980s, a good deal of converging evidence supports this prediction at least regarding arousal. Baron and Moore (1987) found that arousal produced by a physical stress test (cycling) increased the self-referent memory effect (thought to reflect the influence of accessible self-schemas). Kim and Baron (1988) reported that cycling stress heightened the illusory correlation stereotyping effect. In addition, Paulhus, Martin, and Murphy (1992) found that white noise (a common arousal manipulation) heightened gender stereotyping (see also Paulhus & Lim, 1994; Wann & Branscombe, 1995).

Although these studies did not manipulate emotional arousal, several studies report similar effects with emotional manipulations. Baron, Inman, et al. (1992), in a follow-up study to Kim and Baron (1988), found that high levels of dental fear just before dental treatment exacerbated the illusory correlation phenomenon (see Figure 2). In a similar vein, Friedland, Keinan, and Tytiun (1998), in a study with Israeli flight cadets, found that illusory correlation effects were more pronounced during the more stressful periods of flight training, whereas Keinan, Friedland, and Evan-Haim (in press) found that Israeli airport security trainees agreed more with stereotypical statements (e.g., professors are absentminded) just before a critical examination.

In a third study by this Israeli group, Keinan, Friedland, and Arad (1991) found that stressed individuals used broader categories when grouping objects and stimuli. In addition, Wilder and Shapiro (1988) reported that both fear and embarrassment increased the extent to which individuals judged people on the basis of their group membership as opposed to their individual characteristics. Wilder and Shapiro (1989) reported very similar data when competition-induced anxiety was manipulated. Finally, several clinical studies indicate that phobic anxiety heightens illusory correlation effects linking the feared stimulus with aversive outcomes (e.g., Purdy & Mineka, 1997). In short, the relevant data consistently support the prediction that arousal and emotional excitement increase the tendency for individuals to process social information in a stereotypical manner. Thus, the emotional manipulations used in intense indoctrination appear to increase the likelihood that individuals will accept the stereotypical depictions of outgroups advocated by that indoctrination.

Other investigators report that manipulations of time pressure (Jamieson & Zanna, 1989; Kruglanski & Freund, 1983), circadian rhythm incongruity (Bodenhausen, 1990), and information overload (Pratto & Bargh, 1991) also exacerbate stereotypical judgments and attributions. Such results bolster confidence that depleted attentional capacity, however produced, increases the tendency to employ cognitive simplifications such as stereotypes. Because both attentional overload and emotional arousal are generally kept high in the initial stages of intense indoctrination, these time pressure and overload results, viewed in conjunction with the findings regarding arousal, are quite congruent with the view that depletion of attentional capacity is an important contributing factor to recruits’ acceptance of outgroup stereotypes during many forms of intense indoctrination.

Explaining Why Arousal Impairs Cognition

The findings outlined here regarding the impact of emotional arousal on message processing, conformity, and stereotyping raise basic questions about why superficial processing occurs with greater likelihood in emotional conditions. One view is that emotional arousal produces physiological reactions that diminish our actual capacity to process inputs. Thus, Walley and Weiden (1973) argued that strong sympathetic activation accompanying most arousing emotional provocations increases the strength of recurrent lateral inhibition in the cortex. This form of inhibition refers to cases in which excitation in one area of the cortex inhibits excitation in nearby areas, thereby producing short-term decrements in attentional capacity (cf. Eysenck, 1977; Sargant, 1957). This involuntary reaction should occur whenever there is vigorous activity in the sympathetic nervous system. A second physiological perspective stems from Gur et al.’s (1988) report that strong negative emotions reduce cerebral blood flow, thereby disrupting cortical activity (cf. Bodenhausen, 1993). Whether reduced capacity is thought to be caused by reduced blood flow or reciprocal inhibition, such physiological reactions should be automatic and uncontrollable. An alternative view is that arousal impairs social processing because it triggers such things as appraisal, attribution, and coping (cf. Lazarus, 1981; Schachter, 1964). Given that such processes are likely to be at least partially under conscious control, this perspective suggests that stressful arousal reduces the capacity for tasks other than appraisal and coping because of our decisions and priorities regarding attention allocation (Ellis & Ashbrook, 1988; Kahneman, 1973).
This attention allocation view (Baron, Inman, et al., 1992) is analogous to a motivational view of superficial processing under fear; that is, superficial processing occurs because we have decided that other cognitive functions (appraisal or coping) merit higher priority than message processing. This low motivation view is plausible in most of the message-processing studies discussed previously, in which the message topic has little to do with the source of the arousal or emotion. Bodenhausen (1993) labeled such manipulations as cases of incidental affect. In contrast, the physiological perspective is analogous to a low ability explanation; aroused people process less input because they have less overall capacity.

The available data suggest support for each of these views. Consistent with the attention allocation view, Baron, Logan, Lilly, Inman, and Brennan (1994) manipulated fear among dental patients, as did Baron, Inman, et al. (1992), but used a message that was relevant to the topic of dental hygiene (the benefits and dangers of fluoridated public water). This is analogous to what Bodenhausen (1993) referred to as a manipulation of integral affect. Here attention to the message offered patients information that might allow them to avoid additional future dental treatment. As a result, we would expect fearful patients to have greater interest and involvement in the topic than low-fear patients. In accordance with the attention allocation perspective, fearful patients in this study showed more evidence of central message processing (a stronger message quality effect) than low-fear patients (see Figure 3). If stress simply “makes you stupid (temporarily),” as argued by the physiological perspective, such results should not occur. Also relevant is a study by Gleicher and Petty (1992), who found that a manipulation of moderate fear lowered careful message processing provided that the message contained an early, prominent peripheral cue (a strong summary recommendation by the expert source). More interesting, however, when this cue was absent, the fearful participants processed the message as carefully as low-fear participants. In short, the study by Gleicher and Petty, as well as that of Baron et al. (1994), suggests that fearful individuals can be induced to process carefully under certain key conditions, contrary to the physiological perspective.

It is premature, however, to completely discount the physiological perspective. First, both attention allocation and physiological mechanisms may influence information processing. For example, it is possible that a moderate level of emotional arousal only partially depletes attentional capacity. If so, under moderate arousal there may be enough residual capacity to process necessary tasks, but the individual may be more reluctant to do so because this processing would now require using a greater percentage of remaining available capacity. This should entail greater psychological effort and therefore may be resisted unless motivation is quite high or other cognitive shortcuts are unavailable. Gleicher and Petty’s (1992) results may reflect such a dynamic. Moderate fear led to a greater reliance on available peripheral cues (indicating a motivational reluctance), but when peripheral cues were absent, careful processing was possible.

However, if one assumes that emotion depletes capacity, it follows that if emotion becomes extreme enough, it may so deplete total capacity that even well-motivated processing may suffer. One recent study by Meijnders (1998) reported such data. Fear was created by varying the explicitness of information regarding the substantial dangers of global warming (e.g., starvation, flooding), and then the (nonstudent) participants reacted to a message relevant to that problem (regarding an energy-efficient light bulb). Meijnders found that moderate levels of (integral) fear increased participants’ sensitivity to message quality differences (replicating Baron et al., 1994) but that at very high levels of fear, such careful processing was not apparent (i.e., both high- and low-quality messages produced equivalent amounts of persuasion). These results suggest that even when people are motivated to process carefully, superficial message processing may occur if fear levels are intense enough. This may be even more likely in cases of intense indoctrination in which other sources of cognitive debilitation (in addition to fear) are commonly present. Clearly, this is a topic that requires more research attention.

Interactive Dynamics During Consolidation

Fear, arousal, and dissonance. Returning to the discussion of interactive dynamics, a common feature of intense indoctrination is that indoctrinees are enticed into making a variety of costly behavioral decisions regarding the group. For example, it is common to have indoctrinees make public statements that espouse group doctrine. Such statements are also often self-critical. Thus, for example, American prisoners of war in Korea were pressured to make written and then newsreel confessions of supposed misdeeds (Schein et al., 1961). These procedures, of course, correspond closely to the well-known cognitive dissonance induced-compliance procedure. In the same vein, dissonance mechanisms fit nicely with the frequent use of escalated commitment (e.g., Schwartz, 1970) as an indoctrination feature. Even more relevant to this discussion of interactive dynamics, however, are data indicating that dissonance mechanisms may be enhanced by the fear, arousal, and confusion that characterize indoctrination.

Dissonance theory assumes that people change their attitudes after expressing propaganda because they feel unpleasant arousal. In accord with this notion, several researchers have shown that irrelevant sources of arousal can heighten dissonance reactions. Thane Pittman (1975), for example, threatened some students with electric shock just before having them write an essay favoring raising tuition. In one condition, he had a confederate state that this essay writing was causing him to feel aroused. Presumably, individuals hearing this statement were more likely to interpret any fear-induced arousal they experienced as feelings of dissonance. As predicted, these frightened participants were more persuaded by their essay than were nonfrightened participants. Similarly, Cooper, Zanna, and Taves (1978) found that dissonance effects were enhanced by amphetamine-produced arousal, whereas Fazio and Martin (reported in Fazio & Cooper, 1983) found similar effects using exercise-induced arousal. In short, there is good reason to assume that the fear and arousal generated by intense indoctrination will often be rechanneled and experienced as dissonance.

Attentional capacity and perception of choice. One argument against applying dissonance and self-perception views to indoctrination is that the mechanisms described by these theories are presumably unlikely to affect attitudes if individuals feel they had little choice about their initial compliance. For instance, a person could make sense out of donating all of his or her worldly possessions to the People’s Temple by deciding that act was coerced by such forces as peer pressure, fear of humiliation, or threats to family members. However, in many settings in which coercion exists, that coercion may be ignored, deemphasized, or forgotten by the indoctrinee, particularly if his or her attentional capacity is impaired and coercive pressures are psychological rather than physical or material.

As most cognitive dissonance researchers are well aware, a variety of subtle social and situational forces can reliably elicit dissonant behavior from individuals while generally preserving the perception of free choice among most respondents. In these settings, it takes a particularly perceptive and attentive individual to detect the subtle forces that provoked the behavior. In accord with this view, Stalder and Baron (1998) reported that individuals who had sophisticated and complex attributional styles did not show standard dissonance-produced attitude change in an induced compliance (essay writing) study (see Figure 4). One extrapolation from the Stalder and Baron results is that when attentional capacity is impaired, the ability to avoid or reduce dissonance by making an (accurate) external attribution for one’s actions will be compromised. If so, when capacity is reduced, it is likely that even moderately coercive pressures may be overlooked, leaving the individual feeling personally responsible for actions that were, in truth, provoked by the manipulative aspects of the setting. This is particularly likely in settings in which the pressure results from psychological and social forces rather than material or physical forces and in which individuals are confused, aroused, or debilitated by various stressors. It is interesting to note that both Cooper et al. (1978) and Fazio and Martin (cited in Fazio & Cooper, 1983) reported data that support these contentions. In these induced compliance dissonance studies, individuals showed evidence of behavior-consistent attitude change even under low-choice conditions provided that they had been aroused by either drugs or exercise. One explanation for such effects is that the arousal in these studies diminished the ability of low-choice individuals to correctly attribute their behavior to situational causes, thereby leaving them with the need to justify their attitude-discrepant behavior.

Existential pressure and identity change. A number of the indoctrination procedures discussed here heighten guilt, lower self-esteem, and generally create confusion about appropriate and viable self-conceptions. It is likely that these states will facilitate changes in identity. A person undergoing intense indoctrination is likely to confront aspects of his or her actions and values that are embarrassing or inconsistent with his or her idealizations about the self (especially given the frequent occurrence of self-criticism and confession). Indeed, recent research indicates that under certain conditions, individuals can be led to confess and to remember reprehensible behaviors that, in fact, they never committed (Kassin, 1997). In short, during intense indoctrination, self-conception should be relatively labile because of these alterations in self-esteem. As a result, changes in identity (e.g., from student to revolutionary or from businessman to social-equality pioneer) and corresponding alterations in values or their importance are more likely (Ofshe & Singer, 1986). Such changes seem particularly likely when the indoctrinating group offers recruits a messianic cause. As social critic Eric Hoffer (1951) pointed out, by attaching themselves to a transcendent cause, true believers can feel special, dedicated, and selfless, thereby remedying the very feelings of insignificance and low self-esteem that are heightened in the earlier stages of the indoctrination process. This represents an important and meaningful set of psychological benefits, especially to those who were wrestling with such existential concerns even before indoctrination (see Zimbardo, 1997). In accord with this view, Galanter (1989) pointed out that those who deviate from cult norms often report increases in neurotic anxiety (see Pool et al., 1998, for experimental verification of this observation). Indeed, these feelings of purpose and certainty are aspects of cult experience that are missed most frequently by ex-cult members (Hassan, 1988; Singer, 1995).

Figure 4. Attitude change scores as a function of choice and attributional complexity: High attributional complexity eliminates standard induced compliance dissonance effects. Note. From “Attributional complexity as a moderator of dissonance-produced attitude change,” by D. R. Stalder and R. S. Baron, 1998, Journal of Personality and Social Psychology, 75, p.453. Copyright 1998 by the American Psychological Association. Adapted with permission.

In coercive settings, these processes of identity change are also likely to be supplemented by such processes as identification with the aggressor and aspects of the Stockholm Syndrome. Apparently, there is a tendency to rely on and identify with authority figures in times of stress. Whether this is a regression to childhood learning or is attributable to other mechanisms, it has been noted by various observers (e.g., Bettelheim, 1953; Hinkle & Wolff, 1956). Thus, it appears that fear, confusion, and other forms of stress make it easy for strong authority figures to become attractive role models and powerful forces of influence.

Personal Characteristics and Indoctrination Effectiveness

In addition to the causal dynamics previously described, a number of individual characteristics are likely to increase susceptibility to indoctrination. These include holding religious and political values that are compatible with the indoctrinating organization’s goals, having a tentative sense of self (Galanter, 1989), and finding oneself at points of life transition (e.g., divorce; Singer, 1995). Although such individual differences undoubtedly contribute to susceptibility to indoctrination, it would be glib to assume that only certain types of individuals are susceptible to intensive indoctrination. Rather, the many instances of successful indoctrination among nonvoluntary, seemingly normal individuals suggest that only rare individuals will be unaffected by a full program of intense indoctrination.

Summarizing the Role of Attentional Capacity

As documented by this review, a good number of the findings generated since the mid-1980s support many of the interactive patterns predicted from the attentional capacity perspective. That is, it is apparent that the stress and attentional load so common in early stages of intense indoctrination heighten one’s susceptibility to a variety of social psychological processes commonly mentioned in discussions of intense indoctrination (e.g., Pratkanis & Aronson, 1992; Singer, 1995). Thus, capacity-related factors (time pressure, distraction, noise, task difficulty, individual difference variables) have been found to affect compliance (Dolinski & Nawrat, 1998), type of message processing (e.g., Baron et al., 1973), conformity (e.g., Deutsch & Gerard, 1955), stereotyping (e.g., Paulhus et al., 1992; Pratto & Bargh, 1991), and cognitive dissonance effects (Stalder & Baron, 1998). In light of the common occurrence of debilitating procedures during intense indoctrination, these results suggest that impaired attentional capacity is a key component of such indoctrination. Moreover, the available research findings support the view that the emotional arousal common to intense indoctrination apparently contributes to this diminution of capacity. In accord with this conclusion, the research indicates that both exercise-arousal manipulations and emotional manipulations produce effects on message processing (e.g., Gleicher & Petty, 1992; Sanbonmatsu & Kardes, 1988), conformity (e.g., Darley, 1966), and stereotyping (e.g., Baron, Inman, et al., 1992; Kim & Baron, 1988) that are quite similar to those produced by other direct manipulations of cognitive capacity (e.g., Bodenhausen, 1990; Deutsch & Gerard, 1955).

According to this model, this lowered capacity leaves individuals more susceptible to poorly supported arguments, social pressure, and the temptation to derogate nongroup members. As such, it appears to play a key role in what I have described as the internalization stage of intense indoctrination. This reduction in capacity also may heighten dissonance effects in that low levels of attributional complexity recently have been linked to stronger dissonance-produced attitude change in the induced compliance paradigm (Stalder & Baron, 1998).

In addition to this capacity mechanism regarding dissonance, data also show that arousal per se enhances dissonance effects provided that it is interpreted as a consequence of one’s counterattitudinal behavior (e.g., Pittman, 1975).[4] As a result, there are grounds to expect particularly powerful dissonance phenomena during intense indoctrination. As noted, these dissonance effects can contribute to initial attitude change in the internalization stage (as the recruit attempts to justify early acts of compliance) as well as solidification of attitude in the consolidation stage (as the recruit justifies the more extreme escalating commitments that occur over time).

What I have deemed the cognitive miser view of consolidation suggests yet another capacity-related mechanism relevant to the consolidation stage. More specifically, high attentional load and fatigue should exacerbate our tendency to conserve our limited attentional resources (Kahneman, 1973). Thus, the taxing and stressful aspects of intense indoctrination are likely to discourage prolonged and recurrent examination of established group doctrine or emerging group policy, encouraging instead categorical acceptance of that doctrine. In short, considering the impact of intense indoctrination on attentional capacity provides a number of interesting insights regarding why individuals can be seduced by marginal arguments that seemingly violate their own self-interest and eventually reach the point at which they are relatively impervious (Hassan, 1988) to attempts to dissuade them from their newly acquired beliefs.

This is not to say that other internal states are irrelevant to the attitude and identity change that so often occur following intense indoctrination. States such as guilt, dissociation, and low self-esteem have been identified as likely to facilitate such outcomes as compliance, stereotyping, conformity, message agreement, identity change, deference to authority, and dissonance-produced attitude change (e.g., Cialdini, 1993; Lifton, 1961; Long & Spears, 1998; McGuire, 1968; Steele, Spencer, & Lynch, 1993). Unfortunately, there is not a great deal of experimental data carefully examining the impact of guilt and dissociation on these particular outcome behaviors, whereas the data regarding self-esteem are complex (Abrams & Hogg, 1988; McGuire, 1968; Rhodes & Wood, 1992) and, in some cases, contradictory (Gerard, Blevans, & Malcom, 1964; Steele et al., 1993; see also Long & Spears, 1998). One solution here may be to separately consider the findings for chronic as opposed to manipulated self-esteem in that one’s chronic level of esteem may not be momentarily salient when one is exposed to acute manipulations of self-esteem. In accord with this reasoning, there is reasonable support for the view that low manipulated self-esteem leads to stronger conformity effects and outgroup derogation (for reviews, see Long & Spears, 1998; McGuire, 1968). This is quite relevant to this discussion in that many of the procedures of intense indoctrination (e.g., self-criticism) appear to challenge if not directly assault one’s feelings of esteem and adequacy. Thus, there are some grounds to suspect that acute feelings of low self-esteem contribute to certain key aspects of intense indoctrination. Interestingly, such low esteem will often lead individuals to underestimate their cognitive capacity and so may also add a (low) motivational component to the depleted capacity actually created during intense indoctrination.

Concluding Comments

A systematic scrutiny of various examples of intense indoctrination reveals a variety of procedural events and internal states that lower one’s ability and motivation to carefully process social information. These capacity-related effects have been hypothesized to heighten a variety of phenomena including stereotyping, conformity, superficial processing of message content, and distorted conclusions regarding one’s own actions (e.g., Baron, 1986; Bodenhausen et al., 1994). These hypotheses reflect the growing awareness that a variety of well-established social psychological effects can be characterized as cognitive shortcuts that are more likely to be relied on when attentional capacity is challenged. As such, these effects can be viewed as specific instances of our general tendency to process information as economically as possible. From this perspective, the procedural features described in this article are important not just because they are often present during intense indoctrination but because they heighten social psychological mechanisms of internalization and consolidation to the point that, in concert, they produce a form of social persuasion that is qualitatively different from normal instances of social influence. Much of the research I have reviewed here is congruent with this view in that the research explicitly links key internal states or direct manipulations of attentional capacity with exacerbation of many of the social psychological phenomena well known to affect persuasion, stereotyping, and social influence. This research not only provides greater insight regarding the psychological dynamics that contribute to the frequent, long-term success of intense indoctrination, but it also furthers our understanding of processes that underlie a variety of key phenomena in social psychology.

Abrams, D., & Hogg, M. A. (1988). Comments on the motivational status of self-esteem in social identity and intergroup discrimination. European Journal of Social Psychology, 18, 317-334.

Allport, G. W. (1954). The nature of prejudice. Reading, MA: Addison Wesley.

Baron, R. S. (1986). Distraction conflict theory: Progress and problems. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 19, pp. 1-40). New York: Academic.

Baron, R. S., Baron, P. H., & Miller, N. (1973). The relation between distraction and persuasion. Psychological Bulletin, 80, 310-323.

Baron, R. S., Hoppe, S., Linneweh, B., & Rogers, D. (1996). Social corroboration and opinion extremity. Journal of Experimental Social Psychology, 32, 537-560.

Baron, R. S., Inman, M., Kao, C. F., & Logan, H. (1992). Negative emotion and superficial social processing. Motivation and Emotion, 16, 323-346.

Baron, R. S., Kerr, N., & Miller, N. (1992). Group process, group decision, group action. Buckingham, England: Open University Press.

Baron, R. S., Logan, H., Lilly, J., Inman, M., & Brennan, M. (1994). Negative emotion and message processing. Journal of Experimental Social Psychology, 30, 181-201.

Baron, R. S., & Moore, D. L. (1987). The impact of exercise induced arousal on social cognition. Social Cognition, 5, 166-177.

Baron, R. S., VanDello, J., & Brunsman, B. (1996). The forgotten variable in conformity research: The impact of task importance on social influence. Journal of Personality and Social Psychology, 71, 915-927.

Bettelheim, B. (1943). Individual and mass behavior in extreme situations. Journal of Abnormal and Social Psychology, 38, 417-452.

Bodenhausen, G. V. (1990). Stereotypes as judgmental heuristics: Evidence of circadian variations in discrimination. Psychological Science, 1, 319-322.

Bodenhausen, G. V. (1993). Emotions, arousal and stereotypic judgements: A heuristic model of affect and stereotyping. In D. M. Mackie & D. L. Hamilton (Eds.), Affect, cognition and stereotyping (pp. 13-37). New York: Academic.

Bodenhausen, G. V., Sheppard, L. A., & Kramer, G. P. (1994). Negative affect and social judgment: The differential impact of anger and sadness. European Journal of Social Psychology, 24, 45-62.

Chaiken, S. (1987). The heuristic model of persuasion. In M. P. Zanna, J. M. Olson, & C. P. Herman (Eds.), The Ontario Symposium (Vol. 5, pp. 3-39). Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Chaiken, S., Liberman, A., & Eagly, A. H. (1989). Heuristic and systematic processing within and beyond the persuasion context. In J. S. Uleman & J. A. Bargh (Eds.), Unintended thought (pp. 212-252). New York: Guilford.

Chaiken, S., & Maheswaran, D. (1994). Heuristic processing can bias systematic processing: Effects of source credibility, argument ambiguity, and task importance on attitude judgment. Journal of Personality and Social Psychology, 66, 460-473.

Cialdini, R. B. (1993). Influence: Science and practice (3rd ed.). New York: HarperCollins.

Cohen, S. (1978). Environmental load and the allocation of attention. In A. Baum, J. Singer, & S. Valins (Eds.), Advances in environmental psychology: Vol. 1. The urban environment (pp. 1-30). Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Cooper, J., Zanna, M. P., & Taves, P. A. (1978). Arousal as a necessary condition for attitude change following induced compliance. Journal of Personality and Social Psychology, 36, 1101-1106.

Darley, J. M. (1966). Fear and social comparison as determinants of conformity behavior. Journal of Personality and Social Psychology, 4, 73-78.

David, B., & Turner, J. C. (1996). Studies in self-categorization and minority conversion: Is being a member of the outgroup an advantage? British Journal of Social Psychology, 35, 179-199.

Deutsch, M., & Gerard, H. B. (1955). A study of normative and informational social influence upon individual judgment. Journal of Abnormal and Social Psychology, 51, 629-636.

Dolinski, D., & Nawrat, R. (1998). “Fear-then-relief” procedure for producing compliance: Beware when the danger is over. Journal of Experimental Social Psychology, 34, 27-50.

Eagly, A. H., & Chaiken, S. (1993). The psychology of attitudes. New York: Harcourt, Brace, Jovanovich.

Ellis, H. C., & Ashbrook, P. W. (1988). Resource allocation model of the effects of depressed mood states on memory. In K. Fiedler & J. Forgas (Eds.), Affect, cognition and social behavior (pp. 25-43). Toronto: Hogrefe.

Eysenck, M. W. (1977). Human memory: Theory, research, and individual differences. Elmsford, NY: Pergamon.

Fazio, R. H., & Cooper, J. (1983). Arousal in the dissonance process. In J. T. Cacioppo & R. E. Petty (Eds.), Social psychophysiology: A sourcebook. (pp. 122-152). New York: Guilford.

Festinger, L. (1954). A theory of social comparison processes. Human Relations, 7, 117-140.

Festinger, L., Riecken, H., & Schachter, S. (1956). When prophecy fails. Minneapolis: University of Minnesota Press.

Florian, V., & Mikulincer, M. (1997). Fear of death and judgment of social transgressions: A multidimensional test of terror management theory. Journal of Personality and Social Psychology, 24, 1104-1112.

Friedland, N., Keinan, G., & Tytiun, T. (1998). The effect of psychological stress and tolerance of ambiguity on stereotypic attributions. Unpublished manuscript, Tel Aviv University, Israel.

Galanter, M. (1989). Cults: Faith, healing, and coercion. New York: Oxford University Press.

Gerard, H. B., Blevans, S. A., & Malcom, T. (1964). Self evaluation and the evaluation of choice alternatives. Journal of Personality, 32, 395-410.

Gleicher, F., & Petty, R. E. (1992). Expectations of reassurance influence the nature of fear-stimulated attitude change. Journal of Experimental Social Psychology, 28, 86-100.

Gur, R. C., Gur, R. E., Skolnick, B. E., Resnick, S. M., Silver, F. I., Chawluk, J., Muenz, L., Obrist, W. D., & Reivich, M. (1988). Effects of task difficulty on regional cerebral blood flow: Relationships with anxiety and performance. Psychophysiology, 24, 392-399.

Hardyck, J., & Braden, M. (1962). Prophecy fails again: A report of a failure to replicate. Journal of Abnormal and Social Psychology, 65, 136-141.

Hassan, S. (1988). Combating cult mind control. Rochester, VT: Park Street Press.

Hearst, P. (1982). Every secret thing. Garden City, NY: Doubleday.

Hinkle, L. E., & Wolff, H. G. (1956). Communist interrogation and indoctrination. Archives of Neurology and Psychiatry, 76, 115-174.

Hoffer, E. (1951). The true believer. New York: Mentor.

Isenberg, D. J. (1986). Group polarization: A critical review and meta-analysis. Journal of Personality and Social Psychology, 50, 1141-1151.

Jamieson, D. W., & Zanna, M. P. (1989). Need for structure in attitude formation and expression. In A. R. Pratkanis, S. J. Breckler, & A. G. Greenwald (Eds.), Attitude structure and function. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Janis, I. L. (1972). Victims of groupthink. Boston: Houghton Mifflin.

Johnson, B. T., & Eagly, A. H. (1989). The effects of involvement on persuasion: A meta-analysis. Psychological Bulletin, 106, 290-314.

Kahneman, D. (1973). Attention and effort. Englewood Cliffs, NJ: Prentice Hall.

Kassin, S. M. (1997). The psychology of confession evidence. American Psychologist, 52, 221-233.

Keinan, G., Friedland, N., & Arad, L. (1991). Chunking and integration: Effects of stress on the restructuring of information. Cognition and Emotion, 5, 133-145.

Keinan, G., Friedland, N., & Evan-Haim, G. (in press). The effect of stress and self-esteem on social stereotyping. Journal of Social and Clinical Psychology.

Kim, H. S., & Baron, R. S. (1988). Exercise and the illusory correlation. Journal of Experimental Social Psychology, 24, 366-380.

Kruglanski, A. W. (1989). Lay epistemics and human knowledge: Cognitive and motivational bases. New York: Plenum.

Kruglanski, A. W., & Freund, T. (1983). The freezing and unfreezing of lay-inferences: Effects on impressional primacy, ethnic stereotyping and numerical anchoring. Journal of Experimental Social Psychology, 19, 448-468.

Kruglanski, A. W., & Webster, D. M. (1991). Group members’ reactions to opinion deviates and conformists at varying degrees of proximity to decision deadline and environmental noise. Journal of Personality and Social Psychology, 61, 212-225.

Lazarus, R. S. (1981). The stress and coping paradigm. In C. Eisdorfer, D. Cohen, A. Kleinman, & P. Maxim (Eds.), Models for clinical psychopathology (pp. 177-183, 192-201). New York: Spectrum.

Lifton, R. J. (1961). Thought reform and the psychology of totalism: A study of “brainwashing” in China. New York: Holt.

Long, K. M., & Spears, R. (1998). Opposing effects of personal and collective self-esteem on interpersonal and intergroup comparisons. European Journal of Social Psychology, 28, 913-930.

Luus, C. A., & Wells, G. L. (1994). The malleability of eyewitness confidence: Co-witness and perseverance effects. Journal of Applied Psychology, 79, 714-723.

Maass, A., & Clark, R. D. (1984). Hidden impact of minorities: Fifteen years of minority influence research. Psychological Bulletin, 95, 429-450.

Mackie, D. M., & Cooper, J. (1984). Attitude polarization: Effects of group membership. Journal of Personality and Social Psychology, 45, 575-586.

McGuire, W. J. (1968). Personality and susceptibility to social influence. In E. F. Borgatta, & W. W. Lambert (Eds.), Handbook of personality theory and research (pp. 1130-1187). Chicago: Rand McNally.

Meijnders, A. (1998). Climate change and changing attitudes: Effect of negative emotion on information processing. Eindhoven, The Netherlands: Eindhoven University Press.

Moloney, J. C. (1955). Psychic self-abandon and extortion of confession. International Journal of Psychoanalysis, 36, 53-60.

Myers, D. (1998). Psychology. New York: Worth.

Nemeth, C., & Wachtler, J. (1974). Consistency and modification of judgment. Journal of Experimental Social Psychology, 9, 65-79.

Ofshe, R., & Singer, M. T. (1986). Attacks on peripheral versus central elements of self and the impact of thought reforming techniques. Cultic Studies Journal, 3, 3-24.

Paulhus, D. L., & Lim, D. T. K. (1994). Arousal and evaluative extremity in social judgments: A dynamic complexity model. European Journal of Social Psychology, 24, 89-99.

Paulhus, D. L., Martin, C. L., & Murphy, G. K. (1992). Some effects of arousal on sex stereotyping. Personality and Social Psychology Bulletin, 18, 325-330.

Pavlov, I. (1927). Conditioned reflexes. Oxford, England: Clarendon.

Petty, R. E., & Cacioppo, J. T. (1986). The elaboration likelihood model of persuasion. In L. Berkowitz (Ed.), Advances in experimental social psychology, (Vol. 19, pp. 123-205). Orlando: Academic.

Pittman, T. S. (1975). Attribution of arousal as a mediator in dissonance reduction. Journal of Experimental Social Psychology, 11, 53-63.

Pool, G. J., Wood, W., & Leck, K. (1998). The self-esteem motive in social influence: Agreement with valued majorities and disagreements with derogated minorities. Journal of Personality and Social Psychology, 75, 967-975.

Pratkanis, A. R., & Aronson, E. (1992). Age of propaganda. New York: Freeman.

Pratto, F., & Bargh, J. A. (1991). Stereotyping based on apparently individuating information: Trait and global components of sex stereotypes under attention overload. Journal of Experimental Social Psychology, 27, 26-47.

Purdy, C. L. S., & Mineka, S. (1997). Covariation bias for blood-injury stimuli and aversive outcomes. Behaviour Research and Therapy, 35, 35-47.

Rhodes, N., & Wood, W. (1992). Self-esteem and intelligence affect influenceability: The mediating role of message reception. Psychological Bulletin, 111, 156-171.

Rogers, R. W. (1975). A protection motivation theory of fear appeals and attitude change. Journal of Psychology, 91, 93-114.

Sanbonmatsu, D. M., & Kardes, F. R. (1988). The effects of physiological arousal on information processing and persuasion. Journal of Consumer Research, 15, 379-385.

Sargant, W. (1957). Battle for the mind: How evangelists, psychiatrists, politicians, and medicine men can change your beliefs and behavior. Garden City, NY: Doubleday.

Schachter, S. (1964). The interaction of cognitive and physiological determinants of emotional state. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 1, pp. 49-82). New York: Academic.

Schein, E. H., Hill, W. F., Williams, H. L., & Lubin, A. (1957). Distinguishing characteristics of collaborators and resisters among American prisoners of war. Journal of Abnormal Psychology, 55, 197-201.

Schein, E. H., Schneier, I., & Barker, C. H. (1961). Coercive persuasion: A socio-psychological analysis of the “brainwashing” of American civilian prisoners by the Chinese communists. New York: Norton.

Schwartz, S. H. (1970). Elicitation of moral obligation and self-sacrificing behavior: An experimental study of bone marrow donation. Journal of Personality and Social Psychology, 15, 283-293.

Singer, M. (1995). Cults in our midst. San Francisco: Jossey-Bass.

Singer, M. T. (1979, January). Coming out of the cults. Psychology Today, 10, 72-82.

Stalder, D. R., & Baron, R. S. (1998). Attributional complexity as a moderator of dissonance-produced attitude change. Journal of Personality and Social Psychology, 75, 449-455.

Stasser, G., Taylor, L. A., & Hanna, C. (1989). Information sampling in structured and unstructured discussions of three- and six-person groups. Journal of Personality and Social Psychology, 57, 67-78.

Steele, C. M., Spencer S. J., & Lynch, M. (1993). Self-image resilience and dissonance: The role of affirmational resources. Journal of Personality and Social Psychology, 64, 885-896.

Walley, R. E., & Weiden, T. D. (1973). Lateral inhibition and cognitive masking: A neuropsychological theory of attention. Psychological Review, 80, 284-302.

Wann, D. L., & Branscombe, N. R. (1995). Influence of level of identification with a group and physiological arousal on perceived intergroup complexity. British Journal of Social Psychology, 34, 223-235.

Wells, G. L., & Petty, R. E. (1980). The effects of overt head movements on persuasion: Compatibility and incompatibility of responses. Basic and Applied Social Psychology, 1, 219-230.

Wilder, D. A., & Allen, V. L. (1977). Social support, extreme social support and conformity. Representative Research in Social Psychology, 8, 33-41.

Wilder, D. A., & Shapiro, P. (1988). Effects of anxiety on impression formation in a group context: An anxiety-assimilation hypothesis. Journal of Experimental Social Psychology, 25, 481-499.

Wilder, D. A., & Shapiro, P. N. (1989). Role of competition-induced anxiety in limiting the beneficial impact of positive behavior by an out-group member. Journal of Personality and Social Psychology, 56, 60-69.

Wood, W., Pool, G. J., Leck, K., & Purvis, D. (1996). Self-definition, defensive processing, and influence: The normative impact of majority and minority groups. Journal of Personality and Social Psychology, 71, 1181-1193.

Zimbardo, P. (1997). What messages are behind today’s cults? Washington, D.C.: American Psychological Association.


Earlier versions of this article were presented at the groups preconference at the meetings of the Society of Experimental Social Psychology, Toronto, Ontario, October 1997, and at the Utah Winter Conference in Social Psychology, Park City, Utah, January 1996.

I thank Joel Cooper, Russell Fazio, Tory Higgins, Arie Kruglanski, Fred Rhodewalt, David Sanbonmatsu, and Kip Williams for their comments.

This article first appeared in Personality and Social Psychology Review, 2000, Vol. 4, No. 3, 238-254. It is reprinted with permission. Slight editorial changes have been made at the author’s request.

Robert S. Baron, Ph.D. is Professor of Psychology at the University of Iowa. Requests for reprints should be sent to Robert S. Baron, Department of Psychology, E 11 SSH, University of Iowa, Iowa City, Iowa 52242-1407. E-mail:

[1] I thank Kevin Crawley and Diana Paulina of Unbound, Inc., Coralville, Iowa, for their cooperation and assistance in arranging interviews with over 40 of their clients who were recuperating from involvement in charismatic groups.

[2] There is some evidence that just appearing to agree with others may impact the nature of message processing. Wells and Petty (1980), under the guise of having people evaluate headphones, induced people to either nod their heads up and down (emulating a “yes” nod) or shake their heads from side to side (emulating a “no” shake) while hearing a message. The “nodders” exhibited more attitude change than did “shakers.” Thus, the polite agreement displayed by indoctrinees in the softening-up and compliance stages of intense indoctrination may actually render them more susceptible to the persuasive messages they are exposed to.

[3] Several distinctions can be drawn between the research on fear-inducing messages (e.g., Rogers, 1975) and the research in which messages are presented following an emotion manipulation (e.g., Baron, Inman, et al., 1992). First, the latter research focused specifically on whether message processing is careful or superficial, whereas the former research ignored this issue. Second, the research on “frightening messages” creates greater potential for defensive avoidance in the audience in that the message itself is the source of stress. Finally, in the research on frightening messages, the fear is always relevant to the topic of the message. In contrast, when emotion is manipulated prior to the message, the emotion can be either irrelevant or relevant to the message topic.

[4] I was tempted to interpret these dissonance-arousal studies as evidence for the view that depleted attentional capacity mediated the heightened attitude change observed in the aroused conditions. Contrary to this view, however, in these studies arousal only enhanced attitude change when attributional cues suggested a dissonance-based interpretation for the arousal (e.g., Pittman, 1975). As a result, these data are more consistent with the view that the arousal produced in these studies was a source of aversive motivation rather than a cause of capacity depletion. This conclusion, however, does not rule out the possibility that capacity depletion (when it occurs) facilitates dissonance-produced attitude change.