Cultic Studies Review Vol. 3, No. 2/3, 2004
Aberrations of Power: Leadership in Totalist Groups
Robert S. Baron, Ph.D., Kevin Crawley, Diana Paulina
In this chapter, several theoretical perspectives are used to analyze the leadership tactics utilized within three manipulative groups: the Peoples Temple, Synanon, and the Children of God. This case study approach illustrates a number of common features of such leadership. These features include the means by which emotional and cognitive fatigue are used to amplify various categories of leader power, the manner in which assaults on self-confidence and self-esteem heighten dependence upon group leaders, and the tendency of such leaders to initiate transformations of group purpose and group norms. Such transformation provides a means of testing member loyalty, evoking cognitive dissonance among followers, and creating a sense of mystery and drama that serves to excite and intrigue group members.
Few phenomena provoke more interest in the concept of leadership than the power exerted by leaders in totalist groups. Whether we consider such cult behavior as mass marriages, mass suicides, or voluntary castrations, the power of such leaders to induce their members to ignore logic, self-interest and the entreaties of family members is undeniable. This chapter offers a number of suggestions regarding the basis of such power using three groups as highly typical examples of cult indoctrination. In such groups, members exhibit remarkable levels of obedience to authority following a period of systematic and intense indoctrination. It is this feature that marks the totalist group in our view (Baron, 2000). These case study descriptions will provide us with a vehicle for considering the factors that contribute to the unique nature of leadership in totalist groups.
Relevant Theories of Leadership
French and Raven’s Power Model
One defining aspect of leadership in totalist groups is the unilateral power of the group leader. Checks and balances of power are rarely present in totalist groups. Indeed, purges frequently remove anyone who might challenge the leader’s power (e.g., Davis, 1984; Ofshe, 1980). Thus, French and Raven’s (1959) model is quite relevant to our discussion given the multidimensional nature of leader power in such groups (Forsyth, 1999). Typically such leaders are adept at controlling most dimensions of power identified by this model. Public humiliation and corporal punishment (i.e., coercive power) are common. These leaders generally have high reward power, controlling most of the financial, social and sexual resources of the group (e.g., Reavis, 1998). The expert-informational power of such leaders is high due to such factors as education, status as seer-guru or media-based reputation. In the three groups we spotlight, informational and coercive power was amplified by encouraging mutual spying among group members (e.g., Layton, 1998). Thus, these leaders could appear omniscient regarding member actions and feelings, an “ability” Jim Jones liked to display by revealing embarrassing facts about unfortunate followers at group meetings.
Formal acknowledgement of totalist leaders, either by church ordination or media recognition, helps establish the legitimate power of these individuals. Finally, such leaders have a great deal of referent power. They are deeply admired by their followers, who view them with profound reverence. One reflection of the multidimensional and unilateral power enjoyed by totalist leaders is their ability to maintain group loyalty despite moving the group through various transformations. The Peoples Temple, originally a fundamentalist Christian church, evolved into a socialist “movement” complete with an armed security force, a media specialist, etc. (Maaga, 1998). Synanon, initially a drug treatment program, emerged after several years as a religious movement (Gerstel, 1982).
A Social Identity Model of Leadership
The social identity view of leadership (e.g., Hogg, 2001; Hogg, Hains, & Mason, 1998) also offers a perspective that seems quite relevant to totalist groups. Hogg (2001) describes leadership as being conferred upon those individuals who most closely embody prototypic group norms. Such norms tend to be displaced away from outgroup positions in an attempt to emphasize ingroup similarity while at the same time maximizing how the ingroup differs from salient outgroups. This form of leadership emergence is thought to occur when group identification and group salience are high (Hogg, 2001). Totalist groups clearly qualify. In totalist groups, members draw on their affiliation as a primary source of self-definition (e.g., Hoffer, 1951). Moreover, in such groups, the salience of one’s group membership is kept extremely high by such means as physical separation, distinctive group dress, jargon, etc.
According to the social identity perspective (Hogg & van Knippenberg, in press), leaders emerge in such settings because “prototypic individuals” are viewed as models and are accorded high sociometric status, thereby enhancing their persuasive power. These leaders are imbued by their followers with charismatic traits, due to the fundamental attribution error of seeing individuals as the causes of action. This last phenomenon is well represented in totalist groups, in which leaders are literally deified by the membership. Hogg recognizes, however, that leaders who receive such adulation will often be accorded a status that marks them as “different.” If so, they will increasingly have to maintain control of the group by relying on coercive or reward power, as they will forgo their status as ingroup prototypes. Hogg (2001) suggests that this pattern of leadership is particularly likely to characterize cults in their later stages.
Theories emphasizing the charismatic features of leaders (see House & Shamir, 1993) are highly relevant to totalist groups. A modern example of this category, Bass’s theory of transformational leadership (e.g., Bass, 1998), combines a transactional perspective (focusing on exchange relationships between leaders and followers) with a charismatic approach. The transactional aspect of this model is that leaders receive power and status in return for facilitating the goal attainment of followers. According to this model, if the leader also has certain charismatic traits, she/he should be particularly adept at convincing followers to work for common goals while ignoring their own vested interests. Bass argues that such a leader is well suited to evoking change or “transformation” in groups, particularly change motivated by needs for self-actualization.
Some charismatic elements mentioned by Bass include: (1) intellectual stimulation (offering creative solutions and encouraging innovation in others), (2) individualized reaction to member needs and abilities, and (3) the ability to inspire and motivate followers. Effective transformational leaders should be innovative, should actively reward correct action (as opposed to punishing incorrect behavior), and most crucially, should offer the group some transcendent purpose, mission or messianic goal. Bass (1998), like Hogg (2001), recognizes that members of the group may exaggerate the positive features of such leaders by amplifying their perceived charisma. Effective transformational leaders, however, will, in fact, manifest the three charismatic elements listed above.
From a transactional (or exchange) perspective, totalist groups provide members with a sense of mission, purpose, belonging, and the benefit of not having to agonize over various life decisions (which are dictated by the group’s norms). The group in turn imbues the leader with exaggerated positive attributes, extreme power, as well as privilege, gladly tolerating the leader’s violation of particular aspects of group doctrine. Thus, a surprising number of totalist leaders have had erotic and material perks that were strictly forbidden to others (e.g., Davis, 1984; Kelley, 1995). Moreover, in accordance with this theoretical perspective, leaders in such groups invariably offer members a messianic purpose that most frequently requires members to override their immediate self-interest in favor of some self-actualizing and transcendent common group goal (Baron, 2000; Hoffer, 1951).
From a charismatic perspective, leaders in totalist groups are typically confident decision makers, accomplished speakers, and imaginative at generating messianic group goals. As such, they generally qualify as “stimulating and inspiring” (e.g., Ofshe, 1980; Weightman, 1983). However, in several key respects, leaders in totalist groups deviate from the charismatic pattern that Bass considers defining of transformational leadership. Bass assumes such leaders encourage group member innovation, do not prematurely criticize ideas that contradict group policy, act in an unselfish and morally correct fashion, and use a contingent reward system that justly rewards individuals based on their efforts and investments. Such leaders also depict an optimistic future. In contrast, Bass describes pseudo-transformational leaders who, while having many of the trappings of the transformational leader, cater to their own self-interest, rely on manipulation, fear, threat, and punishment to maintain control, and seem to be governed by warped moral principles (Bass, 1998). Leadership in totalist groups nicely illustrates this pattern of pseudo-transformational leadership. Thus, in such groups, leaders do not consistently use “contingent” rewards to compensate followers on a basis commensurate with their contributions, tend to punish transgressions more than they reward correct behavior, and do not rise above self-interest. The case studies we consider below explore these themes.
The Effects of Intense Indoctrination on Leadership in Totalist Groups
Several key factors elevate the power of leaders in such “high demand” groups. One such factor is that the identity-related benefits provided by group membership (Hogg, in press) are particularly gratifying for group members. As a result, members find themselves unusually dependent upon the group (i.e., the leader) for guidance, self-esteem, a sense of purpose, etc. A second factor involves the impact that totalist groups have on the attentional capacity of group members and how this in turn causes and maintains attitude change in such groups (Baron, 2000).
Numerous writers have noted that several social psychological processes appear crucial in triggering the attitude and value change that occurs in totalist groups. These include such processes as conformity dynamics, stereotyping of non-group members, group polarization, biased and incomplete message processing, and cognitive dissonance mechanisms that stem from inducing escalating commitments from group members (Pratkanis & Aronson, 2000; Schein, Schneier, & Barker, 1961; Singer, 1995). Various writers have also commented on the stress and capacity-draining activity associated with group indoctrination and membership (e.g., Baron, 2000; Sargant, 1957). Sleep deprivation is extremely common in totalist groups, with long work days and heavily regimented activity being an almost universal feature (Pratkanis & Aronson, 2000). Emotional arousal is frequently manipulated in such groups; such arousal is thought itself to deplete attentional capacity (Eysenck, 1977).
Baron (1986, 2000) has argued that such stress and capacity depletion heightens the various social psychological processes alluded to above (see also Bodenhausen, 1993). Several studies indicate that fear and arousal decrease the likelihood that individuals will notice and react to logical flaws in a persuasive message while elevating the tendency of individuals to be influenced by superficial aspects of a message, such as the reactions of the audience or the presumed credentials of the speaker (e.g., Baron, Inman, Kao, & Logan, 1992; Sanbonmatsu & Kardes, 1988). Similarly, fear has been shown to elevate stereotyping (Baron, Inman, Kao, & Logan, 1992; Keinan, Friedland, & Even-Haim, 2000), compliance (Dolinski & Nawrat, 1998), conformity (Darley, 1966), and cognitive dissonance generated attitude change (Pittman, 1975).
Given that life in totalist groups often entails emotional manipulations, sleep deprivation, high levels of regimented and required activity, and inadequate nutrition, there is ample reason to assume that attentional capacity is compromised in such groups, thereby heightening group members’ susceptibility to a variety of persuasion manipulations. It is likely that such stress, and the attentional depletion it causes, will also affect leadership dynamics in such groups. This argument is based on several key assumptions. The first is that low capacity increases the power differential between members and leader regarding expert and informational power. If stress does lower attentional capacity (or even if it just decreases confidence in one’s capacity), one is rendered more dependent on a confident, informed, and well-credentialed leader to provide interpretations of events, as well as pre-packaged solutions and decisions.
The second assumption is that the stress of indoctrination lowers the self-confidence and self-efficacy of group members. Numerous reports support this view (e.g., Singer, 1995). Thus, a common indoctrination feature within totalist groups is the recurring requirement that individuals submit to detailed criticism, confession, and other acts of mortification (e.g., Baron, 2000; Hinkle & Wolff, 1956). Moreover, members in totalist groups are often placed in the position of having to adjust to unusual group norms while trying to master an unfamiliar and complex doctrine or skill set, be it political, quasi-scientific, or religious. This confusion, coupled with criticism of the individual member from within the group, is sufficient to shake the confidence of all but the most self-resilient (Lifton, 1961). Such attacks on self-efficacy and self-esteem make identification with the group very attractive. It has been assumed for some time that group identification is a very effective means of bolstering self-esteem or of “escaping” from an inadequate self (Hoffer, 1951). It follows that the more inadequate one feels about oneself, the greater the allure of a totalist group.
A related idea is that these attacks on the member’s confidence and self-efficacy render the member more dependent on the group (in cults this means the group leader) for guidance, interpretation, explanation, and normative control over activity and choices. As Hogg (2001) points out, one benefit of committed group membership and identification is that group norms reduce uncertainty regarding what to think, feel, and do. Hogg (in press) extends this logic by arguing that people who experience uncertainty regarding their self-concept should be particularly attracted to distinctive groups characterized by unique, clear norms that produce high group entitativity—a common feature of totalist groups. Thus, uncertainty is viewed as a factor that heightens normative power. Interestingly, normative control has been found to be particularly strong in cases in which group members’ task confidence is low (Bond & Smith, 1996) and the salience of group membership is very high (Abrams, Wetherell, Cochrane, Hogg, & Turner, 1990), as is the case in totalist groups. Indeed, recent research indicates that the more one conforms to the dictates of such highly salient, confident, and self-referential groups, the better one feels about oneself (Pool, Wood, & Leck, 1998) and the more confident one feels about the decision (Baron, Vandello, & Brunsman, 1996).
Thus, cult leaders have access to a double-edged sword. They use various techniques to assault the members’ individual sense of self-adequacy while at the same time offering the group’s messianic purpose as a means of transcending these feelings of doubt, meaninglessness, and low self-worth. In accord with this view, Galanter (1989) reported that established members of totalist groups report lower levels of neurotic distress than neophyte members. Thus, group identification offers members more than just the simple social benefits of affiliation and acceptance. It provides a means of alleviating anxiety, reducing decisional conflict and elevating feelings of uniqueness and self-worth. In summary, the low attentional capacity engendered by the features of life in many totalist groups not only heightens the impact of persuasive manipulations, but it elevates the power and allure of those in a position of leadership. This will be evident in the discussion we present below of leadership within three totalist groups.
The Peoples Temple
Jim Jones founded the Peoples Temple in Indianapolis in 1955. By 1965, he had moved the church with about seventy followers to the San Francisco Bay Area, where most lived in a communal compound (Maaga, 1998). From its earliest days, Jones’s ministry combined elements of fundamentalist Christianity (e.g., faith healing) with progressive positions on racial and economic issues (Maaga, 1998). As a result, by 1967 Jones was a politically connected and well-known public figure in Bay Area politics. In California, Jones began to exert wide control over the personal decisions of those in his congregation. Monies and property were donated to the Temple, parents complied with Jones’ direction that they allow their children to be raised by other parishioners, and married couples discontinued living together if so ordered (e.g., Layton, 1998; Weightman, 1983). Church meetings often became forums for public criticism of parishioners, punctuated by physical and psychological discipline (Layton, 1998). Jones regularly engaged in extramarital heterosexual and homosexual liaisons with group members (Layton, 1998; Maaga, 1998).
In the period from 1976-77, Jones had almost all members of the Peoples Temple move to the jungle compound that was Jonestown, Guyana. In Guyana, Jones showed increasing evidence of paranoid ideation, depression, and bizarre behavior (Layton, 1998). Group members regularly worked twelve-hour days at arduous tasks on a protein-deficient diet. Workdays were followed by prolonged group meetings, after which loudspeakers would broadcast Jones’ harangues long into the night. The group’s commitment to “revolutionary suicide” was discussed and practiced in several all-night sessions. This “practice” turned into reality when Jones ordered a group suicide shortly after his security personnel assassinated visiting Congressman Leo Ryan. Over 900 individuals perished. Audio tapes made that night indicated high initial group commitment to this action (archives: npr.org).
Theoretical Analyses of the Peoples Temple
Social Identity Theory
In accord with the social identity view of leadership (Hogg, 2001), group salience in the Peoples Temple was quite high, as were levels of group identification. There is little doubt that Jones’ attitudes on everything from religion to socialism were admired and, almost by definition, viewed as prototypically normative by the group. A key feature of Temple norms involved disparaging views of various outgroups (fascists, CIA, etc.). This all echoes social identity theory’s emphasis on ingroup members maximizing their differences from outgroups. Over time, Jones redefined the social identity of the group. This entailed gradually changing which positions, attitudes, and behaviors were deemed prototypical for group members -- a strategy identified by Hogg (2001) as a tactic used by leaders to maintain their position as prototypic individuals. This tactic is presumed to rely on the fact that by changing the group prototype, the leader assures that he, more than anyone else, continues to best embody this prototypic standard.
However, Jones was never viewed by his followers as a prototypic group member of the Peoples Temple. From the outset he was viewed as unique and superior. Nor did his leadership emerge because his attitudes and behaviors happened to coincide with the group’s prototypic norms. Rather, Jones established what the norms would be by dint of his own opinions. In addition, while group members undoubtedly inflated Jones’ exceptional characteristics, his charisma was not just a function of this attributional bias. Rather, it stemmed from his exceptional skill at public speaking, his identification of meaningful goals (e.g., racial equality), and his supreme confidence. It is true that Jones used increasing degrees of coercive control over time, but this seems primarily due to his psychological deterioration (Layton, 1998) rather than to his gradually being perceived as a “non-group” member.
Bass’s (1998) construct of pseudo-transformational leadership provides a better description of leadership within the Peoples Temple. Members certainly derived a number of the psychological benefits we mention above from their group affiliation, while Jones accrued multiple benefits as well. Thus, both Jones and his followers could be viewed as being in the type of exchange relationship emphasized by a transactional approach. Jones also displayed a good many of the charismatic features stressed by Bass, such as inspirational leadership (messianic goals, dynamic style, etc.), individual consideration of group members’ needs and abilities, and an innovative, self-actualizing agenda for group members. On the other hand, Jones was intolerant of dissent, reveled in public criticism of members, was highly manipulative and deceptive in his dealings with members (healings were staged, clairvoyant abilities faked, etc.), and clearly emphasized his own self-interest and privileges when governing the group. In addition, his own moral sense appeared twisted and abnormal. Jones would seem to be a prime example of the pseudo-transformational leader. This form of leadership is thought to be both ineffective and a source of stress for group members. It is not hard to make this case when considering the history and sad demise of this group.
Synanon
Synanon was a residential drug treatment group founded in California by Chuck Dederich in 1958. Synanon’s treatment was based on the “Game,” a confrontational group session during which participants critically considered the defenses and illusions that sustained their substance abuse. Within months, Synanon claimed to be an effective means of controlling not only alcoholism but also drug addiction. Between 1958 and 1968, the group processed over 5,000 individuals (Gerstel, 1982).
The purported success of this program as a treatment for drug addiction was based primarily on unsubstantiated reports in the press (Ofshe, 1980). While it is likely that members did remain drug free and sober while in residence (given the no-nonsense, confrontational Synanon approach), there is little formal documentation that Synanon provided a successful cure for individuals who moved to non-resident status (Ofshe, 1980). In time, such graduation ceased to be a goal of the organization. Its fame as a successful drug program led to donations, grants and expansion that permitted it to open businesses staffed by (unpaid) Synanon members (Gerstel, 1984). By 1967 Synanon had over 800 members in various residence facilities and had begun to admit non-addicted individuals from the community. The Game was offered as a powerful, albeit traumatic, means of self-exploration. Obviously it also served as a blunt instrument of punishment and control.
In 1968, Dederich formally re-conceptualized Synanon as a communal living experiment open to all. Entry required attending “boot camp,” complete with sleep loss, vigorous exercise, and other humiliating initiation activities. Game “marathons” lasting over 24 hours became common at this point (Gerstel, 1984). By 1975 Dederich had declared Synanon to be a religion, renounced his vows of poverty, allocated himself a substantial salary, and established a luxury residence for himself and his entourage. Dederich could broadcast at will to all Synanon locations and used this communication system, “the wire,” to humiliate any Synanon member who displeased him (e.g., Gerstel, 1984).
Promiscuous sexual activity at Synanon had long been tolerated and by 1977 was actively encouraged as a means of establishing “mutual love” among group members (Gerstel, 1984). Dederich’s power was reflected in the effectiveness of this edict even among married members, as well as in his success encouraging abortions and vasectomies for group members (Ofshe, 1980). By the mid-1970s, the group abandoned non-violence and formed armed security details designed to “protect” the group from outsiders as well as to discipline unruly members -- particularly resistant adolescents. By 1975, these security personnel had engaged in physical attacks on local neighbors, the beating of an ex-Synanon member, and a case in which an “enemy” attorney was bitten by a rattlesnake placed in his mailbox—a crime that eventually resulted in Dederich’s accepting a plea bargain of five years’ probation (Gerstel, 1984). Synanon then lost a series of lawsuits stemming from the physical assaults made by the group. The IRS revoked its tax-exempt status in 1986. Synanon was formally disbanded in 1991, and in 1997 Chuck Dederich died of heart and lung failure in California, where he had been living in a trailer park (Yee, 1997).
Theoretical Analyses of Synanon
Social Identity Theory
In accord with a social identity view, group salience in Synanon was high given the residential nature of membership. Similarly, given the initiation ordeals, it is safe to assume that among those who chose to remain, group commitment and identification were very high (Baron, 2001; Pratkanis & Aronson, 2000). Moreover, the intense mortification process entailed in Gaming, recurrently encountered by both neophytes and veterans, was specifically designed to challenge members’ feelings of esteem and self-efficacy—conditions we emphasized earlier as facilitating group identification. Given that Dederich lived among the other members, participated in Games on a weekly basis, and for years did not take obvious material advantage of his leadership position, one could argue that he was viewed as a prototypic group member. Thus, Dederich was deeply admired within the group, and his opinions on a wide range of issues (from sexual promiscuity to the need for brutal mutual criticism) defined normative opinion and behavior within the group. In addition, Dederich took positions that differentiated him from those outside the group on a number of issues (e.g., private property, promiscuity). Thus, conforming to his “prototypic” opinions helped establish the distinction between Synanon members and those outside the group. In short, in several respects, Dederich’s leadership style corresponds to that outlined by social identity theory.
However, as above, Dederich did not emerge as a leader because his attitudes and behaviors happened to correspond to prototypic group norms. Rather, as creator of the group, he defined such norms through whichever opinions and actions he favored. This fact does not correspond to the analysis offered by Hogg (2001). In addition, while there is little doubt that Dederich’s leadership stemmed in part from his prototypic status, as Hogg’s analysis maintains, it is clear that Dederich’s aura of charisma was due to his abilities as a speaker, manager, and innovator, over and above any attributional bias on the part of his membership. On the other hand, Hogg’s suggestion that leaders come to rely more on coercive and reward power as they begin to distance themselves from the group is congruent with the fact that Dederich expanded his use of physical discipline as he adopted luxurious privileges not available to others. In short, the social identity perspective corresponds in some but not all respects to the leadership history within Synanon.
One can also make a reasonable case that Dederich’s leadership pattern represents the pseudo-transformational style alluded to by Bass (1998). Group members stood to gain several transactional benefits, including a life free of drug addiction and crime (in the case of drug-addicted members). Dederich offered inspirational leadership, a transcendent purpose, and individual consideration of group members. However, Dederich had little tolerance for dissent, was an expert in humiliation and criticism of his followers, focused on punishing transgressions (as opposed to rewarding correct behavior), and was manipulative and Machiavellian in dealing with the group. Thus, Bass’s conception of pseudo-transformational leadership provides a close description of Dederich’s leadership style within Synanon.
The Children of God
David Berg founded the Children of God (COG) in 1967 in California employing an anti-establishment, fundamentalist Christian message to recruit young adults. Berg transformed this group in a matter of thirteen years from a fundamentalist sect to an international charismatic group that sanctioned promiscuous sexual behavior and religious prostitution. This activity funded a luxurious lifestyle for Berg and his inner circle (Davis, 1984; Charity Frauds Bureau Report, 1974). Berg was a man of voracious sexual appetites. In addition to three marriages and numerous affairs, he conducted incestuous relations with his children -- a fact verified in his writings, the "MO Letters" (Berg, 1976; Davis, 1984).
Local businesses and churches were originally called upon to "provision" the group as a means of combating drug use among the young. Many recruits did, in fact, give up sex, drugs, and alcohol to become involved with COG in its early years (Davis, 1984). Berg urged members to "forsake all" as a test of their faith (Berg, 1976). This commandment provided the basis for having members donate all their material possessions to Berg’s sect. The group gained nationwide media attention by 1969 after initiating prayer demonstrations at public events complete with biblical robes, wooden staffs, etc. (Davis, 1984).
By 1970 the group was located at a rural compound in Texas, where recruitment techniques became systematized. The initial recruitment of an individual usually entailed depriving the recruit of sleep via revolving indoctrination teams and making certain that the recruit was never left alone nor given time to reflect quietly on issues. The recruit was continuously badgered regarding commitment to Jesus, the need to “forsake all,” etc. Once recruits signed a “revolutionary sheet” donating their goods to the group, they began a minimum of three months of disciple training (Charity Frauds Bureau Report, 1974). This training involved heavily regimented 18-hour days, with religious broadcasts frequently played all night. Each recruit was continually squired by a committed member and was given a new name to symbolize spiritual rebirth (Davis, 1984). Following the “forsake all” doctrine, members were expected to break all ties with their old lives, especially friends and family -- with the exception of writing parents for funds. Time was spent in menial work, memorizing biblical verse, etc. All members were expected to keep diaries listing accomplishments, evil thoughts, etc. These items were revealed in group meetings where public confessions were encouraged (Davis, 1973, 1984). In less than two years, the group grew from 200 to almost 2,000 members. By 1974, the group had over 100 enclaves in various countries (Charity Frauds Bureau Report, 1974).
From 1970 until his death in 1994, Berg was rarely seen by his followers as he took up residence in various locations from Europe to the South Pacific. To manage the group and its activities, he began in the early 1970’s to communicate with the group via the “MO Letters,” a series of diatribes in “bible-speak” that ranged in topic from direct prophecies from God to attacks on particular individuals. By mid-1971, parents and friends of group members had formed Free-COG, an anti-cult organization. Such groups provided the COG with necessary outgroups that could be vilified in MO Letters. Defectors were threatened with harsh penalties. Berg preached that those who left the group would give birth to deformed children—a belief generally accepted within the group.
The evolution of the Children of God into a sex cult began in 1971. Originally, the COG held a very puritanical position regarding sex (Davis, 1984). In mid-1971, Berg used a Biblical quote to argue that “all things were lawful” for any who were true and faithful Christians. Rank-and-file members learned of the sexual implications of this policy gradually, so as not to shock them. Over time, the formerly chaste and sexually segregated members of the group began to experiment with sex. By 1974, sexual promiscuity in the group had escalated to the practice of “flirty fishing,” i.e., using sexual behavior to recruit new members and to raise funds (Davis, 1984). The Children of God are still active and now refer to themselves as The Family (www.thefamily.org).
Theoretical Analyses of the Children of God
Social Identity Theory
Both group salience and group identification within the Children of God were very high. The fact that members were generally accompanied by a buddy or partner when not in the group compound, the use of group jargon (e.g., flirty fishing, forsaking all, etc.), and the communal living arrangements of the group made group membership almost constantly salient. Given the sacrifices made by group members in forsaking material goods and past relationships, members strongly identified with the COG and used it as a key source of self-identification. Berg’s beliefs as expressed in the “MO Letters” defined what was normative within the group. Berg seemed quite distinct from the rank-and-file membership. In Berg’s case, this separateness stemmed from his age, his ministerial status, and his “ability” as prophet. Indeed, after four or five years, he was rarely seen by members. As group founder, Berg’s leadership did not emerge because his views happened to coincide with prototypic group norms. Rather, Berg took pains to mold the views of his members, albeit gradually, to match his own, so that they became prototypic via manipulation. He apparently did not come to rely more on coercive or reward-based power as he grew more distant and distinct from his membership; in this respect, his behavior does not confirm Hogg’s suggestions regarding such issues. However, the evolution of group doctrine from literal biblical interpretation to doomsday prophecy, and eventually to sexual adventurism, does represent the type of change in prototypic position alluded to by a social identity perspective as a means of maintaining power.
Berg’s leadership behavior provides numerous matches to Bass’s description of the pseudo-transformational style. In terms of transaction, membership in the COG provided young recruits with the option of rebellion with a purpose. Serious young Christians were provided with an opportunity to establish, beyond a doubt, their commitment to Jesus. Confused and alienated teenagers were offered structure, discipline, a sense of importance and meaning, and a sense of acceptance and belonging (Davis, 1984). In terms of messianic elements, Berg offered innovative ideas and an inspiring message. However, in accord with the pseudo-transformational style, his reactions and behavior were not easily customized to reflect the individual desires and abilities of members, nor could he offer highly contingent reinforcement. He did not ignore his own self-interest, used reactive (punishment-based) control, was highly intolerant of dissent or of member innovation, and could hardly be called a person of impeccable moral sense. Rather than an optimistic approach, he relied on fear-based manipulations to redirect norms and behavior within his fiefdom. In short, there is reasonable correspondence between Berg’s leadership style and the pseudo-transformational style described by Bass (1998).
Alteration of Goals and Policy as a Leadership Tactic in Totalist Groups: Group Life as Drama
One distinctive feature marking the groups we have discussed is that all three seemed to be in a state of evolutionary flux -- a characteristic marking many totalist groups (Hoffer, 1951; Sargant, 1957). In such groups, leaders commonly change the group doctrine and even the group definition. One possible interpretation of this change is that it represents a tactic used by leaders to remain “one step ahead” of the membership in terms of being a prototypic group member (Hogg, 2001). Congruent with this view is the fact that these norms tend to change so that they define positions that heighten the distinction between the in-group and salient out-groups -- a key process according to a social identity perspective. This was certainly true in the three groups examined in this chapter, and it tends to be true in other totalist groups as well (Kelley, 1995).
An equally plausible interpretation of such induced change is that it creates a sense of mystery regarding group doctrine. Such mystery would maintain the leader’s status as expert and necessary interpreter of that doctrine. A related reason leaders may encourage or generate such change is that it fosters feelings of excitement, growth, and challenge, thereby holding the interest of group members. This idea suggests that attraction to such groups is, in part, a function of the drama and excitement they provide for members. This “drama” interpretation has some similarity to the transformational perspective, in that such excitement would heighten the extent to which the leader was seen as an inspirational and innovative leader, i.e., as a source of such drama and diversion.
However, theories of leadership and group process have heretofore not emphasized the notion that drama-based excitement is a benefit often provided by group life. In addition to serving this diversion function, a change in group doctrine provides leaders with a “loyalty test” that can be applied to followers; by instituting change, the leader can discern who is committed enough to embrace whatever transformation of group purpose and group values is introduced. Such tests can be used to discern who should be rewarded, trusted, punished, banished, or manipulated. This assures that those remaining closest to power will be likely to comply with the leader’s interpretations and commands. Finally, inducing changes in doctrine, goals, etc. provides the leader with a means of eliciting a series of “escalating commitments” from followers. Repetitive and costly personal transformations represent an effective means of creating cognitive dissonance among disciples, thereby heightening members’ loyalty and commitment to the group and its leader (Baron, 2000; Pratkanis & Aronson, 2000).
Summary and Conclusions
We have considered three groups that have certain superficial differences but a number of disturbing commonalities. First, transformational change is a theme common to these three groups. A second common feature is that Bass’s description of pseudo-transformational leadership provides a reasonable fit to these three case histories, especially given the morally questionable, self-centered, and manipulative charismatic style adopted by these leaders. Third, all three groups exposed members to stressful and attention-depleting procedures, including overwork, sleep deprivation, regimentation, and various emotional manipulations. This is a common feature of indoctrination in most totalist groups (Baron, 2000). We feel this not only leads to inadequate and superficial scrutiny of group doctrine, but also heightens the members’ reliance on the leader as a source of decision making and interpretation. A fourth commonality is that these leaders showed little tolerance for opinion deviates or member innovation. Individuals who persisted in such behavior found themselves the object of humiliation and/or physical discipline. As several writers have noted, the existence of such deviates does serve a function for the organization, in that the group’s reaction marks the boundary of acceptable behavior and serves as an object lesson to other members regarding the consequences of norm violation (e.g., Hogg, 2001).
Fifth, the three leaders availed themselves of various material and erotic privileges that separated them from the rank-and-file membership. While this does not invariably occur in totalist groups (the Heaven’s Gate group is one exception), it is a common pattern (Pratkanis & Aronson, 2000). This separation is enhanced by the adulation directed at such leaders. The result is that the leader and close associates occupy a higher caste than the membership. These facts suggest that such leaders are not seen as just another group member, albeit a prototypical one. Hogg argues that such separateness eliminates the leader’s prototypic status, thereby forcing her/him to utilize coercive power over time. This did tend to occur in the Peoples Temple, and to some extent in Synanon as well. Note, however, that according to most accounts, there was relatively little overt defiance of control in Jonestown (Layton, 1998). It seems that Jones’ use of coercive methods had more to do with his own mental deterioration than with the need to maintain control over the followers. It is also important to note that although David Berg set himself well apart from the followers in the Children of God sect, he generally did not rely upon overtly coercive control tactics.
It would seem that leadership emergence, at least in the Peoples Temple and the Children of God, was not due to Jim Jones and David Berg happening to possess attitudes or traits that matched some prototypic standard. Instead, these leaders proactively specified for the group who the out-groups would be and what in-group norms would consist of. Although the loyalty and sacrifice exhibited by group members seem attributable to their intense social identification with the group, leadership in these two groups seems based more on power dynamics and charismatic features than on members’ admiration of individuals who happen to adhere most closely to prototypic norms. Our feeling is that this charismatic view of leader emergence will provide a good description and account of leadership in many totalist groups. Our reasoning is that such groups are most frequently the “creations” of single innovative leaders who are able to recruit followers based on the allure of their style and message. As such, the leader does not “emerge” from an existing group of individuals on the basis of matching a prototypic standard. Rather, the group exists because of the leader’s charisma and his or her skill at recruitment. In such “boutique” groups, leadership is not “decided upon” but rather is presented as a fait accompli. As a result, we feel the social identity view of leadership does not provide a particularly good explanation for leadership emergence in the totalist groups with which we are familiar. The social identity view may have more application in cases in which leadership passes from an original leader to a second or third generation of leaders. In addition, the extent to which the leader matches the group prototype may play a crucial role in leadership maintenance, in that such a match almost certainly contributes to the leader’s attraction and social power (Hogg, 2001).
We offer these observations with an obvious caveat. The case history descriptions we discuss above cannot constitute strong verification for any view. Problems of restricted sample size and selective sampling force us to offer our discussion more as illustration than as findings. We feel, however, that given the extreme and costly behavior evoked in totalist groups, even a descriptive consideration of such groups is provocative and worthwhile.
References
Abrams, D., Wetherall, M. S., Cochrane, S., Hogg, M. A., & Turner, J. C. (1990). Knowing what to think by knowing who you are: Self-categorization and the nature of norm formation, conformity and group polarization. British Journal of Social Psychology, 29, 97-119.
Baron, R. S. (1986). Distraction-conflict theory: Progress and problems. In L. Berkowitz (Ed.), Advances in experimental social psychology, Vol. 19 (pp. 1-40). New York: Academic Press.
Baron, R. S. (2000). Arousal, capacity and intense indoctrination. Personality & Social Psychology Review, 4, 238-254.
Baron, R. S., Inman, M., Kao, C. F., & Logan, H. (1992). Negative emotion and superficial social processing. Motivation and Emotion, 16, 323-346.
Baron, R. S., Vandello, J., & Brunsman, B. (1996). The forgotten variable in conformity research: The impact of task importance on social influence. Journal of Personality and Social Psychology, 71, 915-927.
Bass, B. M. (1998). Transformational leadership: Industrial, military and educational impact. Mahwah, NJ: Erlbaum.
Bodenhausen, G. V. (1993). Emotions, arousal and stereotypic judgments: A heuristic model of affect and stereotyping. In D. M. Mackie & D. L. Hamilton (Eds.), Affect, cognition and stereotyping (pp.13-37). New York: Academic Press.
Bond, R., & Smith, P. B. (1996). Culture and conformity: A meta-analysis of studies using Asch’s line judgment task. Psychological Bulletin, 119, 111-137.
Charity Frauds Bureau. (1974). Final report of the Children of God to Honorable Louis J. Lefkowitz, Attorney General of the State of New York. Unpublished report: State of New York.
Darley, J. M. (1966). Fear and social comparison as determinants of conformity behavior. Journal of Personality and Social Psychology, 4, 73-78.
Davis, D. (1984). The Children of God: The inside story. Grand Rapids, MI: Zondervan Publishing.
Dolinski, D., & Nawrat, R. (1998). “Fear-then-relief” procedure for producing compliance: Beware when the danger is over. Journal of Experimental Social Psychology, 34, 27-50.
Eysenck, M. W. (1977). Human memory: Theory, research, and individual differences. Elmsford, NY: Pergamon Press.
Florian, V., & Mikulincer, M. (1997). Fear of death and judgments of social transgressions: A multidimensional test of terror management theory. Journal of Personality and Social Psychology, 24, 1104-1112.
Forsyth, D. (1999). Group dynamics (3d ed.). Belmont, CA: Wadsworth.
French, J. R. P., Jr. & Raven B. (1959). The bases of social power. In D. Cartwright (Ed.), Studies in social power. Ann Arbor, MI: Institute for Social Research.
Galanter, M. (1989). Cults: Faith, healing, and coercion. Oxford and New York: Oxford University Press.
Gerstel, D. (1982). Paradise incorporated: Synanon. Novato, CA: Presidio Press.
Hinkle, L. E., & Wolff, H. G. (1956). Communist interrogation and indoctrination. Archives of Neurology and Psychiatry, 76, 115-74.
Hoffer, E. (1951). The true believer. New York: Mentor.
Hogg, M. A. (2001). A social identity theory of leadership. Personality & Social Psychology Review, 5, 184-200.
Hogg, M. A. (in press). Uncertainty and extremism: Identification with high entitativity groups under conditions of uncertainty. In V. Yzerbyt, C. M. Judd, & O. Corneille (Eds.), The psychology of group perception: Contributions to the study of homogeneity, entitativity and essentialism. Philadelphia, PA: Psychology Press.
Hogg, M. A., Hains, S. C. & Mason, I. (1998). Identification and leadership in small groups: Salience, frame of reference, and leader stereotypicality effects on leader evaluations. Journal of Personality & Social Psychology, 75, 1248-1263.
Hogg, M. A., & van Knippenberg, D. (in press). Social identity and leadership processes in groups. In M. P. Zanna (Ed.), Advances in experimental social psychology. San Diego, CA: Academic Press.
House, R. J., & Shamir, B. (1993). Toward the integration of transformational, charismatic and visionary theories. In M. M. Chemers & R. Ayman (Eds.), Leadership theory and research: Perspectives and directions. San Diego, CA: Academic Press.
Kelley, D. M. (1995). Waco: The massacre, the aftermath. First Things, May 1995.
Keinan, G., Friedland, N., & Even-Haim, G. (2000). The effect of stress and self-esteem on social stereotyping. Journal of Social and Clinical Psychology, 19 (2), 206-219.
Layton, D. (1998). Seductive poison. New York: Doubleday.
Leming, M. R., & Smith, T. C. (1974). The Children of God as a social movement. Journal of Voluntary Action Research, 3, 77-83.
Lifton, R. J. (1961). Thought reform and the psychology of totalism: A study of Brainwashing in China. New York: W. W. Norton.
Maaga, M. M. (1998). Hearing the voices of Jonestown. Syracuse, NY: Syracuse University Press.
National Public Radio. (1999, January 23). Father Cares. Weekly Edition.
Ofshe, R. (1980). The social development of the Synanon cult: The managerial strategy of organizational transformation. Sociological Analysis, 41, 109-127.
Pittman, T. S. (1975). Attribution of arousal as a mediator in dissonance reduction. Journal of Experimental Social Psychology, 11, 53-63.
Pratkanis, A. R., & Aronson, E. (2000). Age of Propaganda (2nd ed.). New York: Freeman.
Pool, G. J., Wood, W., & Leck, K. (1998). The self-esteem motive in social influence: Agreement with valued majorities and disagreement with derogated minorities. Journal of Personality & Social Psychology, 75, 967-975.
Reavis, D. J. (1998). The ashes of Waco: An investigation. Syracuse, New York: Syracuse University Press.
Sanbonmatsu, D. M., & Kardes, F. R. (1988). The effects of physiological arousal on information processing and persuasion. Journal of Consumer Research, 15, 379-385.
Sargant, W. (1957). Battle for the mind: How evangelists, psychiatrists, politicians, and medicine men can change your beliefs and behavior. Garden City, NY: Doubleday.
Schein, E. H., Schneier, I., & Barker, C. H. (1961). Coercive persuasion: A socio-psychological analysis of the “brainwashing” of American civilian prisoners by the Chinese communists. New York: W. W. Norton.
Singer, M., & Lalich, J. (1995). Cults in our midst. San Francisco: Jossey-Bass.
Weightman, J. M. (1984). Making sense of the Jonestown suicides. New York: Mellon.
Yee, M. (1997). Charles Dederich, founder of cult-like religious group Synanon, dies at 83. The Associated Press, March 5, 1997.
This article is reprinted with permission (print only) from Sage Publishing Ltd. The article originally appeared as Chapter 5, "Aberrations of power: Leadership in groups" in Leadership and power, edited by Daan van Knippenberg and Michael Hogg, Sage Publications, 2004.
Robert S. Baron, Ph.D., is Professor of Psychology at the University of Iowa. He has published widely on topics in group influence including papers on group polarization, conformity and indoctrination procedures. He (with Norbert Kerr) is the author of Group Process, Group Decision, Group Action. Requests for reprints and correspondence should go to Professor Robert Baron, E 11 Seashore Hall, University of Iowa, Iowa City, IA 52242
Kevin Crawley was, from 1980 to 1990, Co-Director of Unbound, Inc., a residential counseling center for former members of totalist groups. He wrote "Reintegration of Exiting Cult Members with their Families: A Brief Intervention Model," with Diana Paulina and Ron White. He is currently Interactive Specialist with the City of Iowa City.
Diana Paulina was, from 1980 to 1990, Co-Director of Unbound, Inc., a residential counseling center for former members of totalist groups. She has taught at the high school level. She wrote "Reintegration of Exiting Cult Members with their Families: A Brief Intervention Model," with Kevin Crawley and Ron White.