


Cultic Studies Journal, 1995, Volume 12, Number 1, pages 49-71


Expanding the Groupthink Explanation to the Study of Contemporary Cults

Mark N. Wexler, Ph.D.

Simon Fraser University


Burnaby, British Columbia


Abstract


Janis’s groupthink model is the most frequently used model in studying group decision making. This paper critically reviews Janis’s model and seeks to evaluate its applicability to the study of decision making in cults. Janis’s model is found wanting. It fails to look at (1) how cult leaders, through the use of ordeals, draw a loyal, elite group of decision makers about them, (2) how the decision elite within a cult use a mechanism of social control based on guilt, fear, or shame to create deindividuation in cult members, (3) how the decision elite are imbued with the virtue of infallibility and how this is used to create enthusiastic conformity in cult members, and (4) how the wild premises and erratic decision making in the cult are facilitated by the awe in which the cult members hold the charismatic leader.

Cults and their leaders have often engaged in risky, low-quality, and even pathological decision-making processes. Students of destructive and psychologically manipulative cults need look no further than the decision by members of the Aum Shinri Kyo (Supreme Truth) cult to release poisonous gas in the Tokyo subway system, the mass suicide in Jonestown, or the decision by David Koresh and his followers to remain behind their barricaded compound in Waco, Texas, to realize that decision making within and by cults often departs from the canons of rationality. In psychology and the policy sciences, Janis (1971, 1972, 1982) coined the “groupthink” concept to explain a particular kind of group pathology that he believes contributed to U.S. foreign policy fiascoes such as the Bay of Pigs invasion, the invasion of North Korea, and the escalation of the Vietnam War. Janis and those who followed in the study of decision pathologies within group contexts (Aldag & Fuller, 1993; Cline, 1990; Hart, 1990, 1991) have expanded and modified the initial groupthink explanation beyond the scrutiny of political policy analysts (Esser & Lindoerfer, 1989; Huseman & Driver, 1979; Moorhead, Ference, & Neck, 1991). As a result, the stage has been set for a model of how and why the decision-making process engaged in by cults and their members tends to be unreliable.

My purpose here is to clarify and expand the groupthink model in order to aid students of cults and cultlike processes in portraying the manner in which cults engage in faulty decision-making processes. To accomplish this, I will (1) review Janis’s original model of groupthink, (2) point out the difficulties in applying Janis’s model to decision making in cults, (3) modify the model, particularly the antecedents, to make it relevant to cultic studies, and (4) conclude with some suggestions for future research on groupthink in cultic studies.

Janis’s Groupthink Model

Groupthink refers to a restrictive mode of thinking pursued by a group that emphasizes consensus rather than a careful and realistic analysis of the decision alternatives. The fundamental problem underlying groupthink is the manner in which the group concedes to pressure to conform. Conformity in and of itself is not harmful; however, when it subverts the meaningful pursuit of issues and opinions relevant to the problem at hand, conformity can produce disastrous results. Janis calls this conformity concurrence seeking (see Fig. 1 at end of article). “Groupthink” is “a mode of thinking that people engage in when they are deeply involved in a cohesive group, when the members’ striving for unanimity overrides their motivation to realistically appraise alternative courses of action.... Groupthink refers to a deterioration of mental efficiency, reality testing, and moral judgment that results from in-group pressures” (Janis, 1982, p. 9).

As an example of the groupthink phenomenon, consider an historical instance used by Janis (1982): the Kennedy administration’s decision to invade Cuba at the Bay of Pigs in 1961. As a policy decision undertaken by a group, the Bay of Pigs fiasco ranks among the worst blunders ever enacted by an American administration (Wyden, 1979). In detailing this and other faulty group decisions, Janis singled out eight main symptoms of groupthink: the group’s possession of (1) an illusion of invulnerability, (2) a faulty grasp of its own moral principles, (3) the skills for intellectualization and rationalization, (4) advanced capabilities in stereotyping others, (5) a willingness to self-censor, (6) a desire to act with unanimity, (7) the ability to put direct pressure on dissidents, and (8) reliance on self-appointed mind guards to maintain the belief system of the group. These eight symptoms were used by groups to sustain an esprit de corps during rough times but, as Janis was quick to point out, also facilitated a rapid decline in the group’s ability at reality testing. Consequently, decision groups that possess characteristics of groupthink are highly cohesive yet extremely prone to poor decision making.

The problem at the root of Janis’s notion of groupthink is the high degree of group cohesiveness, which leads to concurrence seeking, or a selective and group-approved version of reality. Within Janis’s thinking, concurrence-seeking tendencies in groups are dependent on three antecedent conditions (see Fig. 1, at end of article). The primary condition necessary for concurrence thinking is the emergence of a highly cohesive group. Group cohesion is defined as the result of all forces that work to hold group members together (Deutsch & Krauss, 1965, p. 56). It is a measure of the members’ desire to remain together as a group. The group takes on an importance which in effect means that it can make demands on the individuals who comprise it.

Janis notes that “when group cohesiveness is high, all members express solidarity, mutual liking and positive feelings about attending meetings and carrying out the routine tasks of the group” (Janis, 1982, p. 4). Janis reasoned that groups with this “high cohesiveness” were more susceptible to pressures toward conformity. A critical consequence of high cohesiveness seems to be a reduction in group conflict as the direct result of the members’ desire not to rock the boat. Janis seemed to have intuited the growing importance of “teams” and “teamwork” in contemporary organizations (Guzzo & Salas, 1995; Katzenbach & Smith, 1993; Kinlaw, 1991). The very cohesiveness of these teams, while surely a virtue, may in the long term result in organizations housing groups (teams) with a propensity toward risky decisions with a low probability of successful outcomes.

The second antecedent condition (labeled B-1 in Fig. 1) relates to four structural characteristics or failings in the organization housing the decision-making group. “These structural features,” writes Janis, “can be regarded as moderating variables involving the presence or absence of organizational constraints that could counteract the concurrence-seeking tendency” (1982, p. 301). The existence of these structural flaws--group insularity, directive leadership, group homogeneity, and a lack of clear norms--detracts from the group’s ability and desire to realistically search for information critical to the decision at hand. All four work to isolate the group and provide it with a strong sense of its own importance and centrality. This insularity is aided by a highly directive or charismatic leader who fixes attention upon issues for the group as a whole. This attention is relatively easy to fix, as the group either starts with a shared value system or ideology or quickly evolves this homogeneity in beliefs and outlook. This homogeneity, however, comes to substitute for both critical thinking and clear procedures for what to do when members feel threatened.

Because of these structural failings, the group with a propensity toward groupthink falls easy prey to what Janis terms “provocative situational contexts.” In Janis’s model, groupthink does not easily occur in routine situations involving simple or unequivocal decisions. Rather, the chances of groupthink dramatically increase when cohesive groups (Fig. 1, A) with structural faults (Fig. 1, B-1) find themselves under stress from external sources in a crisis situation (Fig. 1, B-2). Crisis generates uncertainty and stress. Members of cohesive groups seek to utilize the group as a means of producing clarity (solutions) and reducing the anxiety associated with stressful situations. When highly cohesive, structurally flawed groups enter crisis situations, the members seek to reduce stress and uncertainty by blaming nongroup members or false group members. This turning of blame outwards, or alternatively the beginning of witch hunts to purge the group of false members, creates a belief in the vulnerability of the group.

As the perception of group vulnerability increases, members begin to question themselves. The lowering of self-esteem which arises during the group crisis is owing not only to the members’ inability to master the problems they face, but also to the high value they place on their membership. Hart, in a recent analysis of Janis’s “classic” study, points out how “valuing the group higher than anything else ... causes them to strive for a quick and painless unanimity on the issues that the group has to confront” (Hart, 1991, p. 247). During crisis, this striving for a quick and unanimous form of decision making is achieved only when the group is willing to submit to the leader’s views. During what Janis terms “provocative situational contexts,” members of groups with a proclivity to groupthink suppress personal and moral doubts about the group’s chosen path and adhere closely to the group’s wishes. Members suppress their own views because of their diminished confidence in self and substitute in their place the group’s views as the means of saving the day.

The symptoms or consequences of groupthink can be divided into three characteristics (see Fig. 1, C). First are those that produce an overestimation of the power of the group. This includes the group’s illusion of its own invulnerability and a belief in its own inherent morality. Despite the crisis or provocative situational context, the members affirm their membership and do so with the assertion that this is rational. After all, the group is invulnerable and it has the answers to impenetrable, morally complex issues.

Second, Janis addresses characteristics producing closed-mindedness in group members. Foremost amongst these are collective rationalizations and firmly held, often stereotyped images of out-groups.

Third, Janis explicitly focuses on those characteristics producing compelling pressures toward uniformity. These include self-censorship, illusion of unanimity, direct pressures upon dissenters, and the existence of group-appointed mind guards.

The results of such a decision-making process are simple to discern: defective or wild decisions (see Fig. 1, D). The policy makers in Janis’s studies of real-life events--the group that prepared the policies of the U.S. Navy at Pearl Harbor in December 1941; the group composed of President Truman and his advisors who made the decision to pursue the defeated North Korean Army on its home territory; President Kennedy and his advisors’ decision and plan to invade Cuba at the Bay of Pigs; and President Johnson and his team’s decision to continue and escalate the Vietnam War--are provided as evidence of decision-making fiascoes caused by unbridled concurrence thinking. Closed-minded, stereotyped, overconfident, and morally exempt decision makers are, Janis illustrated, highly unlikely to deal realistically with complex decision scenarios.

In accordance with Janis’s model (see Fig. 1, D), the symptoms of defective decision making by groups with a propensity toward concurrence seeking are couched in procedural terms. Janis views the six key defects as the group’s failure to (1) engage in a careful and complete survey of decision alternatives, (2) complete a survey of the group’s aims and objectives prior to making the decision, (3) critically highlight and reexamine the group’s preferred choice, (4) engage in a robust information search, (5) remain impartial and clear-headed in processing information, and (6) develop a contingency model in which modifications to the decision are made dependent on alternate scenarios. In the groupthink model, groups that make risky, poor-quality decisions do so not only because they are cohesive, structurally flawed, and in provocative or stressful situations, but also, because of these antecedents, group members fail to follow basic procedural logic for making high-quality decisions.

The groupthink model initially developed by Janis and reviewed in Figure 1 has not only gained widespread acceptance in academic studies, but the groupthink concept has become common in everyday parlance as well. Aldag and Fuller, in a review of the idea of groupthink for professional psychologists, write that “in recent years, acceptance of the groupthink phenomenon has become almost universal, and the term groupthink has entered the popular vocabulary” (1993, p. 533). To support their claim, Aldag and Fuller point out that the Social Sciences Citation Index showed a phenomenally high entry of more than 700 citations of Janis’s work for the period of January 1989 through June 1992. In a more popular vein, articles focusing primarily on the dangers of groupthink regularly appear in such outlets as Psychology Today (Janis, 1971), Nation’s Business (Cerami, 1972), and Christianity Today (Buss, 1993) and in such professional or applied outlets as the Canadian Medical Association Journal (Henderson, 1987), the Journal of Nursing Administration (Rosenblum, 1982), Supervision (Sanders, 1980), and the Journal of Business Ethics (Sims, 1992).

The broad diffusion of Janis’s model as the model that explains decision-making fiascoes by groups is caused by four factors. First, the model is plausibly grounded in real-life situations like the Bay of Pigs invasion or, more contemporaneously, the Irangate debacle endured by the Reagan administration (Hart, 1990). Second, it has been argued that the widespread acceptance of the groupthink model arises from its considerable heuristic value. Aldag and Fuller (1993) argue that Janis’s groupthink model, much like Maslow’s need hierarchy, provides a highly generalizable but precise and deterministic sequence of stages that can suit many circumstances.

A third reason for the extensive diffusion of Janis’s model is its counterintuitive insistence that group cohesiveness can and does lead to a particular kind of group pathology. Several social analysts (McCauley, 1989; Neck & Manz, 1994) make it clear that Janis’s model hits upon an ambivalence buried deep in the struggle between the ideologies of individualism harkening back to the frontier thesis and the newly emergent heralding of groups or teams as the salvation of a crumbling American competitiveness in industry. The view of groupthink as an undesirable phenomenon, its very name evocative of Orwellian “doublethink,” suggests that Americans have much to fear in losing their hard-won individualism in the face of what, in Janis’s hands, can turn out to be the tyranny of a cohesive and overly conformist majority.

The fourth and last reason for the extensive diffusion or success of the groupthink model is related to the third reason: the tension in the American psyche between critical forms of individualism and a longing for the productive efficiency and security of cohesive teams. Janis’s indisputable view that groupthink is an undesirable phenomenon is modified by Janis’s rather practical suggestion that there exist simple but strong and useful techniques to minimize the dysfunctional consequences of groupthink. The most pertinent among these are the following: the group leader should facilitate the group members’ airing of doubts and objections; leaders should be impartial and refrain from making their personal preferences explicit at the outset of the group’s inquiries; members should be encouraged to discuss the group’s deliberations with trusted associates outside the group and to report back their reactions; different outside experts should be brought in from time to time to challenge the views of members; there should be one or more devil’s advocates during group meetings; and lastly, a second-chance meeting should be held to reconsider the decision once it has been reached but before it is put into practice or made available to the public.

These suggestions place Janis’s model in the advantageous position of coming down favorably on both sides of the aforementioned ideological struggle. It is, to be sure, Janis’s genius that has positioned the groupthink model as both a group pathology and a solution or means of minimizing the pathology. In this positioning, Janis’s model overtly declares itself a champion of the alert critical individual, all the while touting the need for effective and cohesive teams to make important decisions. It is clear that Janis never recommends that nonroutine, complex decisions be made by the individual and not the group; his is a fear of conformity and therein the loss of a clear grasp of reality, not a romantic proclamation of unbridled or even enlightened despotism.

Difficulties in Expanding Groupthink to Cultic Studies

The success of Janis’s groupthink model, I am arguing, comes from its realism, generalizability, counterintuitiveness, and ideological mincing. At first blush, students of cults may be tempted to import Janis’s groupthink model intact. It is, after all, generalizable. It addresses erratic decision making in highly insulated, overly cohesive, charismatically led, value-charged, secretive groups which often find themselves in provocative or stressful situations. But caution and, in my view, care are required before accepting Janis’s model of groupthink as adequate to the task of explaining decision-making processes in cults. There are four key assumptions in Janis’s model which I believe require scrutiny before using the groupthink model, unchanged, to explain wild decision making done by and within cults (Galanter, 1989).

As Aldag and Fuller (1993) make clear, Janis makes two assumptions, both of which fit the policy advisory groups he studied but do not apply equally well to groups generally. The first assumption is that the primary purpose or raison d’être of the group is problem solving, which at times Janis calls decision making. The second assumption is that in Janis’s groups it was clear who was supposed to make decisions. These assumptions do not hold well at all when we move from decision-making groups like political policy advisors to members of a cult.

Cults do not exist primarily to make decisions. The fact that decisions are made by cults seems incidental. This issue is vital--for Janis’s logic, when driven to its essence, is that groups whose primary aim is to make high-quality decisions fail to do so when the group becomes overly preoccupied with concurrence-seeking behaviors. However, it is not clear that groups whose primary aim may be the socialization of individuals to new belief systems fail to accomplish their primary aim when they engage in concurrence seeking. Indeed, one can argue that the wild and erratic decisions made by cults are an unintended consequence of the cults’ effort to pursue concurrence seeking as a means of socializing members to a new belief system.

The second assumption inherent in Janis’s groupthink model which is not easily transferable to the study of cults is that we all know who ought to make decisions in the group. Janis’s focus on clearly demarcated or explicitly bounded advisory groups bolsters its heuristic scope and the apparent clarity of its findings by assuming that the group is a simple task unit, not a complex coalition of neophytes, veterans, and elites. Indeed, Janis not only assumes that we all know who is in the decision-making group, but that within the group there are really only two classes of players or decision makers: the leader and the followers. The transfer of this assumption to cults is problematic.

There are no officially recognized problem-solving groups in a cult. Government bureaucracies and corporate structures clearly designate decision-making functions, assigning responsibility to particular departments or groups; the same cannot be said for cults. Cults are opaque. The secrecy that shrouds them is part of their adaptation to a world they believe to be hostile to their beliefs. The very ideological and/or religious underpinnings of cults as formal organizations often muddy efforts to locate and understand precisely which group of men or women within a cult are responsible for its key decisions.

Then, too, the decision-making group within cults cannot be as easily divided into leader and follower groupings as can the policy advisory groups that inform Janis’s groupthink model. In the policy groups, the hierarchy of expertise, following from the first assumption, is clearly rooted in decision-making responsibility. Thus, to have power within Janis’s decision-making or problem-solving groups is to have decision power. To have power in cults is to have influence not only over the decision-making processes of the cult members, but also referent power over many of the beliefs of cult members and their families. Referent power refers to the desire of members to be like or model after the charismatic head of the cult.

This general influence over members rather than the direct decision power that prevails in Janis’s advisory groups means that charismatic cult heads often influence their members in varying degrees. Not all followers are equally committed. The class structure within a cult is complex, and factors like proximity to the charismatic head or the select members of the cult mean that the simple distinction between leader and follower used in Janis’s advisory group must be attenuated and filled in. There is the cult leader, the cult elite or the leader’s key figures, the senior cult members, recent members, probationary members, and aspirants. This gradation of follower classes in a cult requires at least the recognition that to study groupthink within a cult, one must be able to identify the members who make up the cult’s decision elite.

The third assumption on which the groupthink model is buttressed arises out of the distinction between economic and social psychological models of decision making by groups of individuals. The economists’ models, essentially following the work of Arrow (1951), utilize an axiomatic approach to group decision making nested within a framework of either social choice or game theory. While this literature is rooted in the assumptions of economics, the implications are political and strategically self-serving. Certain players capitalize on their knowledge of the game and its structure to serve interests which may be hidden from others within the group.

On the other hand, social psychological models like Janis’s ignore the strategic and intentionally manipulative aspects of the manner in which group decisions are structured and focus on the unintended consequences of unbridled social influence, particularly excessive or overpowering group identity. Thus in Janis’s groupthink model, the excessive group cohesiveness, structural faults or lapses, and stressful or provocative context are not orchestrated by some at the expense of others. Rather, in Janis’s groupthink model, these antecedents evolve. They are not directed by the cunning of a few.

In cults there is, as students of cults well know, an ongoing and at times shrill debate about whether cults are well-intentioned experiments in our species’ quest for a more fulfilling or spiritual path, which at times unintentionally go astray, or whether cults are traps designed by the cunning to catch unsuspecting seekers. Janis’s model suits the former designation, but fails to capture the latter. Thus the topic of brainwashing or deprogramming of cult members (Dubrow-Eichel, 1989; Langone, 1993) is possible in the strategic model, but actually is circumvented in social-psychological models. In the latter, as in Janis, the outcomes of group membership are not planned or desired by some, but rather emanate as unintended consequences of group processes gone wrong.

While I cannot say beyond a shadow of doubt which of these two models is superior, I can unequivocally point out that Janis’s groupthink model cannot account for both points of view. As a consequence, we may, when using the groupthink model to explain decision making in cults, find that the absence of strategic behaviors does not permit an open scrutiny of all possible explanations.

In line with the evolution rather than the conscious manipulation of the antecedents in Janis’s groupthink model is the fourth and final assumption--namely, that group problem solving or decision making is and ought to be a rational, linear, and scientific-like set of procedures. In Janis’s model, this rational process entails the selection of articulated goals, the gradual and careful elimination of less than maximizing alternatives, and a thoroughgoing information search. Janis employs a notion of decision-making rationality which, he insists, not only lessens the possibility of going astray in the making of decisions, but also is the one actually used in healthy or functional decision-making groups. One may not want to quarrel with the assertion that scientific-like procedures in decision-making processes by groups lessen the probability of error; however, it is a far jump from this assertion to the assumption that healthy groups actually use these procedures. Janis has, I feel, articulated a notion of decision-making rationality which is too unrealistic to be helpful in most applied settings, never mind in the area of cults. Indeed, Janis’s depiction of decision making, while a valued classroom heuristic, may be used, even in very healthy decision groups, far less than Janis leads us to believe.

One thing is certain: when groups use retrospective sense making to tell others how they made their decisions, they will, on the whole, overemphasize the rational and sequential stepwise way in which they arrived at the decision. This “hindsight bias” will, for obvious reasons, be most pronounced when the outcome of the decision is aligned with the group’s goals or, stated another way, when the problem confronted by the problem-solving group is solved. However, cults are not problem-solving groups. Moreover, cults make, at least to nonmembers, weak rationality claims. Groups with a strong set of rationality claims are those that have as their aims future goals highly desired by the dominant society or mainstream and attempt to achieve these with approved-of and efficient means or processes. Since cults by their very definition march to a different drummer and seem rather proud of this fact, it would be odd to use the canons of rationality so evident in Janis’s groupthink model to make sense of cult decision making.

Modifying the Groupthink Model

To meet the problems of cults and cultic decision making, serious modifications in Janis’s groupthink model are required. Moorhead and Montanari (1986) point out that empirical studies on varying groups are in agreement on just one thing: the antecedents--group cohesiveness, structural faults, and the existence of a crisis or provocative context--are the weakest part of the model. This, I believe, is particularly so if we seek to apply Janis’s groupthink model to understand erratic decisions made in cults. In Figure 2 (“Janis’s Groupthink Model for Cult Studies”) I outline four key areas where modification of the original groupthink model (see Fig. 1) is required to address the four difficulties discussed in the previous section.

Janis defined groupthink as “a mode of thinking that people engage in when they are deeply involved in a cohesive group, when the members’ strivings for unanimity override their motivation to realistically appraise alternative courses of action” (1972, p. 9). As Longley and Pruitt (1980) point out, this definition is confusing not only because it is tautological or defines the process itself by its antecedents, but also because it is so insistent that group cohesiveness explains so much of the process. Indeed, upon closer scrutiny it becomes apparent that, although Janis leans heavily on the concept of group cohesiveness to make his case, there is little in the analysis that explores how group cohesiveness works in producing concurrence thinking and, most important, whether it works similarly in all cohesive groups.

The vagueness of Janis’s use of the group cohesiveness explanation becomes extremely problematic when we come to cults. First, Janis assumes, based on his field studies, that the actual decision-making group is easy to identify. After all, the group is extremely cohesive and marks its boundaries well. This is problematic with cults. The cult is easy to identify, but the group within it that has the power to make decisions for and in the name of the cult is much more difficult to identify. In cults, a small elite--one might say a power elite--composed most typically of a charismatic leader and his or her selected advisors composes the decision-making group. For reasons not unlike that of many political policy advisors, secrecy is often utilized to protect the decision power elite from outside scrutiny.

Janis never focuses on the privilege and the power of being one of the selected members of the power elite and how the privilege of restricted membership may go to the heads of many individuals and aid them in subduing their critical impulses. In cults, the power elite who make the key decisions are true believers. They have risen or been selected to their central position because they have proven to other members of the elite decision group that their central life interest is the cult and its pervasive belief system. The cohesiveness of the decision elite is one acquired during tests of loyalty and typically a long and arduous socialization period. The key to becoming an elite member in a restricted group rests in the ability to pass “ordeals.” As Nock (1993) notes, an ordeal is a form of ritual used to determine whether or not an individual is trustworthy and truthful in the ways of the group. “Through an ordeal,” writes Nock, “people are able to validate their reputation, to garner proof of the validity of their claims ... indeed by ‘passing’ an ordeal, one establishes or sustains a claim to membership in the group endorsing the ordeal” (Nock, 1993, p. 15).

Janis’s neglect of the ordeal of becoming a part of the decision elite is rooted in two issues. First, Janis is dealing with expert groups. These “policy advisors” utilize their credentials as ordeal surrogates. These credentials, say from Harvard or the Sorbonne, establish the reputations and trustworthiness of the advisors. Second, Janis fixes his attention on the external crisis or “problem” that confronts the advisory group and not the “problem” of becoming a member of the elite. These two issues must be modified in adapting the orthodox groupthink model. The charismatic leader in cults uses ordeals to test the loyalty, commitment, and resilience of the belief system of those who would be decision elites.

This is a far cry from the use of credentials from Harvard or the Sorbonne as a test of one’s membership. The expert group develops a clubbiness and internal worldview based upon its belief that it truly represents the best thinking the world has to offer. This complacent, noncritical attribution of “world classness” is, to a large degree, shared by the world. This is precisely why we are puzzled by policy advisors who escalate bad decisions or throw good money after bad. It is why the policies surrounding the Bay of Pigs, Watergate, Irangate, and the escalation of the Vietnam War puzzle us.

In cults, the development of cohesiveness takes a different route. The ordeal is conferred upon the individual by the cult in the direct name of the charismatic leader. The test or ordeal is not that of Harvard or the Sorbonne, but that of the cult. It embodies little of what sociologists call universalistic criteria; rather, it is particularistic. It bonds the individual to the cult and only to the cult. It demands that the cult become the central life interest and primary nexus of its members. This, of course, is intensified for those members who seek access to the cult’s decision elite.

Deindividuation, the second major modification of Janis’s groupthink model, replaces the structural faults (see Fig. 1) as an antecedent to concurrence thinking. According to Zimbardo (1970), deindividuation is an intrapsychic state characterized by an absence of self-awareness and a tendency to engage in uninhibited activities. Within the framework of explanations of collective behavior (Rose, 1982; Turner & Killian, 1987), deindividuation is seen as arising when individuals become full-fledged members of a new group, ideology, or belief system, shedding old behavior guidelines and adopting a set of emerging norms. These fledgling norms enable individuals, within the context of the new group, ideology, or belief system, to engage in behaviors otherwise precluded by conformity to their previously held norms.

Janis does not discuss deindividuation in his advisory policy-making groups. He attempts to explain concurrence seeking as a gradual evolution of conformity to group norms without making clear what diminishes the integrity of an individual’s norms of criticality and attention to detail. His work assumes that a provocative situational context, replete with stress, moral quandaries, and lowered self-esteem, threatens individuals and cows them into submission to the group. This process is aided by the fact that Janis’s groups comprise specialists in the same area of expertise, drawn from the same social class, and often educated to act in the same political and social contexts.

In cults, deindividuation is rooted deeply in the raison d’être of the cult. It is not a by-product of situational, contextual, or social factors that happen to coalesce. Cults exist to provide individuals with a change in their belief systems, behaviors, and locus of control. Cults cut new members off from contact with their families and earlier friends; focus attention on noncult members as dupes, lesser beings, or even, at times, the enemy; and move new members about geographically in order to minimize social anchors. The cult, it is clear, practices social control over its members. It does so by mobilizing knowledge of the individual’s past--the tools are guilt, fear, and shame--to create the image of a far more attractive future. This future, so the true believer is convinced, can only be attained by submitting to the norms of the group as a whole.

These norms are set by the decision-making elite. Here is where we locate the third major modification to Janis’s groupthink model. Once deindividuation has been achieved, the individual can be trusted to make decisions for the group as a whole. The key here is that trust is extended to individuals once they have proven that they no longer merely comply with the norms of the cult but enthusiastically conform to the group’s agenda. Monitoring and close supervision of new members diminish as this conformist enthusiasm flowers. The growth of conformity moves along a continuum from desire to attain the proposed goals of the cult, to fear of punishment for transgressing the norms of the cult, to compliance with the norms and, finally, to enthusiastic conformity to the infallible virtues of the cult’s elite and, of course, its leader.

In Janis’s model, the continuum from compliance to enthusiastic conformity is not made explicit. This is because Janis is interested solely in the premature group consensus that emerges in groupthink. This consensus is a by-product of dysfunctional social influences that distort and lower the quality of what, in Janis’s view, is the true function of the group: making high-quality rational decisions.

In cults, the premature consensus is more complex: students who treat the cult as a decision-making group fail to make explicit that consensus is the goal of the cult and decision making the by-product. The continuum from early member compliance to, in time, enthusiastic conformity can be called the socialization process of successful cults. Cults insist upon a single, correct, and clear path along which all members progress in order to become more central to the cult. This single path, and progress along it, reduces the gnawing complexity of issues for those who move through early compliance to enthusiastic conformity. The path provides acolytes with a sense of direction, righteousness, and superiority, all to varying degrees either absent or questioned in the members’ precult lives.

This internalization of full and enthusiastic conformity with the single path of cult life gives rise to the fourth modification of Janis’s groupthink model. The decision-making elite within the cult, composed of the leader and his or her entourage of true believers, is neither accountable to nor monitored by cult members. The highly insulated and homogeneous decision-making group is adored or held in awe by followers. The leaders’ views of the world are to be learned and repeated, not parsed, examined critically, or held up to tests of rationality. Janis’s groups of advisors have as their central occupational preoccupation the task of critically examining information. The task of the cult elite is to win enthusiastic conformity from members of the cult. To accomplish this, the cult elite has a vested interest in exacerbating the distinctiveness, and thereby the superiority, of cultic behaviors and in disparaging the apparent blind conformity of noncult members. The cult elite must establish that noncult members conspire to interfere with the solidity, beauty, and truth of the single path offered by the cult. The world’s (noncult members’) indifference to the cult would, from the point of view of the cult elite, provide little opportunity for increasing the membership base. To grow and capture the eye of potential members, the cult must confront conventional wisdom.

When the cult withdraws into this confrontative stance, with members applauding and aching to hear each and every word of the leader and his or her emissaries, decision making becomes detached from reality. There are no checks and balances here. Adoration abounds within; confrontation pervades from without. Members take the confrontation as proof that the cult requires a break from the standard canons of rationality. Faith, it can be argued, roots itself most firmly when adversity is present but can, with full commitment to the cult, be kept at a distance. The internal adoration affirms and reinforces decisions and behaviors that, over time, depart further and further from the rational canon.

The result is decision-making impulsivity through concurrence in cults. The dynamics of the groupthink model are altered to suit the psychosocial dynamics of cultic life. The risky decisions that emerge from cults can, I contend, be comprehended through an application of Figure 2, amended for the idiosyncrasies of specific cults, rather than through a wholesale importation of Janis’s model as depicted in Figure 1. The study of cults is important in that they are a microcosm of influence processes. In Janis’s reliance on cohesiveness we have, I believe, the kernel of a truth that must be attenuated and nuanced before it can be relied on as an explanation of the decision-making processes in cults.

Conclusion

Modified, the groupthink model pioneered by Janis is an excellent starting point for understanding the decision-making process within cults. The expanded model focuses attention on the manner in which ordeals, deindividuation, and enthusiastic conformity create the conditions for a decision elite within a charismatic cult to enter fully into the conditions of concurrence seeking outlined by Janis in his groupthink model. These conditions are facilitated by the fact that cult members do not monitor the decision-making abilities of the cult’s elite.

In this highly insulated environment, fueled by a belief in its own superior value system, the cult’s decision elite is driven, particularly when it feels threatened by noncult members, to impulsive decisions. The impulsive decisions are grounded in wild premises and conspiratorial conjectures. The cult does not look closely or analytically at the consequences of its decisions, believing that it is immune to negative consequences. This perceived immunity is bolstered by the cult’s belief that it has virtue and the strength of the “true” power(s) on its side. Indeed, negative consequences may be interpreted as small setbacks, tests, or rewards in themselves. The ability of cults to frame and interpret events, even negative consequences emanating from cult decisions, is not to be underestimated.

The student of cultic decision making is exploring a phenomenon that may be at odds with the ideology or religious roots of many citizens. Members envision the cult as no mere assemblage of decision makers, but as a holy quest. The cultic path is more than the sum of past decisions; it is an acting-out of the very forces scripted deeply into the adherents’ behavioral repertoire. Decision makers may be seen as interpreters or conduits of the “true way.” In these contexts, Janis’s antidotes to groupthink hold little chance of reducing the incidence of groupthink. While it may be possible to introduce a devil’s advocate or critical discussion groups in formal policy-making groups, it is not possible in cults. Cults do not seek to make sense of and integrate critics’ voices. Theirs is a worldview that assumes that its very superiority is proven by the persistent existence of vociferous critics.

References

Aldag, R.J., & Fuller, S.R. (1993). Beyond fiasco: A reappraisal of the groupthink phenomenon and a new model of group decision processes. Psychological Bulletin, 113, 533–552.

Arrow, K. (1951). Social choice and individual values. New York: Wiley.

Buss, D.D. (1993, September 13). Parents edgy because of classroom groupthink. Christianity Today, 37, 52–54.

Cerami, C.A. (1972, December). Group thinking: A pitfall for any company. Nation’s Business, 58–60.

Cline, R.J. (1990). Detecting groupthink: Methods for observing the illusion of unanimity. Communication Quarterly, 38, 112–124.

Deutsch, M., & Krauss, R.M. (1965). Theories in social psychology. New York: Basic Books.

Dubrow-Eichel, S.K. (1989). Deprogramming: A case study. Part 1: Personal observation of the group process. Cultic Studies Journal, 6, 1–117.

Esser, J.K., & Lindoerfer, J.S. (1989). Groupthink and the space shuttle Challenger disaster: Toward a quantitative case analysis. Journal of Behavioral Decision Making, 2, 167–177.

Galanter, M. (1989). Cults: Faith healing and coercion. New York: Oxford University Press.

Guzzo, R.A., & Salas, E. (1995). Team effectiveness and decision making in organizations. San Francisco, CA: Jossey-Bass.

Hart, P.T. (1990). Groupthink in government: A study of small groups and policy failure. Amsterdam, The Netherlands: Swets and Zeitlinger.

Hart, P.T. (1991). Irving L. Janis’ victims of groupthink. Political Psychology, 12, 247–278.

Henderson, J. (1987). Getting rid of groupthink--Let’s change the legislative process. Canadian Medical Association Journal, 136, 881–883.

Huseman, R.C., & Driver, R.W. (1979). Groupthink: Implications for small group decision making in business. In R.C. Huseman & A.B. Caroll (Eds.), Readings in organizational behavior: Discussions of management actions. Boston: Allyn & Bacon.

Janis, I.L. (1971, November). Groupthink. Psychology Today, 43–46, 74–76.

Janis, I.L. (1972). Victims of groupthink. Boston: Houghton Mifflin.

Janis, I.L. (1982). Groupthink (2nd ed.). Boston: Houghton Mifflin.

Katzenbach, J.R., & Smith, D.K. (1993). The wisdom of teams: Creating the high performance organization. Boston: Harvard Business School Press.

Kinlaw, D.C. (1991). Developing superior work teams: Building quality and the competitive edge. Lexington, MA: Lexington Books.

Langone, M.D. (Ed.). (1993). Recovery from cults: Help for victims of psychological and spiritual abuse. New York: W.W. Norton.

Longley, J., & Pruitt, D.G. (1980). Groupthink: A critique of Janis’ theory. In L. Wheeler (Ed.), Review of personality and social psychology. Newbury Park, CA: Sage.

McCauley, C. (1989). The nature of social influence in groupthink: Compliance and internalization. Journal of Personality and Social Psychology, 57, 250–260.

Moorhead, G., & Montanari, J.R. (1986). An empirical investigation of the groupthink phenomenon. Human Relations, 39, 399–410.

Moorhead, G., Ference, R.J., & Neck, C.P. (1991). Group decision fiascoes continue: Space shuttle Challenger and a revised groupthink framework. Human Relations, 44, 539–550.

Neck, C.P., & Manz, C.C. (1994). From groupthink to teamthink: Toward the creation of constructive thought patterns in self-managing work teams. Human Relations, 47, 929–939.

Nock, S. (1993). The costs of privacy: Surveillance and reputation in America. New York: DeGruyter.

Rose, J.D. (1982). Outbreaks, the sociology of collective behavior. New York: Free Press.

Rosenblum, E.H. (1982). Groupthink: One peril of group cohesiveness. Journal of Nursing Administration, 12, 27–31.

Sanders, B.C. (1980). Avoiding the groupthink zoo. Supervision, 42, 10–13.

Sims, R.R. (1992). Linking groupthink to unethical behavior in organizations. Journal of Business Ethics, 11, 651–662.

Turner, R., & Killian, L.M. (1987). Collective behavior. Englewood Cliffs, NJ: Prentice-Hall.

Wyden, P. (1979). Bay of Pigs: The untold story. New York: Simon & Schuster.

Zimbardo, P.G. (1970). The human choice: Individuation, reason and order versus deindividuation, impulse and chaos. In W.J. Arnold & D. Levine (Eds.), Nebraska symposium on motivation. Lincoln: University of Nebraska Press.

*******************************************

Mark N. Wexler, Ph.D., is Professor of Policy Studies and Director of Research in the Faculty of Business Administration at Simon Fraser University in Burnaby, British Columbia, Canada. Wexler’s research focuses on manipulative persuasion used in corporate contexts.