Avoiding The Extremes in Defining The Extremist Cult

Cultic Studies Journal, 1984, Volume 1, Number 1, pages 37-62.

Stephen M. Ash, Psy.D.


A polarization of opinion regarding the nature of cult conversion has resulted in considerable confusion about the proper definition of a cult. This paper investigates this problem by critically examining the view that cults are no different from other religious groups (the “pro-cult” position) and the view that cults indeed pose special problems for society (the “anti-cult” position). The latter view is further analyzed in terms of two metaphors, the “possession” metaphor and the “deception” metaphor, the latter of which is seen as the most balanced and accurate of the three positions. Criteria (presented in a format based upon DSM-III) for defining an extremist cult are proposed.

Introduction

Since Jonestown the public has seen a virtual explosion of interest in cults. Unfortunately, a very large part of the professional mental health literature regarding this phenomenon appears to have been relegated to philosophical debate or seems to have been written from a biased perspective. In Marjory Zerin’s recent review of the book Cults and The Family (Kaslow and Sussman, 1982) in the Cultic Studies Newsletter (1982), she commended the editors for taking into account the “polarization” [which] has characterized much discussion to date concerning the cult phenomenon by including “contributions reflecting a spectrum of perspectives vis-à-vis cults from the ‘anti-cult’ bias…to the ‘pro-cult’ positions,” and even including two works which “fall somewhere in between with their more neutral stance” (p.7). This polarization has not only skewed research, but has, on the one hand, led parents to overreact and therapists to make unethical compromises, and, on the other hand, has led to ineffective therapy and even outright denial of genuine problems.

This paper is an attempt to address this problem of extreme philosophical presuppositions in the study of the cult phenomenon by reviewing the literature and then offering five propositions to help systematize our thinking and bring our presuppositions into line with the reality suggested by the literature. Finally, a definition of an extremist cult is offered by which it may be differentiated from noncultic religions and possibly nonextremist cults as well.
Background – “Religious Wars of the Seventies”

Following the tremendous rise of the cults in the sixties and seventies came an expected backlash reaction to them from orthodox religions and parents who had lost their children to the new religious groups. Shupe and Bromley, in their recent sociological analysis of the opposition to the cults (The New Vigilantes, 1980), have termed the “loose network of regional organizations” opposed to the cults “the anti-cult movement (ACM)” (p.25). This network, they say, “emerged as two interdependent but distinct components…which we have termed, respectively, the anti-cult associations and the deprogrammers” (p.13, italics the authors’). Their term “anti-cult movement,” or ACM, has been used elsewhere by Shupe in his writings (cf. Shupe and Bromley, 1978; Shupe, Spielmann and Stigall, 1980), as well as by Beckford (1979) in a separate analysis in England, “Politics and the anti-cult movement.”

Not all those who have written on the ACM appear to be objectively neutral in their analysis. In his critique of this counterattack on the ACM, “Cult/Countercult: Is either side fighting fair?” Enroth (1979b) spoke of “civil libertarians” who, in their defense of religious freedom, “so easily dismiss the possibility of mind control and the destructive dimensions of the cults that they are, for the most part, observers from afar, wearing dark glasses” (p.34). Burtner concurred with the term “civil libertarians,” speaking of “rampant egalitarianism” as characterizing “those who will not take a critical stance about religious cults.” This practice, he said, “stems from a refusal to look at the facts” (1980, cassette tape A-1183, side 1 – hereafter t.3, s.1).

In addition to Enroth and Burtner, Shupe in two different writings (Shupe and Bromley, 1980; Shupe, Spielmann and Stigall, 1980), pointed the finger specifically at the American Civil Liberties Union (ACLU) as one group zealously attacking the ACM at the expense of an objective view of reality. Besides the various cult organizations themselves, Enroth (1977b) mentioned several other organizations apparently guilty of this tunnel vision tendency, including the Americans United for Separation of Church and State, the Alliance for the Preservation of Religious Liberty, the United Families Committee (organized by professors from the Toronto School of Theology), and the National Council of Churches.

Ross (1982) went even further by identifying a cluster of authors whose work he considers “gravely deficient” due to their biases “against the anti-cult movement,” which is heightened by their citing each other’s works favorably or uncritically, while overlooking or downplaying “the problems with cults that their opponents think vital” and “rarely featuring [the ACM’s] strongest arguments or putting its speakers in a favorable light” (pp. 1-2). Ross identified the leaders of this cluster of writers to be James T. Richardson, Dick Anthony, Thomas Robbins, Anson D. Shupe, Jr., and David Bromley. The results, Ross contended, are methodologically deficient research and biased editing in such professional journals as the American Behavioral Scientist, the Journal for the Scientific Study of Religion, and Sociological Analysis.

Naturally, these “intellectuals against the anti-cult movement” (Ross, 1982) claim that those in the ACM suffer from a similarly distorted sense of perception. The reason for their malady, the ACM critics claim, is the concept of “brainwashing” and its resulting counterpart, “deprogramming.”

In one particularly scathing attack, Robbins and Anthony (1978), who coined the phrase used in this section’s heading, stated:

The religious wars of the seventies have involved accusations that new religious movements brainwash their converts. They are alleged to be using mind control in seducing young persons from conventional familial processes and career plans so as to psychologically imprison them in communes and monasteries. (p.77)

Hargrove (1980) criticized the brainwashing concept as “the evil eye theory appropriate to a modern scientific culture” – bewitchment with psychological technology (p.22). She notes, “there is in this ‘evil eye’ theory no more place for rational decision-making or personal freedom of choice than could be found in the old theories of witchcraft, sorcery, and possession” (p.22). Shupe and Bromley (1980) likewise viewed brainwashing as a secular model metaphor quite comparable to the metaphor of possession in the religious model.

This comparison of the brainwashing view of cult conversion with witchcraft and demon possession has led ACM critics to warn of a mass hysterical overreaction to cults which is likened to the witch hunts in past centuries (Anthony, Robbins & McCarthy, 1980; Beckford, 1979, 1981; Hargrove, 1980; Levine, 1979; Robbins and Anthony, 1978; Shupe and Bromley, 1978, 1980; cf. Sargant, 1957). This overreaction is likely to be the most troublesome in the area of deprogramming.

Deprogramming is the second major focal point of these continuing religious wars. Shupe, Spielmann and Stigall (1980) reported,

The practice of deprogramming is unquestionably the single most publicized issue connected with the ACM. Some, including the American Civil Liberties Union, much of the press, and a number of sociologists, have erroneously identified the entire ACM with advocates of this one sensational tactic (p.46).

Following the possession metaphor reasoning, the solution, according to Hargrove (1980) and Shupe and Bromley (1978), would be simply to exorcise the evil influence – which they claim is the essence of deprogramming. Paradoxically, while all civil libertarians deny the genuine existence of brainwashing in cult conversion (e.g., Anthony, Robbins and McCarthy, 1980 and Thomas Szasz whom they quote; Galanter, Rabkin, Rabkin, and Deutsch, 1979; Gordon, 1977; Hargrove, 1980; Rice, 1976; Robbins and Anthony, 1978; Thomas, 1979), many of these same ACM critics (as well as some in the ACM) also have seen deprogramming as a form of “reverse-brainwashing” (e.g., Anthony, Robbins and McCarthy, 1980; Hargrove, 1980; LeMoult, 1978; Levine, 1979; Maleson, 1981; Pattison, 1976; Rice, 1976; Richardson, 1980; Robbins and Anthony, 1980; Sage, 1976; Shupe and Bromley, 1980; Stoner and Parke, 1977; Yamamoto, 1977).

The reason for this accusation is that the vast majority of those who see deprogramming as reverse-brainwashing have defined it in a rather narrow sense, restricting it to the more coercive techniques generally associated with brainwashing. For example, Levine (1979) used the term to involve, among other things: “coercion-utilizing subterfuge, false pretenses, or force to lure the unwilling cult member to a private location…; detention…; browbeating; constant input…; little or no letup” (p.600). An even more potentially damaging example of restricting the term to coercion (Shupe and Bromley, 1980, p.123 with p.201), however, has been the booklet Deprogramming: Documenting the Issue (APRL, 1977), produced under the joint sponsorship of the ACLU and the Toronto School of Theology, who combined to form the Alliance for the Preservation of Religious Liberty (APRL).

This booklet includes a reproduction of a pamphlet entitled “Deprogramming: The constructive destruction of belief; A manual of technique.” Alleged to be a deprogrammer’s “do-it-yourself” manual, it advocates many of the same techniques seen in classical brainwashing: starvation, sleep deprivation, shame inducement through nudity, physical and verbal abuse, sexual coercion, and the destruction of “holy works” (i.e., cult artifacts).

Burtner (1980), Enroth (1977b), Heller (1982), and MacCollam (1979), who separately reviewed this “deprogrammer’s manual,” all see it as a product of cult propaganda designed to discredit the work of deprogrammers. Enroth and MacCollam both reported that many of the alleged sponsors of the pamphlet, including the (Anglican) Church of England and the Evangelical Alliance, deny any connection with it, and Enroth has added, “have asked that their names be deleted” from it (p.20).

Burtner stated that “many of the APRL members happen to be Scientologists or cultists of different groups” (t.2, s.1). MacCollam echoed this by being even more specific: “The editor of this…is a theologian with dual credentials: he is a professor at the Toronto School of Theology and holds a similar academic position at the seminary of this country’s most vocal and powerful cult,” i.e., the Unification Church (p.123). Consequently, he doubted the “editor’s ability to remain objective and fair to all sides of the issue” (p.123).

Shupe and Bromley (1980) themselves restricted the use of the term deprogramming to the most coercive techniques (p.123). They nonetheless have admitted that the word does convey many different meanings to different people. Although some deprogrammers do utilize practices that approach the coerciveness of brainwashing or thought reform, some are more gentle, like the “reevaluation” of Rabbi Maurice Davis and Father Kent Burtner, which they admitted is “strongly endorsed” (p.84) by Stoner and Parke (1977; cf. pp.351-367). Shupe has confirmed this admission in a separate writing (Shupe, Spielmann and Stigall, 1980): “the term deprogramming carries, for ACM supporters, a range of meanings, from the publicized coercive extreme to a simple phone call” (p.46).

To further counter the allegation of violence in most deprogrammings, MacCollam (1979) has asserted that “the classical tools of brainwashing (lack of sleep, reduced caloric input, sensory bombardment, the inability to ask questions, limited access to toilet facilities and other means of personal degradation) are not used in responsible deprogramming, nor are they needed!” (pp.119-120). He has been supported in this counterassertion by psychologist Marvin Galper (in Aversa, 1976), by William West (1976) of the International Foundation for Individual Freedom, and by R.K. Heller (1982), who wrote his own deprogrammer’s manual. Furthermore, MacCollam admitted that some “bad” deprogrammings do exist, but was quick to add that they are “by far the exception rather than the rule” (pp.117-118), a claim which has been supported by Burtner (1980, t3, s.2), by Heller (1982, p.100), as well as by extensive research by psychologist Margaret Singer.

Out of approximately 100 persons who had taken part in Singer’s rehabilitation groups, many of whom “had seen deprogrammers…none in her groups cited experiences of the counter-brainwashing sort” (1979, p.75). Enroth has concurred: “In my own extensive contacts with ex-cult members who have gone through the deprogramming process, I have found no evidence to support allegations of ‘torture sessions’” (1977b, p.20).

MacCollam has suggested that one major “source of ‘bad’ deprogramming arises from the well intended efforts of some former cult members and other highly motivated individuals who lack either the psychological or theological expertise to accomplish any sort of positive repersonalization” (p.118). The lack of professional credentials of deprogrammers is a concern of several others (Enroth, 1977a; Hargrove, 1980; Levine, 1979, p.600; Shupe and Bromley, 1980, p.138; Stoner and Parke, 1977, pp.369, 427). MacCollam (1979) has added, however, that “the majority of deprogrammers are teams of qualified theologians and mental health professionals who willingly submit their work to peer evaluation” (p.119).

Some of the mental health professionals who do provide deprogramming or reevaluation services have spoken to the difference in goals between brainwashing and what they do. Psychologist Marvin Galper (in Aversa, 1976) has stated:

Brainwashing is to implant definite attitudes and beliefs into the person by creating stress and psychological pressure, while the purpose of deprogramming is to help the person regain the ability to make his own free choices. The therapist helps him to think for himself again. (p.1, Citizens Freedom Foundation reprint).

Psychiatrist Bijan Etemad (1978) has concurred: “the goal is to change the mind of cultists by helping them to think for themselves rather than depend on a leader” (p.222). “The examination of what the person already believes is the deprogrammer’s goal, rather than trying to force him to adopt a new belief” (West, 1976, p.75). Conway and Siegelman (1978) have seen the goal as not a narrowing of the mind as brainwashing is, but its enlightenment (p.69). Psychologist Kevin Gilmartin (in Sage, 1976) has seen it as “reality inducing therapy” (p.47).
The Central Issue

With brainwashing and deprogramming as the two major focal points of these religious wars, the central issue over which the ACM and the civil libertarians appear to be fighting is the age old controversy of free will versus determinism, particularly environmental or social determinism. In a recent review of the literature on this issue, Furlong (1981) commented that “during the past century various theories of mental functioning that rest on a deterministic view of man have gained dominance” (p.435). He continued, “some recent cult systems of psychotherapy (and religion) may be seen as successful partly because they remove individual choice and its attendant anxiety from the individual and subsume it under a group ethic” (p.439).

Thus, while Furlong has concluded that both free will and social determinism are involved in cult conversion, the critics of the ACM attack it because they believe the movement overemphasizes determinism. They claim it rallies around a banner of the passivity of cult members. They contend that the ACM holds cultists to be “unwilling victims of external agencies beyond their control” (Beckford, 1979, p.176), thus presenting “a view of humankind as incapable of decision-making or any exercise of the will” (Hargrove, 1980, p.24). The terms “cult,” “brainwashing,” “mind-control,” and “deprogramming” are, therefore, seen as extensions of this assumed ACM view of man as deterministic (e.g., Beckford, 1979; Hargrove, 1980; Robbins and Anthony, 1978, 1979).

Some in the ACM do hold to a deterministic view of man. For example, Paul Verdier, in his book Brainwashing and the cults: An expose on capturing the human mind (1977), declared that “Free will is a myth…We are all really at the mercy of whoever has the knowledge and the dedicated determination and facilities to subvert us to their will” (p.88).

On the other hand there are those in the civil libertarian camp who have made statements just as extreme in the opposite direction. A case in point is Thomas Szasz. He has been quoted as saying:

Brainwashing is a metaphor…A person can no more wash another’s brain with coercion or conversations than he can make him bleed with a cutting remark…However, we do not call all types of personal or psychological influence brainwashing. We reserve this term for influences of which we disapprove. (Anthony, Robbins and McCarthy, 1980, p.39 cf. Robbins and Anthony, 1978, p. 77)

The problem with extreme positions is that they generally provoke extreme reactions. With an ultra-deterministic view of cult conversion there is the danger of parental hysterical overreaction with the resulting potential of therapist vulnerability to unethical compromises. Yet, with an ultra-volitional view of cult conversion there is the danger of ignoring or overlooking its destructiveness, with the resulting therapy or referral outcome being ineffective at best. Therapeutic interventions are rarely effective, and sometimes even do more harm than good, when they are handicapped by inaccurate/deficient assessment due to a denial of facts, prefabricated explanations, a priori assumptions, or downright fear of involvement on the part of the therapist.
Armistice Via Balance

Anthony, Robbins, and McCarthy (1980) have spoken of “a continuum of psychiatric attitudes [which] can be identified regarding the viability of the brainwashing notion and its application to contemporary religious movements [cults]” (p.39). At one pole, or extreme end, they have placed Thomas Szasz with his assertion that “brainwashing is a metaphor.” At the other end they have placed “a number of psychiatrists and psychologists who affirm not only that mind control, brainwashing, and psychological kidnapping are meaningful and viable scientific concepts, but that such notions may easily be generalized from contexts involving tangible and overt intimidation (e.g., POW camps) to formally voluntary religious contexts” (p.39). They then gave psychologist Kevin Gilmartin as an example, asserting that his inferring the dysfunctional mental state or psychological impairment seen in cult-members “is totally and exclusively socially induced” (p.40).

This last assertion is debatable, simply because Anthony et al. also admitted that Gilmartin has stated that the convert “relinquished” and “acquiesces” his ego functioning to the cult (p.39). Gilmartin, then, cannot be said to have suggested that the convert has no free choice in this matter, but instead, that he freely chooses to give up his choice and free will. As Furlong (1981) in his discussion on free will versus social determinism has also suggested, both forces are relevant; both do appear to be present in cult conversion. This, then, is the first proposition of this paper.

Accordingly, the second proposition of this paper is that professional mental health attitudes toward cults (and brainwashing and deprogramming) do lie on a continuum as Anthony et al. have suggested, with the extreme poles being the ultra-free will and the ultra-social determinism banners. Szasz may be seen at one end with his “brainwashing is a metaphor.” However, it is not Gilmartin, but Verdier with his “free will is a myth” who lies at the other.

In their analysis of the ACM, Shupe and Bromley (1980) have spoken of “a range of metaphors” for cult conversion, but chose to dichotomize the options into the “possession” and “deception” metaphors (pp.60-61; see Figure 1). Thus, even though they chose to emphasize the possession metaphor more (with its “absolute influence” and connected terms of brainwashing and deprogramming), they allowed for the presence of less extreme positions within the ACM movement by including the deception metaphor (with its “indirect control through exploitation of human weakness and related reevaluation;” cf. their Table 3.1, p.61).

Free Will <--------------------------------------------------> Social Determinism

View of cult conversion:   “Normal”        “Deception”          “Possession”
                                           Metaphor             Metaphor
                           (Szasz)                              (Verdier)

View of deconversion:      Not needed      Voluntary, gradual   Forced, coercive
                                           exit counseling:     marathon
                                           reevaluation type    deprogramming

                           Civil Liber-    Anti-Cult Movement Views
                           tarian Views

Figure 1: A continuum of professional mental health attitudes regarding cult conversion and deconversion via deprogramming or reevaluation/exit counseling.

Just as their research showed deprogramming to have many meanings, so have they confirmed that those in the ACM have many different attitudes toward the meaning of both cult conversion/brainwashing and reevaluation/deprogramming.

Removing the dichotomy from their range of metaphors, therefore, presents a picture of ACM advocates lying on various points along the cult-attitude continuum, although the distribution may be naturally skewed toward the deterministic side. A visual representation of this continuum is proposed in Figure 1.
Difficulty In Defining Cult

In discussing “some theoretical and practical perspectives” of cults, Spero (1977) stated, “I think it is amply evident that defining cults – that is, beyond the descriptive level – is difficult at best” (p.331). This is undoubtedly because, as Basham (1979) has noted, “In our society then, what constitutes a cult depends upon the standards of the group by whom that judgment is made” (p.5).

As an evangelical Christian, Basham defined a cult by comparing it with the “genuine” conversion of evangelicals, using theological doctrinal criteria. Other evangelicals have done the same, e.g., Bjornstad (1978), Enroth (1979a, 1979b), Hunt (1980), Larson (1982). On the other hand, when the defining criteria were psychological or sociological, rather than theological, some evangelical Christians have themselves been labeled as cultish, thrown together into research with other groups called cults, or had their own members “deprogrammed” in attempts at promoting a renunciation of their newfound form of Christianity, e.g., Austin (1977), Buckley (1976), Conway and Siegelman (1978), Harder, Richardson and Simmonds (1972), Richardson, Stewart and Simmonds (1978). Two evangelicals who mention this problem are Basham (1979) and Bjornstad (1978).

Both those within the anti-cult movement and those who warn against it admit that history has shown many now well-established religions to have been previously given the label of cult – most frequently cited examples being Roman Catholicism, Judaism, and Mormonism (Blackwell, Note 1; Buckley and Galanter, 1979; Hargrove, 1980; Isser and Schwartz, 1980; Robbins and Anthony, 1978, 1979; Sargant, 1957; Schwartz and Isser, 1979).

Accordingly, Hopkins (1978) has differentiated a cult from a religion by “how well-recognized and how well-established” (p.19) the particular group is; and Conway and Siegelman (1978) have stated that “the line between a cult and a legitimate religion in America today…cannot be categorically drawn” (p.46). Jack Buckley (1976) has said it this way, “one man’s cult is another’s orthodoxy” (p.30).

Beckford (1979) has apparently agreed when he depicted “the distinction between ‘real’ and ‘cultic’ religion as being part of a wider ACM political strategy to preserve the view that cult members are unwilling victims of external agencies beyond their control” (p.176). For this reason, some writers have preferred to use other less provocative terms, such as “new religious movement” or “new religion,” e.g., Beckford (1979, 1981), Blackwell (Note 1), Glock and Bellah (1976), Levine and Salter (1976), Robbins and Anthony (1978), Shupe and Bromley (1980).
Toward a Descriptive Definition

Another proposition of this paper is that to properly deal with cults, it must be recognized, as Catholic Father Kent Burtner does, that “we’re dealing with something here that is really a psychological question, not a theological question. And we have to really focus on the psychology of how to help people who are in that state – the practical point of view” (1980, t.3.21). In his Th.M. thesis for Dallas Theological Seminary, evangelical James Roche, Jr. agreed: “the battle is not theological in nature, but psychological” (p.14).

While Burtner was speaking to the problem of helping those already in a cult and to the rehabilitation counseling for those who have already left, Roche spoke primarily to the “preventive counseling” of those vulnerable youth not yet caught up in a cult. He concluded, “At this point in the study, it has hopefully been proven that the main appeal of the cults is not theological, but emotional. The primary thrust of the cults in their proselytism is toward the emotional layer of an individual; sometimes long before any cognitive challenging is done” (p.44). This is echoed by secular mental health researchers Levine and Salter (1976) in their study of 106 members of nine fringe religious groups: “Clearly spiritual, transcendental, or mystical rationales for joining their particular cult were less frequently offered than the intrapsychic and the interpersonal, [in] 80%” (pp.413-414). Consequently, Roche’s primary recommendation for preventive counseling of vulnerable youth was “for the older Christian to first stabilize the young Christian psychologically in reference to their own felt need…The priority of meeting the felt need must be above the theology, although theology should be used in meeting that need if possible” (pp.50, 51).

Review of the literature.

The question then becomes: Can a cult be defined in such a way as to differentiate it from “legitimate” or orthodox religions without relying on theological dogma? Earlier, Spero (1977) was quoted as suggesting that this problem “is difficult at best,” that is, “beyond the descriptive level.” He then proceeded to describe several elements common to cults that might be used. These were predominantly elements that described the recruitment and conversion tactics used by these groups and the resulting changes which these tactics brought about in the individual recruit/convert.

Eight other recent sources specifically provided lists of major descriptive elements which usually characterize those groups labeled cults (Burtner, 1980; Carr, 1981; CFF Newsletter, February 1981; Haack, 1978; Levine, 1979; Schwartz and Kaslow, 1979; Singer, 1978; University Religious Council, Note 2). However, it should be clarified that these lists are of cult characteristics that are specifically addressed as “destructive” or “extremist.” For example, CFF’s list (1981) was that of the “Characteristics of a Destructive Cult” (p.11) and Enroth’s brainwashing referred to conversion tactics used by “the extremist cults.” Furthermore, Levine (1979), as well as the American Family Foundation (AFF) work by Clark, Langone, Schecter and Daly (1981) has stated that not all cults are necessarily destructive. Unfortunately, clarification of the distinction between cult and extremist cult is not made in the reviewed lists, and can, therefore, only be speculated upon.

The eight lists of cult characteristics may be broken down into two main groups, or clusters, of characteristics. Perhaps this analysis may provide a cult definition that allows for differentiation of cult from extremist cult, as well as from noncultic religions.

First, all eight pointed to the presence of an authoritarian leader (usually the founder of the group, usually still living, often claiming divine inspiration), who promotes (or demands) unquestioning, unconditional, total/absolute obedience, submission, and loyalty to him, his group, and his determined rigid, exclusive (i.e., salvation comes only from him) system of beliefs and interpersonal behavior requirements. A cult, therefore, is first very much an ultra-authoritarian closed system which is run by a leader with absolute authority who requires absolute submission from his followers.

Spero (1980) has quoted Maurice Friedman (1976, pp. 23-24) as distinguishing “between cults and mature religion in that cults are a ‘community of the like-minded,’ where symbiotic-like togetherness prevails, as opposed to the ‘community of otherness,’ where there is a genuine concern for others, grounded in mature object relationships” (pp.164-165). In cults, there is no autonomy of self, only conformity to others’ expectations. Ultra-authoritarianism, exclusivity, and closed system boundaries, therefore, are the earmarks of a cult.

The aftermath of the Korean War, the values vacuum left by the “liberating” Sixties, and the resulting upsurge of Eastern religious thought in the United States, however, brought the birth and growth of a particular type of cult even more extreme in its exclusivity. Ultra-authoritarianism became totalitarianism when the new extremist cults combined the exclusive closed system with the deceptive use of brainwashing tactics in the recruitment, conversion, and retention of members. Of the eight researchers who provide lists of cult characteristics, six note the presence of this cluster of features (Burtner; Carr; CFF; Schwartz and Kaslow; Spero; and University Religious Council).

Together these six allege that such tactics actively promote the severing of ties with family and the outside world by emphasizing the isolation and total environmental control of the convert by the group (cf. Ash, 1983, for many other supporting references).

One specific, very common minor cult characteristic pointed out by all but two of the above sources (exceptions not mentioning it being Levine and Haack) was that of exploitation of members via personal sacrifice of possessions, money, and time in demeaning and/or physically debilitating work. The most frequent example given was that of fund-raising activities, which also served as a means of deceptively exploiting nonmembers.

Schwartz and Kaslow (1979) have, therefore, differentiated (extremist) cults from other “close knit, ethnocentric religious groups such as the Mormons, Amish and Orthodox Jews” by certain “key fundamental differences [which] include that the family as a whole is encouraged to be active in the church and to be concerned for one another, that the deprivation of adequate nutrition, sleep and health care are not sanctioned, that one is expected to have personal belongings and a family place of residence, and that privacy is accorded to all” (p.20).

Of the eight who have listed cult characteristics, five of them (Burtner; Carr; CFF; Schwartz and Kaslow; and Singer) have pointed out (in their lists or elsewhere) the key role which deception plays in the recruitment and conversion tactics of these cultic groups. They are joined in their assertion by many others: Basham (1979); Clark (1979b); Clark, Langone, Schecter and Daly (1981); Conway and Siegelman (1978); Elkins (1980); Enroth (1977a, 1979); Gillespie (1980); Hultquist (1977); Hunt (1980); Isser and Schwartz (1980); Levine (1980); MacCollam (1979); Roche (1979); Rudin and Rudin (1980); Spero (1977); Swope (1980); Thomas (1979). It is appropriate to question, therefore, why Shupe and Bromley (1980) were so persistent in their use of the metaphor of “possession” even though “deception” is referred to much more frequently.

Spero (1977) commented pointedly regarding the practice of deception:

It is the manner by which such a way of life is cultivated and reinforced, however, which brings to the fore the more disagreeable aspects of cults. In a sense, one might cite the lack of candidates’ awareness of these occlusive methods of indoctrination – lack of awareness specifically promoted by cult leaders – as one characteristic which differentiates cult-type “religious” commitment from true religious belief. And though there are blind believers in most authentic religions, such a belief is neither the most desirable level of commitment, nor is it purposely reinforced. (p.331-332).

Schwartz and Zemel (1980) have declared that these cult practices should not be compared to “entering a Catholic religious order or a college fraternity” because in those cases “a novice enters an order willingly and fully informed, not only as to the beliefs of the order, but as to what she might expect in the way of practices” (p.306).

Speaking from a more egalitarian point of view, Beckford (1979) has nonetheless quite perceptively pointed out that “the anti-cult case rests heavily on claims that recruits are deliberately deceived about the cults’ real aims, that psychological impairment follows from participation, that physical suffering is caused by some cult practices or ways of life, and that members are unfairly deprived of material possessions” (p. 176). The context in which he is speaking is that of the alleged ACM view of cult members as “unwilling victims of external agencies beyond their control” (p. 176). Thus, he has confused the metaphors of deception and possession by using the former to attack the latter. Nonetheless, he earlier put this issue into a less political perspective as follows:

The most salient boundary for psychological and sociological studies of cult-members is that which divides the autonomous, free agent from the victim of controlling milieu…The balance may be struck in various ways, and the existence of an ill-defined gray area between the two extremes is recognized. But it is still thought useful to discuss cult-members in terms of such contrastive characteristics as autonomous/controlled, open/closed mind, moderate/fanatical, centered/non-centered life. (p. 175)

In summary, this review has pointed out two major characteristics of an extremist cult:
1. An ultra-authoritarian, even totalitarian, closed system which is run by a leader with absolute authority who requires absolute submission from his followers; and
2. The deceptive utilization of brainwashing or thought reform tactics in the recruitment, conversion, and retention of members.

While nonextremist cults may potentially be differentiated from noncultic religions by their ultra-authoritarian closed systems which encourage the severing of family ties and promote deep emotional dependency upon the group (or its leaders), extremist cults have the additional distinctive of the deceptive use of brainwashing tactics. Thus, ultra-authoritarianism becomes totalitarianism, exclusivity becomes even more extreme as closed system boundaries grow even more rigid, and emotional dependency deepens to a frozen dissociative state in which independent critical thinking almost totally vanishes in most group members. In contrast, more mature religions would openly promote critical analysis and independent thinking, both before and after conversion.

Differential definitive criteria.

Following the extremist cult characteristics outlined by the above analysis of the literature (which is supported by a more extensive review of the literature on cult conversion elsewhere – Ash, 1983), a spectrum of definitive criteria for an extremist cult is proposed in order to differentiate cultic from noncultic groups. The intended purpose of such a proposal is to allow for both differentiation and a spectrum of variation of groups. The differentiation is provided first, to be followed by a clarification of its continuum of cultic variation.

Using a format similar to that used by the Diagnostic and Statistical Manual-III of the American Psychiatric Association, any group may be definitively identified as an extremist cult if it demonstrates itself to possess both of the following characteristics:

A. An ultra-authoritarian (even totalitarian) closed system as evidenced by:

1. An authoritarian, dictatorial leader(ship) manifested by two of the following three features:
a. A charismatic leader(ship), usually the living founder, who claims to have direct contact with deity/the supernatural and/or faultless understanding of the divine will (as seen in interpretation of holy scripture, in prophecy, or in receiving revelation directly from deity, or even actually claiming to be deity);
b. Presumption of absolute authority over doctrine and faultless understanding of what is truth;
c. Presumption of the role of sole judge of members’ behavior (morality and daily living habits);

2. At least a majority of the individual members evidencing childlike, deep emotional dependency upon (cf. indecisiveness), and passive uncritical receptivity and unquestioning obedience to, the group’s leadership, and at least three of the following five group practices promoting such compliance and conformity:
a. Prohibition of questioning or discussion, and of critical analysis and independent thinking;
b. Exaction of strict adherence to a rigid code of ethics, often with extreme curtailment of interpersonal relationships, especially with the opposite sex;
c. Totalistic control over members’ daily lives, especially if a majority of their time is spent in fund-raising, recruiting, or demeaning and/or physically debilitating work;
d. Exploitation of members’ (and/or their families’) finances and possessions;
e. Existence of a double standard of ethics, working conditions or requirements, and/or style of living between the leadership and members lower in the hierarchy;

3. An exclusive, closed “family” system (with rigidity of boundaries) manifested by:
a. Selective group reinforcement and punishment, i.e., the systematic application of behavioral conditioning techniques (deliberate or otherwise) using the rewards and punishments of peer, or authoritarian, pressure to promote compliance (with closed system practices and doctrines); and
b. The closed system being seen in at least three of the following features:
(1) Exclusivity of doctrinal truth and/or salvation, i.e., “only we” can provide it, and leaving the group means losing it;
(2) Most, if not all, dogma presented in absolute, black and white terms;
(3) A pseudo-paranoid, Manichean (“us” versus “them”) critical view of the world and family outside the group;
(4) Contact with family or extra-group individuals strictly limited or controlled;
(5) Individual members rarely left alone, unsupervised by other group members (either peers or those in authority).

B. The deceptive utilization of brainwashing (or ultra-hypnotic or thought reform) tactics or methods for the induction (deliberate or otherwise), and probable continued maintenance (cf. the above behavioral conditioning aiding this as well), of a dissociative state via both of the following:

1. Isolation (at least during the induction/conversion process) of the individual recruit or member from his ordinary frame of reference (familiar to unfamiliar persons, surroundings, activities, dress, etc.); and

2. The utilization of at least three of the following five practices or tactics:
a. Information control, overload, and/or manipulation (e.g., the deliberate withholding of information and utilization of deceptive “salesmanship” techniques designed for recruit enticement; forced listening to a constant barrage of contradictory messages; or provision of a new cult-specific/esoteric language);
b. Emotional overstimulation and/or manipulation (e.g., “love bombing,” group confessions, or the deliberate playing on members’ feelings of fear, guilt, or shame);
c. Physical debilitation via sleep deprivation, diet manipulation, and/or fatigue from constant activity;
d. Continuous utilization of “not thinking” practices such as chanting, Eastern religious (“mind emptying”) types of meditation, or speaking in “tongues”;
e. Religious mystical ritual (especially “deliverance” ceremonies or initiation rites with the assignment of a new name and identity).

Cult continuum.

While differential criteria of definition were proposed with the express purpose of establishing a line of demarcation between cults and noncultic religious (or psychotherapeutic) groups, a spectrum of variation exists on both sides of this line.

Taking the noncultic side of the line, certain well-accepted institutions may be seen to utilize or promote cultic practices or closed system emphases. For example, Bussell, in a recent Christianity Today article (1982), points to “five similarities between cults and evangelical churches”; Dallas Theological Seminary professor Litfin (1977) has warned budding evangelical preachers of “The Perils of Persuasive Preaching,” i.e., that which utilizes hypnotic/brainwashing techniques; and Calvin College professor Houskamp, in his Ph.D. dissertation (1976), demonstrated the application of “resocialization” techniques in a psychiatric hospital that contained many of the same elements that exist in classical brainwashing.

Houskamp suggested that all groups that utilize psychological resocialization tactics (whether religions promoting conversion or psychotherapies promoting “mental health”) may be placed on a continuum according to what degree they utilize such tactics. Likewise, in his survey of 668 ex-cultists and relatives, Blackwell (Note 1) concluded that, “Any religion, ‘old’ or ‘new’ that fits the above criteria of thought reform tactics and their results, to the degree that it fits, is to that degree destructive” (p. 21). Therefore, the fourth proposition of this paper is that all cults may be placed on a continuum of destructiveness according to what degree they are a closed totalitarian system that utilizes brainwashing or thought reform tactics to acquire and maintain control over their followers. Furthermore, this would apply to all groups, whether they be religious or psychological in nature (Levine, 1979, p. 593; cf. EST: Erhard Seminars Training as utilizing cultic dissociation-inducing practices – Brewer, 1975; Garvey in Orlean, 1983, p. 2; Glass, Kirsch and Parris, 1977; Kirsch and Glass, 1977).

Just as there exists a continuum in mental health professionals’ attitudes toward the meaning of brainwashing (this paper’s proposition #2), the probability appears high that the actual utilization of brainwashing tactics by various groups also varies along a continuum. Likewise, the more extremist a particular group is toward the deterministic, or closed system, end of the continuum, the more destructive it becomes (cf. Blackwell, Note 1).

Movement by a group along the religious group continuum, whether or not a cult, is also quite possible. Therefore, those noncultic groups which begin to utilize more cultic practices (e.g., guilt-inducing sermons, exploitation, or deception) and promote cultic doctrine (e.g., unusually critical view of non-group members, exclusivity and extreme absolutism) are in danger of turning into cults, especially when they are under the guidance of a charismatic and authoritarian leader.

Likewise, it would be conceivable that cults may cross the line to noncult status through the death of a leader or significant changes in the group’s philosophy, should these changes result in a decrease in cultic practices or rigidity of system boundaries in practice or doctrine. However, the mere death of the founding leader of the group does not necessarily denote such a shift in the cult structure, an apparent recent example being A.C. Bhaktivedanta Swami Prabhupada and the Hare Krishna.

The fifth proposition of this paper flows naturally from the fourth; that is, involvement in an extremist cult, according to the descriptive definition utilized above, does induce psychological impairment in its followers. This occurs because the cult’s destructiveness lies in its destruction of the ego (and ego boundaries) of the convert in order to subject it to the control of the group (cf. Burtner, 1980; Carr, 1981; Enroth, 1977a; Gilmartin in Sage, 1976; Lasch, 1979; MacCollam, 1979; Merrit in “Experts say…,” 1981; Rose, 1979; Schwartz and Isser, 1979; Schwartz and Kaslow, 1979, 1981; Shapiro, 1977; Singer, 1979; Spero, 1977, 1980, 1982; Stoner and Parke, 1977; University Religious Council, Note 2). Therefore, the etiology of the ex-cultists’ clinical picture is assumed to be rooted in the extremist cult conversion process itself.

Earlier, Beckford (1979) was quoted as saying “the anti-cult case rests heavily on claims that recruits are deliberately deceived about the cults’ real aims; that psychological impairment follows participation…” (p. 176). Although he was being somewhat skeptical at this point about the reality of such impairment, he later pointed to a boundary between “marginal” and “adaptive” movements, i.e., cultic vs. noncultic, which “has been drawn on the basis of whether or not cults help members and ex-members to achieve reintegration into mainstream culture and society” (p. 177). Although further evidence has been demonstrated elsewhere (Ash, 1983) for the psychological destructiveness of cults, it is sufficient to say that the sheer necessity for rehabilitation counseling of ex-cultists alone (cf. Bjornstad, 1978; Blackwell, Note 1; Burtner, 1980; Carr, 1981; Clark, 1978; Clark, et al., 1981; Conway and Siegelman, 1978; Enroth, 1979a; Etemad, 1978; Levine, 1979; MacCollam, 1979; Schwartz and Kaslow, 1979, 1981; Shapiro, 1977; Singer, 1978, 1979; Stoner and Parke, 1977) would appear to confirm the maladaptiveness and corresponding inducement of psychological impairment by these groups called cults.
Summary of Propositions

The five major propositions of this paper are:
1. Both free will and social/environmental determinism are present in cult conversion, retention of members, and withdrawal from a cult.
2. Professional mental health attitudes toward cults may be seen to lie on a continuum. This has been depicted in Figure 1. A comparison of propositions #1 and #2 suggests, therefore, that the most balanced view of cult conversion would utilize the metaphor of deception rather than possession, and the most balanced approach to cult withdrawal would be a freely chosen rational reevaluation of life in the cult, rather than forced participation in a coercive form of deprogramming.
3. The issue is primarily psychological, not theological, in regard to extremist cult conversion, retention of members, and withdrawal from a cult. A descriptive definition of an extremist cult, then, would include these two major characteristics (with the suggestion that nonextremist cults may still be differentiated from noncultic religions by their having the first of these):
a. An ultra-authoritarian closed system which is run by a leader with absolute authority who requires absolute submission from his followers; and
b. The utilization of deceptive brainwashing or thought reform tactics in the recruitment, conversion, and retention of members.
4. Cults may be seen to lie on a continuum of destructiveness according to the degree to which they utilize brainwashing/thought reform tactics and the degree to which their system is closed and totalitarian. Furthermore, groups not defined as extremist cults by these definitive criteria may, nonetheless, be seen to be destructive to the degree they utilize these cultic practices or promote cultic closed system emphases.
5. Psychological impairment follows involvement in an extremist cult, necessitating ex-cultist counseling according to the degree of the cult’s destructiveness. Furthermore, the cult conversion process itself appears to be the primary agent in inducing this psychopathology, while the cultic practices and closed system emphases are seen as maintaining it.
Reference Notes
1. Blackwell, J. 20th century new religions: Help or hindrance – A guide for mainline denominations. Unpublished survey summary, Ohio Conference, United Church of Christ, 1980.
2. University Religious Council. Understanding cult involvement. Unpublished pamphlet, University of California, Berkeley, undated.
References

Anthony, D., Robbins, T., & McCarthy, J. Legitimating repression. Society, March/April 1980, pp. 39-42.

APRL (Alliance for the Preservation of Religious Liberty). Deprogramming: Documenting the issue. New York: ACLU (American Civil Liberties Union), 1977.

Ash, S.M. Cult induced psychopathology: A critical review of presuppositions, conversion, clinical picture, and treatment. Unpublished doctoral dissertation, Rosemead School of Psychology, Biola University, 1983.

Austin, R. L. Empirical adequacy of Lofland’s conversion model. Review of Religious Research, 1977, 18(3), 282-287.

Aversa, R. Psychologist deals with cultic “brainwash.” Los Angeles Herald Examiner, September 11, 1976 (reprinted by CFF, Chula Vista, CA).

Basham, D. Cults: Dungeons of deception. New Wine, May 1979, pp. 5-9.

Beahrs, J.O. Unity and multiplicity: Multilevel consciousness of self in hypnosis, psychiatric disorder and mental health. New York: Brunner/Mazel, 1982.

Beckford, J.A. Politics and the anti-cult movement. Annual Review of the Social Sciences of Religion, 1979, 3, 169-190.

Beckford, J.A. A typology of family responses to a new religious movement. Marriage and Family Review, 1981, 4(3/4), 41-55.

Bjornstad, J. The deprogramming rehabilitation of modern cult members. Journal of Pastoral Practice, 1978, 2(1), 113-127.

Brewer, M. Erhard Seminars Training: We’re gonna tear you down and put you back together. Psychology Today, August 1975, pp. 35-36, 82; 88-89.

Buckley, J. The doubtful ethics of deprogramming. Eternity, April 1976, p. 30.

Buckley, P., & Galanter, M. Mystical experience, spiritual knowledge, and contemporary ecstatic religion. British Journal of Medical Psychology, 1979, 52(Pt. 3), 281-289.

Burtner, W. K. Coping with cults: How they work in America. Kansas City, MO: National Catholic Reporter Publishing Co., 1980. Cassette tapes [set of 3].

Bussell, H. Beware of cults with their evangelical trappings. Christianity Today, March 5, 1982, pp. 42-43.

Carr, P. Cult involvement: Assessing precipitating psychosocial and environmental variables. Unpublished masters thesis, St. Cloud State University, St. Cloud, MN, 1981.

CFF (Citizens Freedom Foundation). CFF news, February 1, 1981, p. 11.

Clark, J.G. Cults. Journal of the American Medical Association, 1979, 242(3), 279-281.

Clark, J.G. The manipulation of madness. In M. Muller-Kupers & F. Specht (Eds.), Neue Jugendreligionen. Gottingen, 1979, 85-105.

Clark, J.G., Langone, M.D., Schecter, R.E., and Daly, R.C. Destructive cult conversion: Theory, research, and treatment. Weston, MA.: American Family Foundation, 1981.

Conway, F., & Siegelman, J. Snapping: America’s epidemic of sudden personality change. Philadelphia: Lippincott, 1978.

Cults…another opinion. Psychiatric Nursing, March-April 1980, p. 10.

Elkins, C. Heavenly deception. Wheaton, IL.: Tyndale House, 1980.

Enroth, R. Youth, brainwashing, and the extremist cults. Grand Rapids: Zondervan, 1977.

Enroth, R. M. Cult-Countercult: Is either side fighting fair? Eternity, November 1977, pp. 18-22; 32-35.

Enroth, R. The lure of the cults. Chappaqua, N.Y.: Christian Herald Books, 1979.

Enroth, R. M. The power abusers. Eternity, October 1979, pp. 23-27.

Etemad, B. Extrication from cultism. Current Psychiatric Therapies, 1978, 18, 217-223.

Experts say education best way to limit cult influence. CFF News, May 1, 1981, pp. 3-4.

Friedman, M. Aiming at the self: A paradox of encounter and the human potential movement. Journal of Humanistic Psychology, 1976, 16(2), 5-34.

Furlong, F. W. Determination and free will: Review of the literature. American Journal of Psychiatry, 1981, 138(4), 435-439.

Galanter, M., Rabkin, R., Rabkin, J., & Deutsch, A. The Moonies: A psychological study of conversion and membership in a contemporary religious sect. American Journal of Psychiatry, 1979, 136(2), 165-170.

Gillespie, T. To the moon and back…An interview with an ex-Moonie. Psychiatric Nursing, March-April 1980, pp. 6-9.

Glass, L. L., Kirsch, M.A., & Parris, F. N. Psychiatric disturbances associated with Erhard Seminars Training: I. A report of cases. American Journal of Psychiatry, 1977, 134(3), 245-247.

Glock, C. Y., & Bellah, R. N. The new religious consciousness. Los Angeles: University of California Press, 1976.

Gordon, J. S. The kids and the cults. Children Today, 1977, 6(4), 24-27; 36.

Haack, F. W. New youth religions, psychomutation and technological civilization. International Review of Missions, 1978, 67(268), 436-447.

Harder, M. W., Richardson, J. T., & Simmonds, R. B. Jesus people. Psychology Today, December 1972, pp.45-50; 110; 112-113.

Hargrove, B. Evil eyes and religious choices. Society, March/April 1980, pp. 20-24.

Harrison, B. G. The struggle for Wendy Helander. McCall’s, October 1979, pp. 87-94.

Heller, R. Deprogramming for do-it-yourselfers: A cure for the common cult. Medina, OH.: The Gentle Press, 1981.

Hopkins, R. P. The hospital viewpoint: Mental illness or social maladjustment? Journal of the National Association of Private Psychiatric Hospitals, 1978, 9(4), 19-21.

Houskamp, R. E. Social construction of reality and the process of resocialization: A comparative analysis. Unpublished doctoral dissertation, Tulane University, 1976.

Hultquist, L. They followed the piper. Plainfield, N.J.: Logos International, 1977.

Hunt, D. The cult explosion: An expose of today’s cults and why they prosper. Irvine, CA.: Harvest House, 1980.

Isser, N., & Schwartz, L. L. Community responses to the proselytization of Jews. Journal of Jewish Communal Service, 1980, 57(1), 63-72.

Kaslow, F., & Sussman, M.B. (Eds.) Cults and the family, New York: Haworth Press, 1982.

Kirsch, M. A., & Glass, L. L. Psychiatric disturbances associated with Erhard Seminars Training: II. Additional cases and theoretical considerations. American Journal of Psychiatry, 1977, 134(11), 1254-1258.

Larson, B. Larson’s book of cults. Wheaton, IL: Tyndale House, 1982.

Lasch, C. The culture of narcissism: American life in an age of diminishing expectations. New York: Norton, 1979.

Levine, E. M. Deprogramming without tears. Society, March/April 1980, pp. 34-38.

Levine, S. V. Role of psychiatry in the phenomenon of cults. Canadian Journal of Psychiatry, 1979, 24(7), 593-603.

Levine, S. V., & Salter, N. E. Youth and contemporary religious movements: Psychosocial findings. Canadian Psychiatric Association Journal, 1976, 21(6), 411-420.

Levitt, K. (with C. Rosen). Kidnapped for my faith. Van Nuys, CA.: Bible Voice, 1978.

Lemoult, J. E. Deprogramming members of religious sects. Fordham Law Review, 1978, 46, 599-640.

Litfin, A. D. The perils of persuasive preaching. Christianity Today, February 4, 1977, pp. 14-17.

MacCollam, J. A. Carnival of souls: Religious cults and young people. New York: Seabury Press, 1979.

Maleson, F. G. Dilemmas in the evaluation and management of religious cultists. American Journal of Psychiatry, 1981, 138(7), 925-929.

Orlean, S. C. EST puts itself in charge of tomorrow. Boston Phoenix, January 18, 1983. (CFF [Citizens Freedom Foundation] News, April, 1983, pp. 1-6)

Pattison, E. M. What about brainwashing, conversion and deprogramming? Eternity, April 1976, p. 27.

Rice, R. Messiah from Korea: Honor thy father Moon. Psychology Today, January 1976, pp. 36; 39-40, 42; 45; 47.

Richardson, J. T., Steward, M. W., & Simmonds, R. B. Conversion to fundamentalism. Society, May/June 1978, pp. 46-52.

Richardson, J. T. Brainwashing. Society, March/April 1980, p. 43.

Robbins, T., & Anthony, D. New Religions, families, and brainwashing. Society, May/June 1978, pp. 77-83.

Robbins, T., & Anthony, D. Cults, brainwashing, and countersubversion. Annals of the American Academy of Political and Social Science, November 1979, pp. 78-90.

Roche, J. Jr. Preventive counseling proposals concerning the proselytizing techniques of major cults on young Christians. Unpublished masters thesis, Dallas Theological Seminary, Dallas, TX, 1979.

Rose, S. Jesus v. Jim Jones. New York: Pilgrim, 1979.

Ross, J. C. Commentary: The intellectuals against the anti-cult movement. CFF [Citizens Freedom Foundation] News, November, 1982, pp. 1-3.

Sage, W. The war on the cults. Human Behavior, October, 1976, pp. 40-49.

Sargant, W. Battle for the mind: A physiology of conversion and brainwashing. New York: Harper & Row, 1957.

Schwartz, L. L., & Isser, N. Psychohistorical perceptions of involuntary conversion. Adolescence, 1979, 14(54), 351-360.

Schwartz, L. L., & Kaslow, F. W. Religious cults, the individual and the family. Journal of Marital and Family Therapy, 1979, 5(2), 16-26.

Schwartz, L. L., & Zemel, J. L. Religious cults: Family concerns and the law. Journal of Marital and Family Therapy, 1980, 6(7), 301-308.

Shapiro, E. Destructive cultism. American Family Physician, 1977, 15(2), 80-83.

Shupe, A. D., & Bromley, D. G. Witches, moonies, and evil. Society, May/June 1978, pp. 75-76.

Shupe, A. D., & Bromley, D. G. The new vigilantes: Deprogrammers, anti-cultists, and the new religions. Beverly Hills, CA.: Sage Publications, 1980.

Shupe, A. D., Spielman, R., & Stigall, S. Cults of anticultism. Society, March/April 1980, pp. 43-46.

Singer, M. T. Therapy with ex-cult members. Journal of the National Association of Private Psychiatric Hospitals, 1978, 9(4), 14-18.

Singer, M. T. Coming out of the cults, Psychology Today, January 1979, pp. 72; 75-76; 79-80; 82.

Spero, M. H. Cults: Some theoretical and practical perspectives. Journal of Jewish Communal Service, 1977, 53(4), 330-338.

Spero, M. H. The stimulus value of religion to cultic penitent personality types. Journal of Psychology and Judaism, 1980, 4(3), 161-170.

Spero, M. H. Psychotherapeutic processes with religious cult devotees. Journal of Nervous and Mental Disease, 1982, 170(6), 332-344.

Stoner, C., & Parke, J. A. All God’s children: The cult experience – salvation or slavery? Radnor, PA: Chilton, 1977.

Swope, C. W. Kids and cults: Who joins and why. Media & Methods, May/June 1980, pp. 18-21; 49.

Thomas, P. Targets of the cults. Human Behavior, 1979, 8, 58-59.

Verdier, P. A. Brainwashing and the cults: An expose on capturing the human mind. No. Hollywood, CA: Wilshire, 1977.

West, W. P. I know deprogramming works. Eternity, September, 1976, pp. 75-76.

Yamamoto, J. I. The puppet master: An inquiry into Sun Myung Moon and the Unification Church. Downers Grove, IL.: Intervarsity, 1977.

Zerin, M. Untitled book review of Cults and the family (Kaslow & Sussman, 1982). Cultic Studies Newsletter, 1982, 1(1), 7-9.

Stephen M. Ash, Psy.D. is a licensed professional counselor in private practice in Dallas, Texas. In 1983 Dr. Ash completed a dissertation on cults, Cult Induced Psychopathology: A Critical Review of Presuppositions, Conversion, Clinical Picture, and Treatment, for the Rosemead School of Psychology, Biola University.