Roderick Dubrow-Marshall, Ph.D.
University of Central Lancashire
Influence amongst human beings is ubiquitous: It is literally everywhere—in the words we use, the things we see, even the air we breathe. Some of this influence is profoundly beneficial—the influence of education, of parents, or of loved ones. Other forms of influence can be ethically and morally questionable. At the other end of this “continuum” are types of influence that are terribly harmful: The dead of the Tokyo, Madrid, and London transport systems are vivid casualties of undue influence at its extreme. Empirical evidence will be presented in this paper for the mindset that is established in extremist groups, with implications for how more benign forms of influence can be better protected, monitored, and promoted.
This lecture attempts to set out the continuum of influences from the good, to the dubious, and to the clearly harmful. In doing this, the blurring of such distinctions will become obvious, and an appeal will be made for reason in our analyses and responses. While I have dedicated much of my academic work over the past decade to helping those who have suffered at the hands of abusive groups or individuals, I simultaneously cling to and champion the Enlightenment ethic of scientific reason over superficial prejudice. To quote the late president of the International Cultic Studies Association (formerly the American Family Foundation), Herb Rosedale, Esq., “Judgements should rest on careful analyses of structure and behaviour within a specific context, rather than superficial classification.”
The task in this lecture then is to present this range of influences and to present research that has elucidated some of what happens when influence is decidedly harmful. My work therefore has tended to focus on what are commonly referred to as sects or cults—terms that I should admit at the outset are not value neutral. In fact, I am committed to a tradition within social psychology, that of social constructionism and discursive psychology, which holds that all words mean something, embody something, and do something. In this case, the words are the embodiment of another term, that of undue influence (cf. Billig, 1986; Edwards & Potter, 1992). Some may label a group as a cult or sect because it has arguably done something injurious to some people, and because people have seen loved ones harmed, or have even lost family members altogether, as a consequence of the group’s actions or inactions. And they want to do something in response to that. I want to do something about that, too! So while I generally eschew such superficial labels, I also recognise the importance of using a shorthand term to get us started on the road of communication—a short and simple word for what is a complex category of groups and behaviours. So I will use the word cult with all those caveats.
I am often asked—and you can ask me later—What is a cult? or Is this group or that group a cult? I tend away from such superficial labels of groups or individuals, which can do more harm than good to all involved. Sociologists in this field may define a cult or sect as a group on the fringe of society, ideologically driven and occupying a polarized belief position; this group may be a new religion or new expression of some other kind of ideology, such as politics. A cult also typically may be hierarchically organised, with a powerful leader figure or “guru” who is often the perfect embodiment of the belief system, and sometimes even is self-defined as a god figure.
Sometimes an illustration or example best defines this phenomenon, and here I will use the following quote:
When you meet the friendliest people you have ever known, who introduce you to the most loving group of people you’ve ever encountered, and you find the leader to be the most inspired, caring and compassionate and understanding person you’ve ever met, and then you learn that the cause of the group is something you never dared hope could be accomplished, and all of this sounds too good to be true—it probably is too good to be true! Don’t give up your education, your hopes, and ambitions to follow a rainbow.
This is a quote from Jeannie Mills, an ex-member of the Peoples Temple—she was later found murdered (this quote is also included in Singer and Lalich’s seminal book, Cults in Our Midst). This is an example of a dangerous and extremist cult that, under the leadership of Jim Jones, relocated to Guyana and set up the settlement that became known as Jonestown—a group with a curious mix of religious and political ideology at its core.
So this quote painfully exemplifies how, as a psychologist and social psychologist, I am much more concerned with what happens psychologically to and in the person in such group settings. I am very concerned also with whether a group appears to cause harm to individuals as a consequence of the way the group operates (as well as the harm the group can cause to people outside). One could say these are the psychological characteristics of cults or sects, as opposed to the sociological definition of a cult or sect.
This focus is perhaps less newsworthy than that of calling “x” group a dangerous cult that should be banned or broken up. But I am clear that dangerous behaviour by any group or individual should be banned or stopped in whatever context. There should be no religious or political exemption for such harmful activity. Indeed, the most recent atrocities by extremist terrorist groups, whether in Tokyo, New York, Pennsylvania, Madrid, London, or Mumbai, show that religious zealotry can be no cloak of protection as the bullets fly and bombs explode. Throughout history—and these more recent terrorist attacks are part of a much longer history of such violence—extremist religious or political ideology is often cited as the underlying reason or justification for such inhuman attacks. Psychologically, how can it be that apparently loving family members can be so quickly transformed into apparently murderous psychopaths? What is going on? These are questions, I know, that concern many policy makers, law enforcers and lawyers, and psychologists, and that actually concern all of us as we look at the aftermath of the carnage and murders in India.
My thesis is that to help explain the most extreme examples, we must look at the psychological processes of influence across a variety of settings to better understand influence overall as a phenomenon. I am concerned with the effect of influence on members of extremist groups, both to understand how we can best intervene to reduce undue influence and harm, and because I believe this is the best way to prevent extremist groups or cults from pointing their psychopathology outward toward the rest of the world. My approach uses as a starting point that which is perhaps a little akin to a health and safety inspectorate: Is there harm taking place, to whom, and why—in terms of the process, cause, and effects; and, if so, what can we do to reduce it or stop it?
If any organisation fails to act in response to such an intervention, then it should face the consequences in legal terms. So my starting point, in what is clearly a form of “action research,” is one of protecting the human rights of our citizens who are members of groups or organisations that have typically fallen outside the reach of the health and safety inspectors or the employment tribunals, so that these members don’t do harm to themselves or to others. These are groups that the law has, to varying degrees, ignored or held on a long leash, and they include churches, new religions, political groups, psychotherapy groups, training programmes, new-age groups, and pyramid selling schemes. A key aim is to answer the question “How do we stop suicide bombers or terrorists from developing within the roots of these types of group dynamics?” as opposed to just cutting off the abominable head of extremism once it has flowered and already done its horrific damage.
But first let’s take a step back and examine how influence transcends definitions of groups or types of groups, and is all around us. For this is how we can, I believe, come to better understand how “undue influence” in groups that we might call sects, or cults, or simply extremist, is distinct psychologically, both in manner and form.
Without influence, it can be said that we wouldn’t exist as we appear to exist. From the earliest moment of our existence, we are influenced—through school, university, relationships, and of course by the media. This nexus of relationships defines us—from an interpellation that we are a person with a name, through the notion, to quote R. D. Laing, that we are “skin-encapsulated egos.” Much of this influence is benign and beneficial—it’s why we are all here, in fact. Some of the influence I have mentioned is perhaps less clear cut, as are other forms of propaganda, including advertising. Then we have the examples that are perhaps even more questionable: the groups that are commonly labelled as cults, such as that of Sai Baba, the self-proclaimed prophet whose child abuse is well documented, as is his organisation’s intrusion into the UK school system. And there is Tom Cruise, who is in a group that some believe is a cult—namely, the Church of Scientology, an organisation that regularly falls back on law when it is negatively labelled. So let me say this clearly: Scientology is not a dangerous cult that causes harm to some of its members and that breaks up families!
I would like to focus on a chart from a wonderful book that sadly is out of print, written by one of the past century’s leading psychologists on cults, the late Margaret Thaler Singer, and a good colleague of mine in the ICSA community, Janja Lalich. The book, Cults in Our Midst (1995), is a straightforward but enlightening account of how cults operate all around us and in a variety of different ways.
The chart sets out a Continuum of Influence and Persuasion, which moves from more benign forms of influence, such as education and advertising, through propaganda and indoctrination—those areas that are seen as more questionable, and then finally to “thought reform,” the process that has been shown to take place in undue-influence settings—i.e., in cults or sects. The process of thought reform is one I will return to shortly; it is a concept the psychiatrist Robert J. Lifton pioneered in his seminal work (1961), Thought Reform and the Psychology of Totalism, about the Chinese thought-reform movement, and later in his analysis of the Aum Shinrikyo sect and its sarin gas attacks on the Tokyo underground in the 1990s.
As one moves along this continuum, the power dynamic between the influencer and the influenced becomes noticeably more pronounced. It is widely acknowledged that some of the most effective education not only conveys information from the educator to the students, but also seeks to actively empower the students to learn for themselves, to challenge and create new knowledge as part of the process. Advertising, in contrast, whose main aim is to convince us to buy something or participate in something, is less open and honest; here, the power relation becomes a little more pronounced, as does the level of deception.
Proceeding along the continuum, propaganda, whether from government or media, is more pronounced in wanting to exaggerate its argument and deceive the recipients of the message. Whether it is “change we can believe in” or “our broken society,” politicians seek to sway and influence on a large scale—the detail of truth is for individuals to pick out. But most of the time propaganda can take people along in the swell of the message, at least for a while. Indoctrination essentially takes propaganda a stage further, in ensuring that such belief systems become fully inculcated within the organisation or society, with members and other individuals as active agents who work consciously on behalf of the belief system. Notably, at this point on the continuum, there is still awareness of differences—even a degree of tolerance, perhaps—in contrast to the final category of thought reform.
So how is thought reform defined? Margaret Singer and Janja Lalich have described the “tactics” of thought reform as falling into three categories, which “are organised to”:
“Destabilise a person’s sense of self.”
This means that the pre-existing identity of the person who joins the group is either broken up or subsumed by the more powerful cult identity. This identity is obviously very different for those who are raised in cults—what is referred to in the field as the “second generation”—who do not have a fully formed pre-cult identity.
“Get the person to drastically reinterpret his or her life’s history and radically alter his or her worldview and accept a new version of reality and causality.”
A new self identity is forged, which is based on a clear set of beliefs—an ideology, which is usually fairly radical within the relevant frame of reference (religious beliefs, political beliefs, etc.), and which involves dispensing with or at least dampening out any discordant previously held beliefs. A clear “us and them” dichotomy is typically reported to have developed, with cult members clearly seeing themselves and their group as in an elevated position with regard to the rest of society.
“Develop in the person a dependence on the organisation, and thereby turn the person into a deployable agent of the organisation.”
I will return to how this dependency is forged psychologically, and how the normative actions of cult members, whether they are selling a paper or setting off bombs, are reinforcing of the powerful group identity that has enveloped them.
Singer, in describing these three aims of thought reform, draws on and cites the work of Robert Lifton, whose eight themes of thought reform describe how members of cults become unduly influenced and effectively trapped psychologically. These themes are reminiscent of the work of other psychiatrists, such as Gregory Bateson (1956), whose concept of the “double-bind” was applied usefully by Laing and others to explain the psychopathology involved in schizophrenia as one that (in Laing’s case, the family) revolves around a nexus of influences. To be clear, then, a thought-reform group or cult is a group that in essence displays symptomology akin to a group form of psychopathology; it is literally a group that is sick in its processes, power dynamics, and the effect it has on some of its members. This is not to say that any or all members of cults necessarily have any form of psychopathology, although many ex-members have tested as showing higher than normal levels of depression, anxiety, dissociation, and personality changes, including sometimes changes reminiscent of psychotic conditions. Individual differences abound, as we will see later on with regard to the research that has been undertaken in this area; but for now, Lifton’s themes are a useful description of what takes place in undue-influence settings.
Milieu control is the total control of communication in the group; this can include spying by others, often leading members of the group, and the effective control of the social environment by the group leader or leaders. The role of the group leader is particularly important as the unique bearer of the sacred word of, usually, a god—if the group leader is not himself defined as some type of god. This theme describes a hierarchical model typical of groups that exhibit high levels of control and require complete obedience from their members or followers, including, in this case, a strict adherence to daily rituals that effectively control or frame the entire waking day and limit the amount of sleeping time or rest that persons can have. Milieu control was clearly evident in groups such as the Peoples Temple at Jonestown, but also in cults such as Heaven’s Gate, whose community was cut off from the world around it.
The spiritual founder or guru is one step removed and thus harder to challenge, and this is part of the overall milieu control. If members doubt the words of the leader, then it is because they are not working hard enough for the group, or are not spiritually connected enough. This has been clearly reported in groups such as the Sai Baba organisation, in which Sai Baba purports to have god-like powers and is hence removed from situations in which his godlike omnipotence can be challenged. However, the milieu control is essentially circular and leads to a pattern of behavioural response that is akin to Bateson’s double-bind position, as already mentioned. If members feel unhappy, it is because of their own failings, not the group. If they disagree with the group, it is because they have not studied hard enough. The more someone tries to pull away because they feel trapped, the more they are pulled back in and the more trapped they become—a classic exemplar of double-bind.
Loading the language, Lifton’s second theme, describes how the discourse of the group is allied to the milieu control that has already been identified as a way of binding a person’s identity to the group. A “group speak” is learned and replaces previously used language. In this way, an ideologically charged form of language is honed that reinforces the dichotomy between the group and the rest of the world. Negative terms—often, in religious groups, terms that describe the devil—are applied to nongroup members. This language, repeated and paraphrased on many occasions in the group (often in lectures or classes), serves to constrict beliefs further and reinforce the critical belief that the only true path is that of the group and of its spiritual founders. Here again the double-bind is present: If a member disagrees, then that is potentially the path of the devil; and of course this must be avoided at all costs (hence, a circular reinforcing of the group position).
The third theme from Lifton is the demand for purity that cults require from their members; this is linked closely with the practice of confession, his fourth theme, whereby failings and faults are confessed to the guru or leader, who then sets the person on the path to redemption—the path of the group and its founders. For example, members are expected to judge themselves against the all-or-nothing standard of the group (again, the us-versus-them standard). This inducing of guilt and shame is therefore closely linked to the demand for purity, wherein only the pure and internal code of the group can lead individuals away from these feelings. Again, they are trapped in a double-bind in which, if they resist, then they are judged as having serious problems that will get worse (including worse feelings of guilt and shame) without the teachings of the group; thus, there is a circular compunction to continue with the teachings.
Lifton’s fifth theme is termed mystical manipulation. The manipulation is termed “mystic” because it usually is not based on natural scientific modes of investigation or evidence, but instead appears in ways that either are not fully understood, or perhaps the guru has the nearest insight of anyone. Indeed, revelatory insights appear almost from nowhere, apparently spontaneously; but they are, of course, carefully preplanned by the group leadership. Part of the overall milieu control that involves the group and its leaders is to show members how new feelings, including redemption, can arise as a consequence of the setting and the spiritual teachings. In other words, to be here in the group is to experience this new freedom and ability to rise above where others cannot go; in this way the us-versus-them dichotomy is once again reinforced.
Closely related to mystical manipulation is the sixth theme of sacred science, wherein the group leader invests himself with the highest level of sacred knowledge and insight—knowledge that is both mystical and claimed to offer unique insight into the science of the world and the universe. How can a group member, already dependent on the group and its leader for redemption, dare to challenge the bearer of such awesome insights?
The final two themes of undue influence and control that Lifton sets out he has termed doctrine over the person and dispensing of existence. The group effectively dispenses with the person’s earlier existence, career, or hobbies and instead provides a complete and alternate destiny that is absolute and unwavering. In this way, the person is also submerged by the doctrine, not only through the confession and admission of past sins that only the teachings of the guru and group can put right, but also through the need to commit absolutely to beliefs and teachings of the group that are described repeatedly as the only path for redemption and protection.
The most painful examples illustrate this process the best: The Heaven’s Gate cult eventually took part in a group suicide, believing, passionately, that when they died they would be transported to the space ship that was arriving for them as their leader foretold. That leaders of the group “helped” children and other members to die also illustrates the complex pattern of influence in such a claustrophobic and highly charged situation—how much dissonance really existed? How much additional “persuasion” did members need in order to take their own lives? How many would have lived had they been given anything of a choice? As with the Branch Davidians, there is evidence, as probably there is in all cults, that there were varying levels of ideological and group commitment amongst members. But the processes of thought reform can reward the doubters, sometimes with an added sense of urgency to commit more to what they fear may be their only chance and path to redemption. Many researchers and authors in this field have described well how the break point comes for individuals; one of the clearest descriptions is cult expert and counsellor Steve Hassan’s account of how his own involvement in the Unification Church or Moonies was broken by a car accident that gave him an opportunity to think outside of the constraints of the immediate group milieu and influence.
Many case studies and pieces of research have both demonstrated and operationalised the Lifton themes. Indeed, the Group Psychological Abuse Scale (Chambers, Langone, Dole, & Grice, 1994) has been validated across a number of settings as an objective measure of the level of undue influence that Lifton’s themes define.
From Lifton’s Themes to Related Psychological Processes
This lecture turns now to how these themes of thought reform, of the social psychological processes in a cultic group setting, can themselves be seen to be related to the psychological processes in the person in that setting. If, as was the stated intention earlier, we are to understand what happens to and in the person in extremist groups—including why someone joins at all—then we need to understand phenomenologically what happens within the person’s psychological make-up. In this way, I tend to view the Lifton themes as an articulation of the social forces and influences that have an impact on the person; but they are not (and ontologically cannot simultaneously be) an articulation of the psychological processes within the person.
In this way, we need to appreciate the vast distance that social psychology travelled through the second half of the 20th century as it embraced what has been referred to as the European School of Social Psychology. Amongst its leading figures were psychologists such as Solomon Asch, Jerome Bruner, Serge Moscovici, Michael Billig, Gustav Jahoda—and most notably for tonight’s lecture, the work of Henri Tajfel, who, with his colleague John Turner, created the whole field of work around Social Identity Theory and Self-Categorisation Theory in the 1960s, 1970s, and 1980s at Bristol University, and then far beyond. Their shared initial commitment—and that of many others—was to understand why it was that some people had embraced the extremism of Nazism during the 1930s and 1940s, and what causes ordinary people—not psychopaths—to believe in stereotypes about certain social groups, and to act according to those stereotypes with damaging prejudicial actions and judgements.
In a more modest way than these leading figures of my discipline, these have always been my concerns, as well. My own doctoral work was on the subject of the psychological processes involved in social category salience—the psychological process at the heart of stereotyping and prejudice. However, I am arguing this evening, as I have done extensively elsewhere, that this important body of work can also give us critical insights into why and how individuals join and become active agents for the type of extremist cults that Lifton and others have so vividly described. Wherever you find a cult, after all, you find an awful lot of stereotyping and prejudice from the members within it. The “us” and “them” labels that Lifton outlines are clearly part of the stereotypical or prototypical group beliefs and norms, but they also reflect how the group is self-categorised, as we will see.
So the breakthrough of Social Identity Theory was to demonstrate that the social world does not just impact upon us, as was advanced in traditional social psychology (as in the work of Floyd Allport, for example). This has sometimes been labelled as the “billiard-ball” approach to social interaction—the view that the social will impact on the individual, and the individual may be moved or influenced in a particular direction; but overall the individual is unchanged and can move on, psychology all intact.
Instead, the key advance was to show how the social realm is part of the individual—it exists inside of all of us, as part of our cognitive system of categories. Tajfel’s work is clear in showing that social or group identity, then, is a regular part of a person’s self-concept or self-identity (Tajfel, 1978). It has been demonstrated in countless experiments how categorisation, along group lines, can take place extraordinarily quickly and without any normative content, including without any interpersonal attraction. So Billig and Tajfel (1974) showed how splitting people who didn’t know each other at all into two groups, based on a less-than-relevant categorisation—preference for Klee or Kandinsky paintings—was enough to lead to group identification and prejudice in what have become famously known as the “minimal group experiments.”
Of course, most groups are not minimal in this way—cults least of all; and that interpersonal attraction is not necessary for group identification has been vividly demonstrated in other experiments in this field. The formation of the “psychological group,” then, is a core concept from Social Identity Theory, which John Turner took and advanced further in the Self-Categorisation Theory (1987).
This body of theory and research demonstrates clearly how the same processes of categorisation that allow us to recognise classes of objects—chairs, lecterns, apples, spiders—allow us to also recognise social groups. It is this process of deductive reasoning in our perceptions that allows us to use and apply what we already know—often referred to as prototypes in cognitive psychology—to the stimuli we face every day. We do not have to inductively relearn what a chair is every time we see one. In the same way, students do not have to relearn what their teacher or lecturer is there for each time they walk into the classroom in the morning. Social categorisation allows for cognitive short-cutting, and it is also what explains stereotyping—i.e., the group norms that a group is best known for. A stereotype may have a “kernel of truth” to it (and this varies, of course); and certain attributes become the group norms, accepted by group members as part of their self-defining group concept, and accepted by others outside. The basis of stereotyping is best expressed by the concept of the metacontrast ratio, which is the ratio of perceived intergroup difference to intragroup similarity. So the more cohesive the group, the more certain of its beliefs its members are, and the more they carry out those beliefs as group norms; and the more the group is perceived as different from other groups, the higher the metacontrast ratio. Penelope Oakes refers to this position (as part of the Self-Categorisation Theory) as the “separateness and clarity” of the social categorisation. And when one thinks of extremist groups and cults, it is fairly clear that they fit this overall description very easily: a highly cohesive group whose members see themselves, and are seen by others, as very, very different from everyone else.
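The metacontrast ratio lends itself to a simple numerical illustration. The sketch below is not Turner’s or Oakes’s actual operationalisation—the function, the groups, and the attitude scores are all hypothetical—but it captures the basic logic: the mean perceived difference between groups divided by the mean difference within the group, so that a tight, polarised group of the kind described above scores far higher than a loose, heterogeneous one.

```python
from itertools import combinations, product

def metacontrast_ratio(ingroup, outgroup):
    """Toy metacontrast ratio on a single attitude dimension:
    mean inter-group difference divided by mean intra-group difference.
    Higher values indicate a 'clearer and more separate' categorisation."""
    inter = [abs(a - b) for a, b in product(ingroup, outgroup)]
    intra = [abs(a - b) for a, b in combinations(ingroup, 2)]
    return (sum(inter) / len(inter)) / (sum(intra) / len(intra))

# Hypothetical scores on a 0-10 attitude scale:
# a cohesive, polarised group (tightly clustered, far from outsiders)
# versus a looser group that overlaps with the people around it.
cult_like = metacontrast_ratio([9.0, 9.2, 9.1, 8.9], [2.0, 3.0, 4.0])
loose = metacontrast_ratio([5.0, 7.0, 3.0, 8.0], [4.0, 6.0, 5.0])
print(cult_like > loose)  # → True: the tight, distinct group scores far higher
```

With these illustrative numbers the cohesive group’s ratio comes out more than an order of magnitude above the loose group’s, which is one way of making Oakes’s “separateness and clarity” concrete.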
Self-Categorisation Theory draws on the work of Eleanor Rosch to explain the “basic level” categorisations that humans make all the time to allow them to recognise the chair and the table—and all you lost souls out there and those of us in my church who are the chosen ones to save mankind from apocalyptic disaster! Jerome Bruner’s pioneering work on different perceptual categories and their differing levels of cognitive readiness allows us to recognise that, when Tajfel talks of the “emotional and value significance” of a social group to our identity, he is giving the cognitive some normative core. This work also allows us to recognise that, for those in extremist groups, the level of emotional and value significance is far, far higher than most of us can possibly imagine. The best analogy I can think of is to describe the feeling—and most, but not all of us experience this—of the first time you thought you were really falling in love… and then you broke up. That is somewhere close to the emotional and value significance that cults seem to hold for some of their members. The group is literally everything to them; it is their whole life.
Now clinically, one can explain why this might not be psychologically the best or healthiest way to live. But cognitively, too, this appears to do something strange to the normal way in which people are able to move quickly and seamlessly between the different categories that are part of them. OK, some categories may be higher up the perceptual repertoire than others—more “perceptually ready,” to use Bruner’s term; this may also be caused by the available social stimuli—i.e., whether the context is relevant to the group. So when I am watching my football team Nottingham Forest play, along with the normal feeling of sadness, I am also psychologically less likely to be thinking as a member of my political party, for example.
In cults and extreme groups, the milieu, as Lifton and Singer/Lalich explain, is dominated by the group; it also appears that the group is dominant in terms of perceptual readiness. All of this adds up to a highly salient social psychological category, and one that is apparently dominant over most, and sometimes all others. So a separate model has emerged in my work, which I have labelled as the theory of “totalistic identity,” wherein the cult or extremist group identity appears to be so totally dominant—it is totalistic—that it effectively blocks out the normal cognitive movement between categorisations at different levels.
The usual repertoire of categories—other groups, and other likes and dislikes at the personal level—simply appear to be unavailable or less available. This model resonates with the many personal accounts of former members of cults or extremist groups who recall how they had forgotten their precult identities; and after they left the group, they had to remember and relearn them. It also resonates with the startling differences that family members and friends see in their loved one when he or she is in a cult-like group: All that the person previously identified with—the local golf club, membership in a drama group, or other hobbies—seems to be subsumed, cut off, cognitively and psychologically.
Most of the time, however, these descriptions are not how current members of such groups think or feel. Their complete faith and belief in the group and its leader feels completely normal to them—although this trust can and does wane, and questions do appear for some but not for others.
Explaining Social Influence
The influence process that this theoretical perspective describes is also very different from traditional, more atomistic, billiard-ball accounts of social influence. Again, instead of seeing influence as a process entirely of one person having an impact on another (or a group having an impact on a person), the social-identity tradition, by virtue of understanding how the group is part of the self, sees the group influence as essentially self-referential, as well. So when group members sell their papers, raise money, persuade people to come to their events, sell their houses and give the money to the group, and so on, they do these things because the action reinforces the group identity that has become such an important part of their self-identity. This is what Turner calls “referent informational influence,” and it is a decided break from the dualistic notion in earlier theories, such as that of Deutsch and Gerard (1955), which suggested that a belief either would be internalised and believed in (informational influence) or merely played along with because of external pressures and norms or expectations (normative influence).
Of course, a group member with referent informational influence is under a lot of pressure, but the pressure is essentially self-inflicted because the group norms have been internalised to become part of who the person is—the “psychological group,” to use Turner’s phrase. Every group act or behaviour backs up that big part of you that is the group. In fact, “doing” the group—acting as a prototypical good group member—carries with it benefits in terms of collective self-esteem (the emotional and value significance again), and at the same time further reinforces the group identity. This process leaves the person in the position of identifying more with, and being more dependent on, the group than ever, and desiring still further the gratification that comes from being the best that the group is and could possibly be. Once more, this is the circular double-bind.
So when you are in the cult, whom do you let down if you don’t do what the cult leader asks? Not just the cult leader. You also let yourself down because there is no more important exemplar of yourself than the group—including its most powerful embodiment, its leader. Cults therefore appear to demonstrate referent informational influence par excellence—they take it to the infinite degree—and in the case of violent terrorist cults, they do so with the tragic consequences that are inflicted on so many.
Now the image this evokes is itself the stereotype of the zombie, brainwashed cult member who cannot tell right from wrong—and sometimes this undoubtedly may be the case. Often, however, ex-members speak of the many struggles they went through inside themselves while they were in the group, deciding what they should do at various moments. But while they remained in the group, the group identity continued to be dominant or relatively dominant. When I speak to families about loved ones who appear lost to them in groups of this kind, it is important to help them recognise that, from time to time, and sometimes quite often, doubts will surface in their loved ones; ideas discordant with the group will emerge; the next group demand might feel like a step too far. All of these are junctures at which the dominance of the group identity is vulnerable to intrusion, and all are opportunities for members’ normal range of identifications and categorisations to kick back in. For many, that floodgate is eventually opened. For others, it appears to be permanently closed; but who knows? There are examples of people walking out of cult-like groups after 50 or 60 years.
Evidence in Support of the Totalistic Identity Theory
So far, I have presented the Totalistic Identity Theory to you as an extension of Self-Categorisation Theory and as a corollary to the work of Lifton and others, to explain on a cognitive and psychological level the phenomenon of undue influence and of cult identity and influence. In the final section of this lecture, I will briefly turn to some of the evidence that we have so far brought to light in support of this theory. I say we because at this point I must acknowledge and pay tribute to a research team that spans different continents. This research has been done in partnership with many colleagues over the years, most importantly with Dr. Paul Martin, the founding Director of the Wellspring Retreat and Resource Center; Linda Dubrow-Marshall; and Ron Burks; but also with Carmen Almendros, Lindsay Orchowski, Peter Malinoski, and Lois Kendall. There are also others.
It is important to acknowledge the very important work at the Wellspring Retreat and Resource Center in Ohio, USA, which is the only residential treatment programme of its kind, dedicated to the recovery of former members of abusive groups, or of cultic and other abusive relationships with the same qualities. Using the Lifton model as its theoretical basis, Paul Martin and his colleagues have crafted a unique treatment programme that is highly effective at treating a complex pattern of psychopathology in ex-members, which can include a mix of depression, anxiety, dissociation, and other symptoms, including aspects of post-traumatic distress. Every ex-member has a unique set of problems—as in all clinical work—but there are also commonalities and patterns. More than eight years ago, Paul Martin and I decided to look in more detail at these patterns to see whether we could discern a model of the psychopathology amongst this population that related to the group experience they had had.
To do this, we concentrated in part on whether the group-focussed treatment programme was efficacious—and all the results consistently show that it is. But at the same time, we used the measure of Group Psychological Abuse (GPA) developed by Chambers and colleagues to gauge the perceived level of former group abuse. Now it is also important to acknowledge that the sample of ex-members that visits Wellspring is self-selecting, and of course they are all ex-members. It has so far proven very difficult in this field—particularly when it comes to psychometric tests—to get measures of psychopathology from current members of extremist groups. Maybe that day will come, but understandably the groups rarely allow researchers anywhere near their members with such instruments!
So this research has this important design characteristic, which means that we are always looking to establish, post hoc, whether there appears to be a unique pattern of group-based psychopathology for this particular sample. This does not mean that all group members have this pattern of harm. To quote Michael Langone, Executive Director of the International Cultic Studies Association, “some cults hurt some people some of the time.” We can tell you about some and not about others; about the others, we can make no comment. Nor can we tell you precisely, when we observe psychopathology amongst ex-members of cults, how much of it, if any, predated the group experience. However, this does not render meaningless what we do know and can discern about the group-related aspects of the psychopathology we have measured, in the same way that we do not discount the trauma and psychological effects of an earthquake or a train crash just because no psychologist was able to assess the victims beforehand to see whether they already had any of the same symptoms. At any rate, there are tests that we employ that start to tease out what might not be group related, and the same is true in the analysis of disaster survivors. A level playing field is all that is called for here.
In 2003, in order to operationalise aspects of Lifton and Social Identity Theory, I, together with Paul Martin and Ron Burks, designed the Extent of Group Identity Scale (EGIS). This builds on a measure I developed in my doctoral research, which measures the extent of group identification or social category “accessibility,” to use Turner and Oakes’ term, and the extent of “perceptual readiness,” to use Bruner’s term (upon which Turner and Oakes devised the notions of different levels of accessibility). Put simply, EGIS asks former members to think back and answer how much they identified with the group when they were still members. It asks, amongst other things: How important was the group to you? How much time did it take? How valued did it make you feel? How unhappy would you have felt about leaving when you were still a member? This is a value-neutral measure of how important the group was, and an attempt to gauge the extent to which the group had become psychologically dominant, in the way that the Totalistic Identity Theory hypothesises. If the GPA measures the group environment—the milieu—then together these two measures are an attempt to capture the category accessibility and the category fit (the extent of group-relevant stimuli in the comparative context) that define, in Self-Categorisation Theory, the psychological salience of the social category or group.
We predicted that these two measures would be positively related and would also demonstrate positive relationships with the clinical measures. In particular, we hypothesised that the extent of group identity and the extent of group abuse would predict levels of depression and anxiety, and also, importantly, levels of dissociation. The predicted relationship with dissociation is particularly important as it relates to the existential loss of a dominant group identity and the associated loss of the former group- and personal-level identities that preceded it. All in all, these measures should tell us how much the group filled up the person to the exclusion of other identities, and how much of a psychological hole was left when the person was no longer a member.
Now, by way of a conclusion, here are some statistics! This is not as bad as when I taught statistics on a Friday afternoon or a Monday morning; but I realise it’s getting late, so I will be brief in highlighting a few key findings.
In a large overall sample of more than 567 former members, we have found significant relationships between EGIS and GPA, and between EGIS and measures of depression (the Beck Depression Inventory), dissociation (the Hopkins dissociation screen), and anxiety (the Symptom Checklist 90 Revised Global Severity Index; Derogatis et al.). These data indicate that the extent of identity—the level of category accessibility, how totalistic the identity is—predicts levels of depression and dissociation afterward for the former members. It is important to note that these are the first significant quantitative results of their kind anywhere in the world to show that real-life ex-members of real-life groups might have the type of social psychological structure that Totalistic Identity Theory predicts, with clinical symptoms that are predicted as the corollary of such a cognitive structure.
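The pattern of correlations just described can be illustrated with a toy simulation. To be clear, the numbers and variable names below (egis, gpa, bdi) are entirely synthetic and illustrative, not the real dataset: the sketch simply shows that if a single dominant group-identity factor drives both the group measures and the symptom scores, the measures come out positively correlated, as the theory predicts.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 567  # matches the reported sample size, but the scores are synthetic

# Toy latent structure: one "totalistic identity" factor drives
# both the group measures and the clinical symptom scores
latent = rng.normal(0, 1, n)
egis = latent + rng.normal(0, 0.8, n)  # Extent of Group Identity Scale (toy)
gpa = latent + rng.normal(0, 0.8, n)   # Group Psychological Abuse scale (toy)
bdi = latent + rng.normal(0, 1.0, n)   # Beck Depression Inventory (toy)

r_egis_gpa = np.corrcoef(egis, gpa)[0, 1]
r_egis_bdi = np.corrcoef(egis, bdi)[0, 1]
print(round(r_egis_gpa, 2), round(r_egis_bdi, 2))  # both clearly positive
```

In the real research, of course, the correlations are estimated from former members’ actual scale scores; the simulation only makes the logic of the prediction concrete.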
Furthermore, we have reported significant treatment effects, from intake to discharge, for both depression and dissociation—the Wellspring treatment works for this population. A final twist that I must share with you is that the treatment effect appears to hold only for that part of the psychopathology that relates to the extent of former group identity. Analysis of covariance reveals that if you take out the part of the variance that is shared between EGIS and depression (the relationship), then the treatment effect on depression disappears. The same is true for dissociation. What this further indicates, then, is that there is a specific group-related psychopathology that responds to the group-related treatment programme. The group-related psychopathology emerges, then—again for the first time—as a quantifiable outcome of this research. These results simultaneously rest on and back up the emergence of a theory of totalistic identity as an explanation, in social psychological terms, for the specific type of psychopathology that results, for some people, from these specific types of group settings.
In conclusion, then, where does all this get us? Perhaps closer to an understanding of how normal cognitive processes of categorisation can go awry; of how the normal formation of psychological groups, and movement between them in terms of psychological salience, can become restricted to, and dominated by, one specific identity that is extreme within the frame of reference. What I haven’t had time to address in detail tonight within this field of research is how the most dangerous cults and extremist terror groups occupy a highly polarised position within the social context, one that exists psychologically as the person’s dominant self-identity—a very clearly defined and extreme us-and-them categorisation.
The type of research I have outlined this evening, and the emerging theory of totalistic identity, offer the possibility of a greater understanding of how individuals are psychologically drawn and converted to extremist groups and cults; and they offer possible remedies and approaches to prevent that from happening. They also give us insights into how we can intervene to depolarise and intrude cognitively into the totalistic identity structure of cult or extremist group members. And they offer hope of a path to recovery, by showing that a tailored treatment programme is specifically effective with this type of group-induced psychopathology: We need more Wellsprings—including one in the UK!
It is striking, too, to observe how the normal processes of deductive reasoning can, if unchecked, lead to the kind of extremist identity structures that can do so much harm to so many. It is clear then that no one is born a cult member or a terrorist—some may be more vulnerable to developing this identity structure, but no one is immune. So is there a psychological vaccine to protect us from such a pathology? Perhaps it is part of us already: the ability to use inductive reasoning, to search carefully for answers, to not jump to conclusions, to be reflexive about identifications and prejudices—and, to quote Herb Rosedale again, to ensure that our “judgements should rest on careful analyses of structure and behaviour within a specific context, rather than superficial classification.”
I am not saying this evening that people should not be able to make mistakes, to join groups and believe things that I believe to be ridiculous and even abhorrent. But human rights are not divisible; and just as I defend the right of individuals to join groups I don’t agree with, I also defend a person’s right not to be harmed or to harm themselves, and I defend our right to stop someone from hurting others. Influence—at its core our psychological processing and understanding of the world around us—is everywhere around us; we see it, breathe it, and, most importantly, we think it—it is us. Without influence we wouldn’t be here; we wouldn’t have grown up. Without it we wouldn’t make mistakes; but we also wouldn’t be able, with reflection, to move on to better things—the process of education that we are all familiar with here. We must learn to treat influence with respect, to realise its pitfalls as well as its strengths; it is our friend and it is our enemy. Let us acquire the wisdom to recognise the difference.
References
Aronoff, J., Malinoski, P., & Lynn, S. (2000). Are cultic environments psychologically harmful? Clinical Psychology Review, 20(1), 91–111.
Edwards, D., & Potter, J. (1992). Discursive psychology. London: Sage.
Hassan, S. (1988). Combating cult mind control. Rochester, VT: Park Street Press.
Laing, R. D., & Esterson, A. (1967). Sanity, madness and the family. Harmondsworth, UK: Pelican Books.
Lifton, R. J. (1961). Thought reform and the psychology of totalism. Chapel Hill, NC: University of North Carolina Press.
Martin, P. R., Langone, M. D., Dole, A. A., & Wiltrout, J. (1992). Post-cult symptoms as measured by the MCMI before and after residential treatment. Cultic Studies Journal, 9(2), 219–250.
Singer, M. T., & Lalich, J. (1995). Cults in our midst. San Francisco: Jossey-Bass.
Turner, J. (1987). Rediscovering the social group: A self-categorisation theory. Oxford, UK: Blackwell.
About the Author
Roderick Dubrow-Marshall, Ph.D. (Nottm), is Pro Vice Chancellor (Student Experience) at the University of Central Lancashire, UK. His principal research is on social influence, including the psychological effects of cultic group membership, influence in organizational settings, and the psychological processes involved in social group identity and prejudice. In 2006, he was awarded The Herbert L. Rosedale Award, jointly with Dr. Paul Martin, for their psychological research on undue influence. He co-founded RETIRN/UK in 2004, where he is a consultant helping individuals and families who have been adversely affected by destructive cults and other extremist and high-demand/manipulative groups, and he attends as co-representative of RETIRN/UK and correspondent to the General Assembly of FECRIS (European Federation of Centres of Research and Education on Sects). He is the Chair of the Research Committee for ICSA. (firstname.lastname@example.org) (http://www.retirn.com)