ICSA Today, Vol. 7, No. 3, 2016, 2-10

You Do Not Have to Be a Fool to Be Fooled: An Interview With Robert Cialdini, Updated  

Cathrine Moestue

What will make people yield their comfortable lives in the West and say Yes to recruiters who want them to travel to Syria and sacrifice not only their democratic rights but also their family ties and their lives for an unverifiable cause—a cause for which religious scriptures are twisted and suicide is reframed as an honor? Who in their right minds would say Yes to that?

We can think of the obvious self-serving reasons terrorist groups have to recruit new members, and how much they profit from making them comply, conform, and obey the rules of their doctrine. So it is much more intriguing to understand why people say yes to this deal. (Kenrick, Neuberg, & Cialdini, 2015)

Based on the past 70 years of research in social psychology, the answer to the question posed is that people yield to social influence to achieve one or more of three basic goals: to choose correctly, to gain social approval, and to manage self-image.

After the Madrid bombings in 2004, the International Cultic Studies Association (ICSA) arranged for an international conference on psychological manipulation, with a special focus on the similarities between the dynamics of terrorist groups and cults. Especially since the 1995 Aum Shinrikyo Sarin gas attack in Tokyo, the similarities between the two types of groups have been apparent to many investigators. The 2005 ICSA conference closed with a plenary session by Robert Cialdini entitled You Don’t Have to Be a Fool to Be Fooled.

Dr. Cialdini is one of the world’s leading social psychologists, Regents’ Professor of Psychology at Arizona State University, and author of Influence: Science and Practice (1984/2009). He has spent his life studying the mechanisms behind why people might say Yes to others and yield their power when they should have said No (hence, the title of his speech at the ICSA conference). His book Influence…, in which he laid out six principles of persuasion, is eloquent about the dangers of persuasive techniques in the wrong hands.

In his 2005 talk, Dr. Cialdini normalized how social influence might yield a Yes from individuals even if that response is not in their best interest, and how the isolation of recruits, whether in cults or terrorist groups, increases the power of these six mechanisms.

Before traveling to Madrid for the conference, I asked Dr. Cialdini if he would be willing to do an interview with me after the conference, and he kindly said Yes. I was a psychology student at the time, but my interest in and passion for understanding social psychology were already deeply motivated by my own experience in a cultic group 12 years earlier. I believe that understanding group psychology requires more than an intellectual interest in organizational psychology; rather, awareness is fundamental to understanding some of the most compelling issues of our times, such as terrorism.

The original interview was published in the Scandinavian Journal of Organizational Psychology (SJOP) in November 2005. So why revisit it now? Here are my reasons:

Not many know about Cialdini’s analysis, but in chapter 6 of Social Psychology: Goals in Interaction (2015), he analyzed Steve Hassan’s engagement in and disengagement from the Moonies to show how these same six principles can be used to counsel extremists back to society. For a full understanding of how Cialdini’s principles relate to cultic dynamics, I recommend reading that chapter. The sooner our politicians and more psychologists realize what cults are all about, the sooner they also will realize how much more can be done to combat terrorism and to help more victims of abuse.

To be fair, research into how individuals are recruited and radicalized into terrorism has been accompanied by an increased interest in cults and the tactics of manipulation. But the general discussion is still dominated by the model that points to external stressors in the political, socioeconomic, and cultural context of the radicalized individual. And even though 40 years of terrorism research has firmly debunked the notion that only crazy people engage in terrorism and has yet to reveal a meaningful, stable terrorist profile (Borum, 2010), some people still point to personality traits as an explanation.

It is important to emphasize that the explanations that point toward group processes and chance encounters with recruiters or other charismatic persons as responsible for drawing an individual into a radical group do not exclude individual demographics or personal predispositions. Such explanations only place the blame for the radicalization process outside the individual, focusing instead on the novel and stressful situation of the person’s meeting with a clever manipulator who has a cynical plan and the ability to carry out a deliberate attack on that individual’s integrity.

When I asked Cialdini about this in 2005, he said,

I think that the model that best fits how terrorists are recruited and convinced to adopt these behaviors … [is] not one that is based on their personality characteristics or deficits or special vulnerabilities. In the same way, I don’t believe it is true of former cult members who have been swept into cults … it is not because of these personality features but because of an authoritarian process that employ[s] these powerful rules for action in a way that is so persuasive that it can be extraordinarily successful in getting people to take action that, outside of that small group, they would never have dreamed to take. (Moestue, 2005)

The emphasis here is on the radical situation rather than the radical person. Social psychologists ask not who becomes radicalized, but rather how that radical change in values and identity is facilitated.

Six Principles of Influence

The six principles of influence that Cialdini talked about in his closing session at the ICSA conference in 2005 are as follows: reciprocity, liking, social proof, authority, scarcity, and commitment and consistency.

Note that Cialdini’s approach to persuasion discourages lying and promotes an ethical approach to building stronger and authentic longer-term relationships without the use of deception.

The excerpts from the 2005 interview that follow, together with added commentary, expand upon the original content to offer additional perspective on the topics discussed in that conversation. The order of questions and responses has been altered, and some revisions to the text have been made.

2005 Interview, Updated*

CM:* What brings you to this conference on psychological manipulation and cultic studies?

RC: Well, you know my research has always been on the psychology of influence and persuasion. I have recognized that influence and persuasion are a central part of the way that cultic groups operate in order to recruit and retain their members, and so I was invited to speak of the overlap between these factors.

__________________

*In the following text, the initials CM stand for Cathrine Moestue and RC for Robert Cialdini. Updated commentary within the interview is indicated by CM 2016.

I have been researching to understand how people are persuaded in all facets of their experience, and how they are persuaded into cults and persuaded to stay in those sometimes unhealthy and destructive environments.

CM: Do you think cults use those same tactics and principles that you have described in your book?

RC: I am convinced of it. Along with the respected Spanish psychologist Carmen Almendros and my US-based colleague Noah Goldstein, I have recently done some research, which shows that former cult members not only report being influenced by the same six principles, but they also report being influenced much more intensively in their cult experiences than other individuals in noncult groups.

CM: How are terrorists using the principles of persuasion against us?

RC: …[One way] they are using these principles against us is to recruit terrorists who are indoctrinated to believe that their ultimate goals will be served by taking actions that are reprehensible to the rest of the world. But inside that small group, inside that context, they can be made to believe that this is not only acceptable, but [also] commendable behavior for them.

CM 2016: As noted earlier, jihadist recruiters have a new 51-page handbook, A Course in the Art of Recruitment, attributed to Abu ‘Amr al-Qa‘idi (a pseudonym) (2009/2010), to guide them through the art of radicalizing. Abdullah Warius and Brian Fishman wrote about this handbook in the CTC Sentinel, a publication of the Combating Terrorism Center at West Point (Warius & Fishman, 2009). In our discussion on mechanisms, social influence, and the similarities between cultic groups and terrorist groups, I want to show how the cynical advice from the manual taps into Cialdini's principles.

Because surprisingly little research or analysis has been conducted on terrorist recruitment (Borum, 2010), this manual is a valuable resource for our understanding. More specifically, the manual shows how recruitment exploits our human need to gain social approval and to manage our self-image; its application of the principles of reciprocity, liking, and consistency is especially useful, as the following descriptions reflect.

Reciprocity

Make small gifts to your recruit. According to Abu ‘Amr, a recruiter should build a close, friendly relationship with recruits before raising political or ideological issues. The manual instructs recruiters to invite recruits for lunch, send them text messages, and give them gifts. Because every recruit requires personal attention, recruiters are told to target only two people at a time (as cited in Warius & Fishman, 2009). This advice taps into the obligation people feel to give back and reciprocate, which, when not manipulated, is a natural flow of give and take in relationships, teams, and organizations. According to Cialdini, people are more willing to comply with requests (for favors, services, information, and concessions) from those who have provided such things first. The need to belong, to connect, and to create intimate relationships is a universal human need; but it is also the one area in which most ordinary people feel deprived and dissatisfied. No wonder, then, that recruitment efforts appear to be concentrated in relationship areas.

One way recruiters manipulate this human tendency is to give much attention to a recruit in the beginning. In the manual, recruiters are told to spend much time studying the subject (i.e., the recruit) so they get to know everything about the person being recruited first.

They can study the victim in secret as preparation, or they can use listening techniques and show a keen interest in the subject, an approach often mistaken for love. Most people naturally crave such attention, and especially young people in the transition phase into adulthood. In cult literature, the strategy is often called love bombing or grooming.

When we think of grooming as it is used online, we tend to center on the stereotypical image of an innocent young girl meeting an older male sexual predator. This view makes it easy for us to see who is the victim and who is the offender. In recruitment for terrorism, however, this stereotype of grooming no longer applies. Recruitment is indeed grooming because recruiters will pose as the recruit’s friend when they are not, and they will be nice and spend lots of time with the recruit. Some will also send gifts, which taps into the most powerful principle, reciprocity. But all this is not because they love the recruit—it is because they are recruiting him.

Of course, no recruit believes he is being converted to migrate to a war zone and join a remorselessly murderous and radical group. He is “in love,” and in that state the clever recruiters plant fantasies into his head about “saving the world” or “saving all the Muslims” or “all the starving children.”

Liking

The manual instructs recruiters to praise the new recruit for behavior that supports the cause, to share her “joys and sadness” in order to draw closer to the target, and to focus on the basics of Islam without mentioning jihad. This method manipulates the natural tendency we have to like people who are similar to us, and who like us.

According to the manual, the first 3 weeks with a new recruit should be spent “being nice”; when the relationship is built, then the recruiter can introduce the concept of heaven and hell. Abu ‘Amr argues that the concept can be used as a powerful motivator, explaining that radicalization “normally happens to those who fear the torment of the afterlife and who come to know that jihad is the salvation from eternal damnation. The result is that jihad is desired and craved” (as cited in Warius & Fishman, 2009, p. 28). Another way to look at the powerful concept of heaven and hell is to see it as an emotional manipulation using the real threat of rejection. The recruit is now isolated from her family, which makes her more dependent. As humans, we fear social rejection or social disapproval, so we may go to great lengths not to cause conflict with people we depend on or view as friends. We want to be accepted and loved. This basic need is manipulated and is even recommended behavior in the manual.

Consistency

How can a person be manipulated to do something he doesn’t actually want to do? This can happen through what is often called the foot-in-the-door technique. In Influence…, Cialdini discusses the universal human need to be consistent with what we have already said and done. In other words, our need to have a good self-image taps into the principle of consistency. The …Art of Recruiting advises recruiters to break the ice by talking about Palestine, “an issue on which there is no disagreement” (as cited by Noah, 2009), and then to work their way up to explaining why “democracy and parliamentary activities” are incompatible with Islam (Noah, 2009). In other words, start with something the recruit has already agreed to, and build from there.

Radicalization is often discussed in terms of the assumption that people get exposed to certain ideas, certain kinds of thoughts, and that these cognitions and feelings put them at greater risk for involvement in terrorism. But even if this assumption sounds intuitive, former members of terrorist cells and groups tell a different story: Influence and persuasion do not have to occur via cognition to be effective; all that is required is a small change in behavior or in the environment. That small change can be accomplished by the foot-in-the-door tactic, such as an appeal for humanitarian aid to victims in the war zone. Once recruits respond, they are likely to make later choices that confirm their first one.

CM: If we use scarcity of information, one of the principles from your book, as an example, how would using that principle be different in a business setting than in a cultic group?

RC: I will give you an example of an advertising campaign from one of my clients, the Bose audio corporation. They had a new product called the Bose Wave system. They first built a campaign around the concept of new—new elegance, new sound, new design, et cetera. They were very proud of their new product, but the advertising campaign was a disaster. No one was buying.

So what we did was talk to them about the power of loss language, which tells people not about what they stand to gain by moving in the direction of their product, but what they stand to lose if they don’t, because people are more motivated to get those things that they can’t have. And loss is the ultimate form of scarcity. So we had them change just five words in the advertising campaign. Instead of saying new at the top of the advertisement, we wrote, “Hear what you have been missing.” So now, potential customers recognized that they had been missing new performance, new features, new simplicity, et cetera; and … changing those five words produced a 45% increase in the market. So it is possible to take something like the principle of scarcity and incorporate it into five words of the language that we use and produce remarkable differences in the way people respond to the very same item.

CM: How could this principle be used to recruit a cult member?

RC: What cult leaders often tell new members or prospects is that

We have the one way, which is available to you to find true salvation, or true happiness, or the one route to political freedom, or one route to best achieve your social goal. And if you don’t come with us, you will lose that one opportunity.

They will make that claim, that they are the only ones who possess this information that will allow people to reach their goals, and that people will lose that chance if they don’t take that route.

CM 2016: Normally, when we try to persuade someone, we appeal to what they might gain or save by moving in our direction. But as Cialdini points out, research on social influence tells us that framing something as a loss will tap into the principle of scarcity, and that is highly motivating. Recruiters know this, and it is reported to be a main strategy to move recruits when they are uncertain.

Steve Jobs is famous for having asked John Sculley (then the president of Pepsi-Cola), “Do you want to sell sugar water for the rest of your life, or do you want to come with me and change the world?” This question was also tapping into the principle of scarcity, using loss language by pointing to the fact that Sculley might miss out on something important. Needless to say, John Sculley left Pepsi-Cola and followed Jobs to Apple.

The same strategy is evident in many recruitment tactics online. Recruitment videos appeal by saying, “Give up your fat car, your family, and come to Jihad for the sake of Allah!” When people are uncertain, telling them what they will lose if they fail to move taps into the principle of scarcity in a powerful way.

CM: Why do we have so much resistance to seeing ourselves as vulnerable to these influencing tactics?

RC: We don’t want to believe that we are so influenceable, I think. We have done a study at my university on the extent to which students were influenced to change their attitudes by certain kinds of advertising that featured an authority spokesperson speaking on behalf of some products. Some of these ads hired an actor to wear a doctor’s coat and a stethoscope, for example; and sometimes they would hire a celebrity who played the role of a doctor on TV to be a spokesperson for some kind of pain reliever, and so on.

When we showed students these ads, they were significantly more persuaded by these phony expert spokespeople than by ads that didn’t include [them]. But they didn’t want to believe that they were persuaded by it, … that they were so easily influenced by weak evidence, by something that was clearly bogus. If they had just thought about it, they would have known that this actor doesn’t possess any special medical knowledge—why should they believe what he says about a pain reliever? But they did.

Because these principles that we talk about are the shortcuts that we all are required to use in modern life in order to make our decisions. We live in what is unquestionably the most information-overloaded and stimulus-saturated environment that has ever existed on the planet. We need our shortcuts in order to make our decisions.

What the cults do is to take that information overload, that stimulus saturation and cognitive overload, and intensify it beyond what we normally experience outside of the cult. And that makes cult members even more susceptible to these principles because they have to rely on their shortcuts, because they are not able to think in a concerted, conscientious, and systematic way.

CM 2016: We often avoid the pain of uncertainty by looking outside of ourselves toward role models (authorities) and peers (social proof). Recruiters can structure context so there appears to be even more uncertainty than troubled youths were experiencing previously. This environment enables the recruiters to manipulate and coach young recruits toward destructive behavior (unlike true leaders, who coach toward prosocial behavior). The recruiters fake both their expertise and the available social-support groups; as long as the recruits’ perception is that these elements are real, that perception becomes their reality as victims in the radicalization process.

An interesting example of how we might miss these social forces arises when we discuss the radicalized youth of Molenbeek. Frequently we blame government policies, failed integration efforts, or the youths’ low social status for their radicalization. But it turns out that the radicalized youth were actively recruited. Philippe Moureaux, who served for two decades as Molenbeek’s mayor, described this as “the paradox of integration.” A less-integrated Turkish community has resisted the promise of redemption through jihad offered by radical zealots. Yet the members of a Moroccan community who are more at home in French-speaking Brussels have seen some of their young fall prey to recruiters such as Khalid Zerkani, a Moroccan-born petty criminal who became the Islamic State’s point man in Molenbeek (Higgins, 2016). This is a powerful example of how, when they are uncertain, most people will look at what others just like them are doing. Here, the proof of a correct choice isn’t based on knowledge or logic or empirical evidence; it’s based on social evidence of what one’s peers and those in one’s social network have decided to do. Cialdini (2010) notes that deceptive recruitment or fraud of this sort is hardly limited to one ethnic or religious group. For example, Charles Ponzi, who gave his name to the infamous Ponzi scheme that Madoff copied, was an Italian immigrant to the United States who fleeced other Italians.

CM: Most people think that cult members are foolish somehow, or very naïve. Do you think that is right?

RC: No I don’t think that is right. In fact, the title of the talk I gave in the conference is You Don’t Have to Be a Fool to Be Fooled. We are all taken in by these principles to purchase products, to vote for candidates, to contribute to causes. These principles reflect the way we work as human beings in modern society; we can’t simply abandon them. They are the way we decide frequently who[m] to say Yes to and who[m] to say No to. So true authorities are to be followed—it makes sense to do so. It would be foolish to abandon the idea that experts provide important diagnostic information about how we should comport ourselves in a particular situation; we can’t just abandon the idea that experts are a good source of information. So we are all susceptible to them [the deceptive users of these principles for their own motives].

CM: You mentioned cognitive overload in your lecture yesterday, and said this was the one thing that made cultic experience a little bit different from the type of influence of other groups, although the use of the principles remains the same.

RC: Yes; in cults the environment is controlled in two ways. One is that it is entirely full of cultic activity and ideas, and there is really no time for anything else… The other way [is that] all the principles can be used constantly. The one thing that separates cultic groups from other influencers such as advertisers, marketers, and fundraisers who use these tactics is that the influence environment in cults is pervasive; it is constant and there is no escape from it. That’s why in the data we collected recently we find that former cult members, more than any other group members, report more intensive exposure to each of the principles in my book.

CM: You have a lot of knowledge about influence. Do you think it is at all possible for you to be recruited into a business cult, a pyramid cult, or a religious cult?

RC: This is a good question because when I did the research for my book, one of the things I did was to infiltrate the training programs of as many of the influence professions in society as I could get access to. I learned how to sell automobiles, portrait photography, insurance, encyclopedias, et cetera. … I also infiltrated some fund-raising organizations to see what they did to be more successful—to train their recruits and be more successful getting donations and contributions for their causes. But I never felt comfortable infiltrating cultic groups because I read stories about journalists who, under the guise of becoming a recruit so they could write their story, joined cults, but they haven’t come out yet.

CM: Why do you think that is?

RC: That is because of the power of the principles of influence. The cults are able to employ them in settings that they control completely. They don’t control in a physical way members' ability to leave, but they control [it] psychologically, by having complete control over the information and communication environments through which people exist in those groups. They are surrounded by others who believe the same thing. They are led by a charismatic figure, who pulls attention and focus in his direction and isolates members from all other sources of information as to what constitutes the truth. And before long, people are swept into a state of believing without being critical because everything around them tells them there is no need to be critical. Also, their ability to be critical is undermined by things such as information overload, poor diet, lack of sleep, and so on. All this makes it difficult for people to step back from the situation and do the hard work of counterarguing.

CM: Can you say something about your research?

RC: In our research we use what is called The Group Psychological Abuse (GPA) Scale [developed by Chambers, Langone, Dole, and Grice in 1994], in which we ask former cult members and individuals who were simply members of other groups—music groups, sport groups, study groups, and so on—to report the extent to which they experienced certain kinds of influence attempts to bring them into the group, and to retain them as members of the group. Each of the items we included on the scale was related to one or the other of the six principles of influence. And what we found is that, in every case for every one of the principles of influence, former cultic group members reported having more intensive and frequent exposure to these forms of tactics than any other group member.

CM 2016: “Group Psychological Abuse: Taxonomy and Severity of Its Components,” by Álvaro Rodríguez-Carballeira et al., was published in 2014 in The European Journal of Psychology Applied to Legal Context (Rodríguez-Carballeira et al., 2014). This taxonomy of the GPA strategies consists of six scales: (a) Isolation, (b) Control and manipulation of information, (c) Control over personal life, (d) Emotional abuse, (e) Indoctrination in an absolute and Manichean belief system, and (f) Imposition of a single and extraordinary authority (2014). The researchers found that these tactics and their ultimate aim, to subjugate the individual, had adequate content validity. The operationalization of the strategies as detailed in this research study contributes to our understanding of radicalization as a phenomenon, and of how to deradicalize victims of abuse. But it is also useful when we are studying the tactics in the …Art of Recruitment manual (Abu ‘Amr, 2010).

CM: In Norway, we recently had a type of pyramid group called the Five Percent Community, I think…, where lots of people lost their money. When the media writes about these victims, they call them greedy and naïve, [implying] that somehow because of their greed they deserve to lose their money…. Do you think greed is an issue here?

RC: I have infiltrated these types of pyramid organizations, and they were very clever in the way they described what being wealthy would allow. They said to us, “If you had all the money you wanted, what would that allow you to do?”

And do you know what these people said? “It would allow me to buy my mother a home that she had never had, and it would allow me to give my children the education that I am afraid that I can’t do now.” So their intention wasn’t just to be greedy.

The organization let people talk about how the goal of this organization would allow them to achieve purposes that transcended their own personal interest, which made it acceptable, and made it legitimate to do the things they were then asking of people—to work 15 and 16 hours a day, to get friends and relatives into it because they came to believe that… [these friends and relatives] could get wealthy too and achieve their goals.

So these organizations are never selling a product; they are never selling whatever it seems that they are selling. They can sell anything from motivational tapes and information, to hammers, to air cleaners—anything. They are always selling people dreams that will allow them to extract themselves from their worries and concerns, and …achieve their grander goals. So it is a very persuasive technique.

CM: How can we defend ourselves against undue influence?

RC: I think it is necessary to be knowledgeable of what these six fundamental dimensions are within us that cause us to decide…. These are things that guide our behavior. If we are aware of them, and we encounter a situation where, for example, we like someone more than we should under the circumstances—he or she has done something to cause us to feel … more positively than we should under those circumstances—[then we will] have a flag go up in our minds [to] take a step back from the situation and analyze the merits of what is being requested rather than the way it is being requested or who is requesting. For example, if we buy a new computer in a store, and we like the sales person, we have to remember that we are carrying the computer out of the store and not the salesperson. Separate the thing from the presentation of the thing.

CM 2016: In Influence… (1984/2009), Cialdini outlines a number of creative strategies for resisting compliance, obedience, and conformity, and each chapter ends with advice on how to resist manipulation. But for people living in isolation who are under constant cognitive overload, such as people living in cults or terrorist groups, the mindful and reflective state of mind necessary to resist these subtle social pressures is not likely to happen.

CM: Are you afraid that people would use the knowledge in your book for unethical purposes?

RC: This is an excellent question. The book was written initially for consumers to learn how to recognize and resist these principles when they are used on us in unwelcome and undue fashion. Here is the interesting thing: Not a single consumers’ group has ever called me since the publication of that book. But my phone hasn’t stopped ringing with requests from advertisers, marketers, attorneys, fundraisers, and political lobbyists who say, “Come and talk to us about how we can harness these principles.”

When the purpose changes from rejection of these principles to employing them, then ethical issues become very important. So when I do talk to business entities or other professional associations, the talk is on ethical influence. How can we become effective without exploiting? The same principles can be used. The key is to uncover them where they exist naturally in the situation, and to raise them to consciousness in the minds of our audience. So if we have true expertise and we are truly knowledgeable, then we are entitled to inform people of that; and they want to know that… If it is true that we are the fastest-growing [company] or have the largest-selling product, it is entirely acceptable for us to describe that. It is not acceptable to take an actor in medical clothing to suggest expertise he doesn’t have—that only smuggles the authority principle into a situation where it doesn’t naturally reside, and it steers people incorrectly.

CM: Who in the world really needs to understand these principles?

RC: I would like the politicians of our societies to think about honestly informing people when they want us to move in a certain direction. Because politicians represent so many of us, and their decisions affect so many of us, scrupulous honesty and consideration of this process would be best for all concerned.

CM 2016: Given the complexity of social influence and how predispositions interact with chance meetings, the issue of terrorist recruiters is frequently lost in polarized public debates. Public discussion often identifies either Islamic scripture, government policies, or the low social status of new recruits as the sole cause of all terrorism. Issues of identity have long been recognized as central to radicalization, but how that sense of identity is further attacked and changed during recruitment has not. It is my hope that understanding Cialdini’s principles of influence, and applying them as tools for demystifying indoctrination, will make the complex mechanisms and tactics involved less of a mystery. The human need to belong, to feel good about ourselves, and to make correct choices is not unique to Muslims; it is part of our common humanity.

References

Abu ‘Amr al-Qa'idi; Abu Mujahid & Abu Khalid (Trans.). (2010). A course in the art of recruiting. Retrieved from https://archive.org/stream/ACourseInTheArtOfRecruiting-RevisedJuly2010/A_Course_in_the_Art_of_Recruiting_-_Revised_July2010_djvu.txt

Borum, Randy. (2010). Understanding terrorist psychology. Mental Health Law & Policy Faculty Publications, Paper 576. Retrieved from http://scholarcommons.usf.edu/mhlp_facpub/576

Cialdini, R. (1984/1993/2001/2006/2009). Influence: Science and practice (5th ed.). Boston, MA: Pearson Education.

Cialdini, R. (2005, July). You don't have to be a fool to be fooled. Plenary session presentation, ICSA Annual Conference, Madrid, Spain.

Cialdini, R. (2010). You don’t have to be a dupe to be duped: Lessons from the Madoff affair. Inside Influence Report, Influence at Work. Retrieved from http://www.influenceatwork.com/wp-content/uploads/2012/02/Madoff_by_Cialdini.pdf

Higgins, Andrew. (2016, April 19). A close look at Brussels offers a more nuanced view of radicalization. The New York Times. Retrieved from http://www.nytimes.com/2016/04/20/world/europe/more-than-islam-origin-is-a-marker-for-terror-among-brussels-immigrants.html

Kenrick, D. T., Neuberg, S. L., & Cialdini, R. B. (2015). Social psychology: Goals in interaction (6th ed.). Boston, MA: Pearson Education.

Moestue, Cathrine. (2005). A talk with Dr. Cialdini in Madrid. Organisational Theory and Practice: Scandinavian Journal of Organisational Psychology, 16(2), 63–69.

Noah, Timothy. (2009, March 24). Jihad lite: Al-Qaida's dumbed-down recruitment manual. Slate. Retrieved from http://www.slate.com/articles/news_and_politics/chatterbox/2009/03/jihad_lite.html

Rodríguez-Carballeira, Álvaro, et al. (2015). Group psychological abuse: Taxonomy and severity of its components. The European Journal of Psychology Applied to Legal Context, 7(1), 31–39. Retrieved from http://ejpalc.elsevier.es/en/group-psychological-abuse-taxonomy-severity/articulo/S1889186114000171/#.V1NJ3Vc9bHg

Warius, A., & Fishman, B. (2009, February). A jihadist’s course in the art of recruitment. Combating Terrorism Center at West Point. Retrieved from https://www.ctc.usma.edu/posts/a-jihadist’s-course-in-the-art-of-recruitment

About the Author

Cathrine Moestue, Cand.Psychol., grew up in Oslo, Norway, in an upper-middle-class family with four siblings. While attending Folkuniversity in Stockholm (1984–85), she encountered teachers who claimed to have a program to “save starving children” and lured her to participate. The group, which drew on communist teachings, isolated her from her family and made her feel guilty for her privileged upbringing. After years of working hard to “save the world,” she became disillusioned and, after several attempts, in 1992 she successfully escaped this destructive group by running away. She worked in the advertising industry and managed a radio company before earning her degree in psychology at the University of Oslo, becoming a psychologist, and eventually seeking therapy to deal with her traumatic experience. She is a psychologist in private practice in Oslo and is currently working on her memoirs.