
The Influence of Persuasive Strategies Used by Cultic Groups in the Context of Forewarning



Dariusz Krok, Ph.D.

Department of Psychology

The Opole University, Poland

Abstract


One noticeable feature of cultic groups is their use of persuasive strategies intended to change individuals’ attitudes without their awareness and consent. The strategies exploit specific functions of attitudes, which makes them highly persuasive. The purpose of this article is to investigate the impact of persuasive strategies characteristic of cultic groups in the presence or absence of forewarning. By forewarning, we refer to the group’s true identity having been revealed. To examine the impact of persuasive strategies, an experiment was conducted in which 212 full-time and part-time students were randomly assigned to one of two conditions—either warned or not warned of the group’s identity—and to one of three strategies: existential, cultural-religious, and protective. The most negative effects occurred under the protective strategy, and the least negative effects under the existential strategy. Forewarning produced more negative cognitive processes and attitudes in students toward the message and speaker, and it decreased the speaker’s persuasive impact. The forewarning had the strongest effect under the existential and protective strategies, but it did not affect the students when the cultural-religious strategy was used.

Persuasive strategies are widely used in many areas of social life. Their main aim is to change people’s thoughts, emotions, and behaviors so that the individuals become susceptible to someone’s instructions and orders. Cultic groups apply persuasive strategies imbued with strong arguments so as to draw individuals’ attention and persuade them to join the groups. These actions make use of sophisticated means—e.g., hiding the true identity or name of the group—directed at changing someone’s attitudes without the person’s conscious control and consent.
Persuasive Influence in Cultic Groups

Proponents of persuasion research acknowledge that people change their attitudes while they are interacting with their social environment. Therefore, understanding the complexity of persuasive dynamics is useful for both basic researchers who try to broaden our knowledge about social information processing and for practitioners in many areas of social life (Bohner & Wanke, 2004). In psychology, persuasive communication is understood as a “message intended to change an attitude and related behaviors of an audience” (Hogg & Vaughan, 2005, p. 200). Many examples of persuasive influence are clearly visible in social interactions in which individuals are exposed to messages aimed at changing their attitudes. Persuasion has been used in developing appropriate strategies for health-related interventions (Briñol & Petty, 2006), producing messages that effectively communicate information associated with products and services (Rucker & Petty, 2006), making decisions in the political context (Bizer & Petty, 2005), and building a wide variety of advertising strategies aimed at persuading people to buy certain products (Kardes, 2005). The modern study of persuasion, which is based on cognitive psychology, analyzes variables and processes responsible for changing attitudes and related behaviors.

In general, persuasion can have two different aims: positive and negative. The first intent refers to interventions in such areas of human life as health, education, marketing, social advertisements, and so on. Persuasion consists of presenting certain kinds of information in order to change people’s attitudes. This persuasion tries to obey ethical rules and respects human values. We find the second intent in psychological manipulation, brainwashing, and propaganda. Persuasion in these contexts can violate human freedom and views by presenting information that is not always true, or by using means that distort the recipients’ thinking.

Persuasive strategies can be presented in the form of manipulation as a means of social influence directed at changing individuals’ thoughts, emotions, and behaviors without their awareness. In many contemporary social-life situations, people are exposed to ambiguous and unclear messages that senders create to serve their own interests. People are not immune to persuasive influence. Although some of us may think that we are less likely than others to be influenced, in fact we are all susceptible to persuasive attempts to a similar degree. The reason for this is that our cognitive and emotional processes operate on two levels: conscious and unconscious. Cialdini and Sagarin (2005) point out that many influence strategies capitalize on our tendency to respond automatically to certain cues in a persuasive message or influence situation. Although our attitudes can be changed when we are processing in a deliberately thoughtful manner, we are particularly vulnerable to strategies that are based on simple cues and shortcuts that elicit automatic responses. The latter pose a danger because they operate beyond our conscious thinking.

Cultic groups can use both kinds of persuasion while they are recruiting new members. Cultic groups are organizations characterized by (1) manipulation used to recruit new members; (2) strong relationships between the leader and members; and (3) negative psychological, social, and physical consequences for their members (Bukalski, 2006). This description requires additional clarification. The three characteristics do not all have to be present for a group to be considered cultic. Even when the first two characteristics are present, we cannot assume that negative psychological, social, and physical consequences will inevitably follow. In addition, the negative effects could be true for some of the members but not necessarily for all of them. It appears that cultic groups may have negative psychological, social, and physical consequences for some of their members (Aronoff, Lynn, & Malinoski, 2000; Langone, 2001). The extent of potential harm would depend on various individual and social factors that influence cognitive, emotional, and behavioral processes (e.g., people’s personality traits, the strength of the influence, the length of time individuals spend in the group).

Cultic groups utilize persuasive manipulation, which is based on a wide range of processes and tactics that have been examined in the field of psychology. One of the most common tactics is ingratiation, which is an attempt by an individual to get someone to like him or her in order to obtain compliance with his/her request. A meta-analytic review of 69 ingratiation studies conducted by Gordon (1996) showed that ingratiation is positively related to perceptions of increased likeability and makes the influence more effective. Many areas of social life use ingratiation—e.g., marketing, sales, negotiations. In a cross-sectional organizational field study, Blickle (2003) investigated the effects of ingratiation, pressure, and rational persuasion on performance appraisal, compliance gaining, and reactance. The results confirmed the hypothesis that the more a person uses ingratiation and the longer an assessor has known the person, the more positively the assessor will evaluate the person’s compliance-gaining success. Ingratiation is often found as a means of influence in cultic groups (Nowakowski, 1999). Group members try to influence others by first agreeing with them and getting the individuals to like them. Next, the members make various requests of those individuals that will benefit the cultic group.

Manipulation often exploits basic tendencies of human behavior that play an important role in generating positive responses to a request. Cialdini (1993, 2004) lists six such principles used in persuasion:
Reciprocation. The reciprocity principle is based on the social norm that “we should treat others the way they treat us” (Hogg & Vaughan, 2005, p. 217). This tactic refers to an attempt to gain compliance by first doing someone a favor. But reciprocity includes more than favors and donations. It also applies to concessions that people make to one another; for example, if someone rejects my large request, I then make a concession to the person by retreating to a smaller request. Members of cultic groups often use this tactic while they are collecting money or trying to convince people to attend their meetings. In the beginning, they may do someone a favor or help with some work. Later, the members will ask the person to do something for the benefit of the group.
Consistency. This principle refers to people’s desire to be perceived as consistent in their words and behaviors. A person can convince someone to act in a certain way if the persuader is able to show that the person being asked to conform perceives the desired behavior as logical. Cultic groups try to present their teaching as being as consistent and reasonable as possible; for example, by drawing on scientific evidence.
Social validation. This principle aims to convince a person to agree to a demand by explaining that many other people have adopted the same behavior in the past. People tend to observe others and acknowledge their behavior as appropriate when they encounter an unknown situation (Kropveld & Pelland, 2006). If many individuals have decided in favor of a particular idea, other people are more likely to follow because they perceive the idea to be more correct and valid. One example of this principle’s use in the context of cultic groups is when the group makes the following statement: “A few thousand people have joined our group since last year.”
Liking. The liking principle is based on the notion that individuals are willing to accept proposals made by a person they know and respect. People prefer to say “yes” to those they like. Physical attractiveness, similarity, compliments, and cooperation are the tools that expedite liking and make people more vulnerable to requests. During the first contact, cultic groups often express a friendly and positive approach toward potential members, which is aimed at encouraging them to join the groups.
Authority. Individuals are more likely to conform to demands that come from someone who is a figure of authority. This principle derives from norms of our upbringing and education; since childhood, we are taught to respect the authority of teachers and parents, and to agree to their demands and opinions. Use of this principle is evident in cultic groups when prominent and famous people are invited to talk about the groups’ ideas.
Scarcity. Opportunities become more desirable to us as they become less available. The scarcity principle points out that people place greater value on rare opportunities because they are perceived as unique and exclusive. Cultic groups make use of this principle when they present their teachings as unique and different from other religions.

All the above principles can be used to affect individuals’ decisions and obtain their consent. The main danger of these mechanisms is that they influence people beyond the conscious level and beyond objective thinking, and so deprive those individuals of the possibility of making fully independent decisions. The principles cultic groups use become a means of manipulation that leads to behaviors that sometimes are difficult to consciously control (Abgrall, 2005). To understand the manipulation through which individuals are recruited by cultic groups, it is important to look at the persuasive strategies they use.
The Role of Persuasive Strategies and Forewarning in Cultic Groups

In manipulation, cultic groups often use religious and moral information that is supposed to make a strong impact on recruits’ thinking, emotions, and behavior. Theoretical analyses and results of empirical research provide evidence that persuasion in religious and moral communication plays an important role in changing and forming attitudes, especially those that are relevant to individuals’ religious life (Dotson & Hyatt, 2000; Krok, 2005). Cultic groups want to control their members, so they focus on presenting information in a persuasive and convincing way.

Persuasive strategies used by cultic groups can include providing false information, withholding or distorting relevant information, inducing emotions, and controlling people’s thinking. The Elaboration Likelihood Model (ELM) provides explanations for how people can be manipulated by processes that rely on the peripheral route of persuasion. This route involves minimal cognitive elaboration of a message, which causes people not to carefully examine arguments included in the message and to make conclusions in a superficial way (Petty, Cacioppo, Strathman, & Priester, 2005; Petty, Rucker, Bizer, & Cacioppo, 2004). By using peripheral mechanisms, cultic groups can easily take advantage of people, persuading them to follow rules and make decisions. The lack of objective thinking creates dangerous situations in which people are psychologically abused, brainwashed, and exposed to negative emotional states. Cultic groups expose people to persuasive messages that are designed to alter their attitudes with the assumption that changing an attitude in the desired direction will result in a behavioral change in line with the new attitude.

Further analyses of persuasive strategies reveal various types of influential approaches cultic groups use. Abgrall (2005) states that cultic groups try to provide answers for four types of universal questions: (1) existential questions—such questions are related to experiences of frustration and loneliness, general life plans, and a search for life’s meaning; (2) cultural and religious questions—the main concern of such questions is about issues of metaphysics, the beginning of the universe, and teachings of great religions; (3) protective questions—these questions aim to find a solution to the problem of evil in the contemporary world in order to ensure individuals’ safety; (4) ideological questions—the relevance of such questions is connected to the entire vision of the world and human life. Each type of question can include relevant strategies through which cultic groups might influence individuals.

We can explain why various persuasive strategies make an impact on people’s behaviors on the basis of the functions that attitudes serve. Attitudes serve four basic functions: (1) utilitarian, (2) value-expressive, (3) social adjustive, and (4) ego-defensive (Eagly & Chaiken, 1998). Attitudes may serve these functions to enable people to evaluate and appraise stimuli in their environment. These functions also provide an explanation for why people are susceptible to various kinds of social influence.

The utilitarian function considers the ways that attitudes help individuals to maximize their rewards; that is, attitudes have some utility for individuals. In terms of usefulness, attitudes can represent all kinds of outcomes, including the gains of self-reward (e.g., pride, self-esteem), and the losses of self-punishment (e.g., guilt, anxiety). In the context of a small social group (e.g., a cultic group), liking a particular way of life makes the member accepted within this group; and so holding and expressing this attitude will have a rewarding value for him/her.

The value-expressive function emphasizes that attitudes provide a means for expressing personal values and other core aspects of self-concept. Having certain attitudes is inherently gratifying because they are a source of satisfaction to individuals and affirm their self-concepts (Erwin, 2001). A person could be motivated to hold attitudinal views that accurately reflect ideologies and norms of a cultic group because those views enable him/her to express important personal values.

The social adjustive function reflects the ways in which attitudes mediate a person’s relations with others. Holding attitudes that are pleasing to others can facilitate social relationships with these people and help to maintain those relationships, whereas expressing unacceptable attitudes can break such relationships. Cultic groups may take control over an individual who needs to maintain personal relationships with the group’s members. The control is not necessarily total because cult members are capable of making reasonable decisions or rational choices; but the decisions and choices are based on a modified structure of preferences and values that conforms to the group’s ideology (Lalich, 2004; Zablocki, 1997).

The ego-defensive function underlines the notion that attitudes allow the self to be defended from potentially threatening events. Such attitudes protect us from harsh external realities, buffer our ego, and provide rewards. Prejudice is often regarded as an example of attitudes that serve this function (Amodio & Devine, 2005). People who are prejudiced might bolster their own egos by feeling superior to members of outgroups and therefore easily accept negative views about the members. Such people might be easily exploited by cultic groups that will isolate them from “external threats.”

The functional basis of people’s attitudes enables us to understand how attitudes are changed (Crano & Prislin, 2006). Functional theory assumes that persuasive appeals that address or match the function an attitude serves will be more influential than appeals that are irrelevant to this function. For example, we should offer value-relevant arguments to the person whose attitude serves a value-expressive function (Petty, Wheeler, & Bizer, 2000). Results of many studies have shown that persuasive appeals are more effective when they present arguments matching the function underlying an attitude than when the appeals present arguments that do not match. Functional match of a message is reported to directly influence the person’s perception of the message’s validity, which in turn affects postmessage attitudes (Lavine & Snyder, 1996; Marsh & Julka, 2000). We can apply the principle of the functional matching effects to persuasive strategies cultic groups use to explain why people are influenced in different ways, depending on the sort of information presented to them.

Another important feature of persuasive manipulation that members of some cultic groups use is the extent to which they reveal their identity and intentions at the beginning of the first meeting. When they approach people, the members usually try to involve them in discussion, asking questions and talking about their philosophy, ideas, and teaching. In some cases, the members introduce themselves saying, “We belong to the group called X”; in other cases, the members do not reveal their affiliation and true name, so as not to discourage people from carrying on a further conversation. The second situation is characterized by deliberately hiding important information about the real identity of the group. The member mentions the topic and message position, but not details of his/her group. Of course, this manipulation aims at gaining trust so that a recipient might be drawn deeper into the group’s ideas. The approach is similar to “a camouflage method,” because it is based on hiding one’s identity in order to yield a future profit (Bukalski, 2006). We can assume that if a person approached by a member of a cultic group with a negative public image were told straight away that he/she was dealing with that group, the person’s reactions would be negative. The reactions would be less negative if the person heard only about appealing ideas and attractive promises related to the group.

In terms of persuasion, revealing the identity can be compared to the concept of forewarning, which occurs when an audience is informed about some aspects of the person or message they are about to encounter. The forewarning might take one of the following two forms (Hargie & Dickson, 2004):
A persuasive intent statement. In this instance, the forewarning comes from the speaker, who tells the target that a persuasion attempt is about to be made (e.g., “I’m going to present information about my group X, which encourages people to believe in such and such ideas”).
A topic and position statement. This statement informs the target about the issue that is going to be presented and discloses the speaker’s view on this issue (e.g., “Next week I will talk about economic aspects of redevelopment of our company, which I fully support”).

Psychologists have analyzed forewarnings by considering them in relation to other factors that activate resistance to persuasion (Kiesler & Kiesler, 1964). Research done in the field of persuasion has shown that forewarning can effectively confer resistance to persuasion compared to a lack of forewarning. Cialdini and Petty (1981) applied the above-mentioned two forms of forewarnings and concluded that forewarning that communicates the persuasive intent typically instigates resistance; whereas forewarning that conveys the topic and position statement can generate resistance or susceptibility. Forewarning makes a difference mainly for people for whom the topic presented is of high relevance. Petty and Cacioppo (1979) confirmed this finding. In their experiment, students were either forewarned or were not warned of the persuasive intent of a speaker featured in a taped radio editorial. When the issue was of low relevance, the warning did not have any impact on the listeners’ attitudes. However, when the issue was of high relevance, the warning tended to reduce persuasion, despite the fact that the message contained strong arguments.

The study by Zuwerink-Jacks and Devine (2000) yielded slightly different results. This study examined the effects of forewarning in relation to people’s attitude importance. High-importance individuals, i.e., people who attach great personal importance to their attitudes, were more resistant to a counterattitudinal message than their low-importance counterparts, i.e., people who evaluate their attitudes as less important. Extending these findings, the study showed that high-importance individuals were resistant regardless of the warning (warned vs. unwarned) and delay (0 min. vs. 2 min.) manipulations. Low-importance individuals became more motivated and willing to defend their attitudes when warned and given time before hearing the message. The process analysis suggests that both cognitive and affective responses mediated the effects of importance, warning, and delay on final attitudes. The difference between these results and those Petty and Cacioppo (1979) obtained suggests that personal relevance involves different processes than attitude importance. Thus, the results of studies related to forewarning are diverse.

In his meta-analysis, Benoit (1998) examined the effects of forewarnings on recipients’ attitudes. His final conclusion stated that warnings of pending messages or information generate resistance. He reviewed 12 studies in which individuals who were warned prior to the message were less persuaded than individuals who received the message without warning. Another of Benoit’s findings was the uniformity of resistance that occurred across a variety of studies. Wood and Quinn (2003) provided a comprehensive discussion of forewarning effects. They evaluated the effects of forewarning on attitudes during presentations of appeals to influence. The final results showed that warnings appear to threaten people’s attitudes or their self-images, and the impact that warnings make depends upon which aspect of the self was threatened. The impact is conditioned by the extent of involvement. When the appeal was involving and concerned immediate outcomes, or when the appeal was actually delivered, recipients tended to focus on the potential threat to their attitudes and resisted the appeal. Alternatively, warnings of appeals on less involving topics produced agreement before the appeal was delivered. This may have occurred because these warnings made people aware of the self-image threat of being gullible, and preemptive agreement decreased this threat.

In the context of cultic groups, it would be interesting to ask what happens when people are approached by a group’s members with a persuasive appeal that does or does not contain a forewarning (e.g., an explanation of the group’s identity). Given that the impact depends on messages and forewarning, the main goal of this research was to examine the influence that various persuasive strategies, used while either revealing or hiding the group’s true identity, have on individuals’ attitudes.
Method
Material Pretesting

In a pretesting session, three persuasive messages were created. The messages were based on the following strategies used by cultic groups: existential, cultural-religious, and protective. The main content of the messages was drawn from descriptions given by Abgrall (2005).[1]

The existential strategy presents arguments that refer to a search for meaning in life and daily problems and difficulties. The speaker describes one’s existence in terms of psychological needs and tries to encourage the listeners to analyze their own lives in order to see the positive and negative sides. He points out that his group is able to answer all these questions and give some kind of help. The group can also provide support for all kinds of existential problems present in one’s life and find the right solutions for how to live in a happy and satisfying way.

The cultural-religious strategy reflects universal questions about the sense of the universe, the beginning of life on earth, and the meanings of great religious books. This strategy is based on the assumption that the world is filled with mysteries and unanswered questions that surprise human beings and make them think. People have always striven to find answers to philosophical questions regarding the world, and to understand religious matters. The message assures listeners that the group will help them discover the mysteries and find all the answers connected to these universal questions.

The protective strategy consists of information that is to secure a person against current dangers and provide peacefulness, safety, and happiness. The speaker observes that contemporary social situations are characterized by many problems and potential threats. Our society does not provide enough protection, which results in feelings of insecurity and anxiety among people. The group will provide all the necessary means needed to defend the person from current dangers and will secure a safe environment. The three messages were entirely distinct in content.

Thirty-four academic staff and students specializing in the field of religion and cultic groups assessed the texts. They were given descriptions based on Abgrall’s text (2005), and then they listened to the three messages. Having listened to each message, they marked on a 9-point scale (entirely compatible to entirely incompatible) how much the strategy tallied with its description. The duration of the three messages was comparable (ca. 4 min. 15 sec.). Then, the messages were recorded on a CD and used in the experiment. The person who was speaking on the CD was instructed to convey the message in a persuasive and compelling way, an approach that can also be found in cultic groups.
Research Participants and Design

A total of 212 full-time and part-time students were recruited to participate in the experiment in partial fulfillment of a class requirement. Participants were randomly assigned to one of two conditions—warned or not warned of the group’s identity, and to three kinds of strategies—existential, cultural-religious, and protective. Each group listened to one message played on a CD player.
Procedure

All participants were told that the study aimed at measuring different aspects of people’s attitudes and opinions about existential, religious, and moral issues. Before being asked to report their own opinions, the participants were informed that they would be listening to a message related to these issues. Following these introductory remarks, participants were or were not informed of the identity of the speaker. Those persons in the warned group were told that the message was recorded during a cultic group meeting and that the speaker was a group member. Participants in the unwarned group were informed that they would listen to a speech about various existential, religious, and moral issues.

Afterward, participants from both groups were asked to write in a series of “thought-listing boxes” any thoughts (favorable, unfavorable, or neutral to the message) that they had while the message was being played. This is a “thought-listing” procedure widely used in persuasion research (Petty & Cacioppo, 1986; Petty et al., 2004). Next, participants completed two attitude measures. The first attitude measure toward the message consisted of ten semantic differential scales (e.g., favorable to unfavorable, positive to negative, interesting to boring, logical to chaotic) (Cronbach’s α = .88). The second measure assessed the speaker’s overall performance and impressions he made on recipients using a set of ten semantic differential scales (e.g., competent to incompetent, intelligent to thoughtless, trustworthy to untrustworthy) (Cronbach’s α = .86). Responses were averaged such that higher numbers indicate more favorable attitudes toward the message and speaker.
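The reliability coefficients reported above (Cronbach’s α = .88 and .86) index the internal consistency of the ten-item semantic-differential scales. As a sketch of how such a coefficient is computed, assuming illustrative ratings rather than the study’s actual data:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a set of rating-scale items.

    item_scores[i][j] is participant j's rating on item i.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    where k is the number of items. Population variances are used
    throughout for consistency.
    """
    k = len(item_scores)
    n_participants = len(item_scores[0])
    # Each participant's total score across all items
    totals = [sum(item_scores[i][j] for i in range(k))
              for j in range(n_participants)]
    sum_item_var = sum(pvariance(scores) for scores in item_scores)
    return k / (k - 1) * (1 - sum_item_var / pvariance(totals))

# Hypothetical example: two items rated by three participants
print(round(cronbach_alpha([[1, 2, 3], [1, 3, 2]]), 3))
```

When every item produces identical ratings (perfect consistency), the coefficient reaches 1.0; uncorrelated items pull it toward zero.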
Results
Coding Thought Listings

For the thought-listing data, two independent judges categorized each thought as favorable, unfavorable, or neutral toward the content of the message. The judges agreed on 83% of the thoughts listed; a third independent judge resolved disagreements. Based on this coding, a thoughts index was formed for each participant by subtracting negative thoughts from positive ones and dividing by their sum: (PT-NT)/(PT+NT). Thus, the index reflected the overall favorability of each participant’s thoughts toward the message, ranging from -1 (all unfavorable) to +1 (all favorable).
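The index formula above can be sketched in a few lines (the zero-thought convention is an assumption for illustration; neutral thoughts are simply excluded, as in the coding scheme):

```python
def thought_index(positive: int, negative: int) -> float:
    """Valence index of listed thoughts: (PT - NT) / (PT + NT).

    Ranges from -1.0 (all valenced thoughts unfavorable) to +1.0
    (all favorable). Neutral thoughts do not enter the computation.
    """
    total = positive + negative
    if total == 0:
        return 0.0  # assumption: no valenced thoughts -> treat as neutral
    return (positive - negative) / total

# Hypothetical participant: 2 favorable and 6 unfavorable thoughts
print(thought_index(2, 6))  # -0.5
```

A participant whose listed thoughts are mostly unfavorable thus receives a negative index, which is why the group means reported below are all below zero.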
Thought Analyses

To examine the impact that strategies and forewarning had on recipients’ reactions, we conducted a Strategy x Forewarning analysis of variance (ANOVA) on the thought index (3 x 2 ANOVA). The Strategy x Forewarning interaction was not significant for this measure, F(2,206) = .04, p = .66. Nevertheless, there were two main effects. A significant main effect of Strategy indicated that the protective strategy had the most negative impact on recipients’ thoughts (M = -.62), compared to the cultural-religious (M = -.36) and existential (M = -.30) strategies, F(2,209) = 3.24, p<.05. Tukey’s post hoc test revealed a significant difference between the protective and existential strategies (p<.05). Forewarning also had a significant main effect on message thoughts. Individuals who were warned generated more unfavorable thoughts (M = -.52) than those unwarned (M = -.28), F(1,210) = 4.74, p<.05. In all cases, the thought indexes were negative, which reflects recipients’ negative opinions about the presented messages.
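The F ratios reported here compare between-group to within-group variability. As a sketch of the underlying computation, a one-way F ratio on illustrative data (the study’s actual analyses were 3 x 2 factorial, and the numbers below are hypothetical):

```python
from statistics import fmean

def one_way_f(groups):
    """One-way ANOVA F ratio from a list of score lists.

    F = MS_between / MS_within, where MS = sum of squares / df.
    Returns (F, df_between, df_within).
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = fmean([x for g in groups for x in g])
    # Between-group sum of squares: group sizes weight the squared
    # deviations of group means from the grand mean
    ss_between = sum(len(g) * (fmean(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: deviations of scores from their own
    # group mean
    ss_within = sum((x - fmean(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within), \
        df_between, df_within

# Hypothetical warned vs. unwarned thought indexes
f, df_b, df_w = one_way_f([[1, 2, 3], [4, 5, 6]])
print(f, df_b, df_w)  # 13.5 1 4
```

A large F with small degrees of freedom in the numerator, as in the forewarning contrasts below, indicates that group means differ by more than within-group noise would predict.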

The critical question with regard to message thoughts was whether there are differences between the impact of particular strategies under warned and unwarned conditions (Figure 1). To answer this question, we conducted further ANOVA analyses.



Figure 1. Post-message thoughts index as a function of Persuasive Strategies and Forewarning

Regarding the existential strategy, subjects in the warned condition exhibited more negative thoughts (M = -.42) compared with those in the unwarned condition (M = -.06), F(1,69) = 13.82, p<.01. In contrast, in the cultural-religious strategy, there was no significant difference between warned (M = -.30) and unwarned (M = -.43) participants, F(1,68) = 1.12, p = .27. However, the protective strategy generated more unfavorable thoughts in warned subjects (M = -.79) than in those who were unwarned (M = -.43), F(1,69) = 12.47, p<.01. The results show that there are differences in persuasive influence among the strategies used under warned and unwarned conditions.
Attitudes

Attitude toward the message. First, a Strategy x Forewarning (3 x 2) analysis of variance was conducted on message attitudes. Consistent with the thought-index results, the Strategy x Forewarning interaction was not significant, F(2,206) = .28, p = .75. A significant main effect of Strategy revealed that participants who listened to the protective strategy had more negative attitudes (M = 3.1) than those in the cultural-religious (M = 3.79) and existential (M = 3.91) strategy groups, F(2,209) = 4.06, p<.05. These results correspond with the previous analyses of message thoughts and confirm the consistency of the two measures. Tukey’s post hoc test showed two significant differences: between the protective and existential strategies (p<.01) and between the protective and cultural-religious strategies (p<.05). There was no difference between the existential and cultural-religious strategies. The second significant effect was for Forewarning, with warned subjects showing more negative attitudes (M = 3.21) than unwarned ones (M = 4.01), F(1,210) = 11.73, p<.01.

The strategies’ influence on subjects’ attitudes depended on which strategy was employed and on the warned vs. unwarned conditions (Figure 2). We therefore examined the impact of each strategy separately.

Participants who listened to the existential strategy and were warned showed more negative attitudes (M = 3.49) than those who were unwarned (M = 4.34), F(1,69) = 11.34, p<.01. In the cultural-religious strategy group, warned participants presented more negative attitudes (M = 3.47) than unwarned ones (M = 4.09), F(1,68) = 8.89, p<.05. The protective strategy had a similar effect on subjects’ attitudes: in the warned condition, participants had more negative attitudes (M = 2.64) than in the unwarned condition (M = 3.6), F(1,69) = 13.13, p<.01.

Figure 2. Post-message attitudes toward the message as a function of Persuasive Strategies and Forewarning

Attitude toward the speaker. A Strategy x Forewarning analysis of variance revealed a significant interaction between the two factors, F(2,206) = 3.13, p<.05. Strategy had a significant main effect on attitude toward the speaker, leading to more negative attitudes under the protective strategy (M = 3.31) than under the existential (M = 3.70) and cultural-religious (M = 3.79) strategies, F(2,209) = 3.16, p<.05. Tukey’s post hoc test revealed a significant difference between the effect of the protective and existential strategies (p<.05). Forewarning also had a significant main effect on recipients’ attitudes, with warned participants showing more negative attitudes (M = 3.31) than unwarned participants (M = 3.98), F(1,210) = 14.71, p<.001.

Next, we examined the strategies’ influence on attitudes toward the speaker under warned vs. unwarned conditions. The graphic illustration of the influence is presented in Figure 3.

Results revealed that the existential strategy led to more negative attitudes among warned subjects (M = 3.16) than among unwarned ones (M = 4.25), F(1,69) = 7.46, p<.05. Similarly, with the protective strategy, warned subjects showed more negative attitudes (M = 2.87) than unwarned subjects did (M = 3.75), F(1,68) = 5.19, p<.05. There was no significant difference between warned (M = 3.73) and unwarned subjects (M = 3.83) in the cultural-religious strategy group, F(1,68) = 0.87, p = .60.



Figure 3. Post-message attitudes toward the speaker as a function of Persuasive Strategies and Forewarning
Discussion and Conclusions

The results of the current study suggest that persuasive strategies make a strong impact on recipients’ thoughts and attitudes. The impact is mediated by the process of revealing or hiding the group’s identity, which can be considered in terms of forewarning. To attract new members, cultic groups apply various strategies that lead to different effects on recipients’ attitudes. Because of many similarities between the applied strategies and persuasion, the range of the effects can be measured using methods drawn from studies on persuasive communication.

The three strategies used in the experiment influenced the recipients’ cognitive processes and attitudes in different ways. We found the most negative effects on participants when the protective strategy was used; this strategy led to the highest levels of unfavorable thoughts and the most negative attitudes toward the message and speaker. The least negative effects occurred when we presented the existential strategy, which led to participants’ weakest negative impressions of the group and its speaker. From a psychological point of view, this strategy is most influential when applied by cultic groups because individuals are not discouraged from considering the groups’ ideas and teachings. The question then arises: Why? The existential strategy contains arguments that reflect a universal search for the meaning of life and that help people resolve daily problems and difficulties. In that sense, the strategy meets psychological needs, which makes people exposed to this kind of information susceptible to cultic groups’ persuasion. The existential strategy appeals to attitudes that serve the value-expressive function, thus providing a means for expressing personal values and other core aspects of the self-concept. Drawing on the message-matching approach, Petty, Wheeler, and Bizer (2000) suggest that the most important of the underlying factors in persuasion is the one related to the self. Messages that involve issues related to the self (e.g., the existential strategy) are therefore likely to influence recipients’ attitudes.

Another explanation of why the existential strategy led to the weakest negative effects is connected to value-relevant involvement. When an attitude is strongly linked to the self and serves a value-expressive function, the individual is more resistant to a counterattitudinal message and more receptive to a proattitudinal message (Levin, Nichols, & Johnson, 2000). To protect their self, people tend to avoid inconsistency in their attitudes and belief systems by selectively ignoring or minimizing information that contradicts their beliefs. The existential message used in the current experiment was counterattitudinal in character, which led the recipients to resist it more strongly.

Forewarning, in the form of revealing the group’s identity, led the individuals to have more negative cognitive processes and attitudes toward the message and speaker. Thus, the forewarning decreased the persuasive impact of the speaker. This result is consistent with other findings in persuasion research (Benoit, 1998; Zuwerink-Jacks & Devine, 2000). When message recipients are aware of the speaker’s identity in advance, they realize that the speaker is deliberately intending to persuade them. As a consequence, they become more cautious with the speaker’s message and more critical toward its content. Alternatively, revealing the group’s identity may activate a preexisting negative attitude toward the group rather than any real knowledge of how the group functions or of its potential for harm. Both interpretations explain why members of cultic groups, at the beginning of their presentation, tend to hide their true identity and membership so as not to discourage people from listening to the message. Hiding their true identity bolsters their credibility and strengthens their persuasive influence. Recipients are likely to evaluate the message in more positive terms if they do not know that it reflects the opinions of a cultic group. Hiding one’s membership is a clear example of manipulation aimed at deceiving potential followers.

Perhaps the most interesting finding of the current experiment is that forewarning has the strongest effect in the case of the existential and protective strategies, but forewarning does not have any impact on cognitive responses and attitudes when the cultural-religious strategy is used. The interaction between persuasive strategies and forewarning in attitudes toward the speaker shows that the impact of forewarning on the recipients’ impressions depends on the specific strategy. When the speaker is talking about issues related to cultural and religious topics, his or her identity does not matter much; what is most important for the listeners is the message. The situation changes when the speaker opens the presentation with existential issues such as the meaning of life, human needs, or solutions to current dangers and fears. The speaker’s identity then plays an important role and influences the recipients’ attitudes. The recipients become motivated to defend their position and exert more effort in scrutinizing the message precisely to decide whether it is worth accepting. This may reflect objective processing of the message, which occurs when people are motivated by their personal interest in the topic (Petty et al., 2005). They carefully appraise the extent to which the message provides information that is important or central to their needs.

This finding can have practical implications for members of cultic groups during their first contact with people they try to persuade. If the members are asked about their group identity, which can have negative connotations, they will try to reduce this negative impression by applying the cultural-religious strategy and talking about general issues such as universal aspects of the world, the beginning of life on the earth, great religious books, and mysteries of the world. At the same time, the members will avoid answering any questions regarding daily problems or potential dangers (the existential and protective strategies), because they realize that doing so can lead to negative associations with their group. They may use these strategies later, once they have gained the people’s trust. This technique can be considered an example of manipulation and can have far-reaching consequences. Langone (2001) notes that psychological manipulation can pose a danger and lead to changes in people’s thinking and behavior. Awareness of these processes can help people approached by members of cultic groups to maintain their independent thinking and minimize the risk of being deceived.

There are several limitations to the results we obtained in our study. Because of the study’s experimental character, it is difficult to assess to what extent the responses and judgments of groups of students in a set of laboratory situations can be generalized to the experience of potential cult recruits. This is a common problem in studies of interpersonal influence (Cialdini & Sagarin, 2005). Nevertheless, the results seem to reflect general cognitive and emotional processes that may occur during persuasive influence in the context of cultic groups. Although people’s responses and opinions can be modified depending on a specific situation, the general pattern is likely to remain relatively stable. When exposed to various persuasive strategies, individuals will be more influenced by the strategy that matches their existential problems.

Another question is to what extent the effects found are due to the conditions included in the design rather than to other factors. The forewarning condition may bring to individuals’ minds their implicit knowledge about cults and cults’ potentially manipulative influence. In that case, previous experience with cultic groups and information received about cults may cause the individuals to defend their initial positions. Further studies could assess people’s opinions and attitudes about cultic groups before the persuasive messages are presented. This approach would shed more light on the relationship between initial attitudes and cognitive responses. In addition to relying on their content, persuasive strategies can also use other psychological mechanisms that lead people to change their attitudes. Potential examples of such mechanisms include altering one’s thinking through distraction or inducing specific emotions that can change the nature of cognitive processing.

The findings of our experiment call for further examination of the psychological mechanisms that underlie the effects of the persuasive strategies cultic groups use. Some messages cultic groups present warrant further investigation because their content is ambiguous, or because it is unclear whether certain techniques a group uses have a harmful impact. Potential areas of interest encompass such issues as individual susceptibility to cultic groups’ influence, persuasive processes within strategies, and the personality traits of people who decide to join a cultic group. Such investigation will benefit both researchers and practitioners. The former will gain a better understanding of the factors involved in persuasive influence and the mechanisms responsible for attitude change. The latter will receive practical tools that enable them to work with people who have been affected by cultic groups’ manipulation. Given the presence of cultic groups in contemporary societies, this endeavour appears worthwhile.
References

Abgrall, J. M. (2005). Sekty. Manipulacja psychologiczna [Sects. Psychological manipulation]. Gdańsk: Gdańskie Wydawnictwo Psychologiczne.

Amodio, D. M., & Devine, P. G. (2005). Changing prejudice: The effects of persuasion on implicit and explicit forms of race bias. In T. C. Brock & M. C. Green (Eds.), Persuasion: Psychological insights and perspectives (pp. 249–280). Thousand Oaks, CA: Sage.

Aronoff, J., Lynn, S. J., & Malinoski, P. (2000). Are cultic environments psychologically harmful? Clinical Psychology Review, 20, 1, 91–111.

Baron, R. A., & Byrne, D. (2004). Social psychology. Boston: Pearson Education.

Benoit, W. L. (1998). Forewarning and persuasion. In M. Allen & R. Priess (Eds.), Persuasion: Advances through meta-analysis (pp. 159–184). Cresskill, NJ: Hampton Press.

Bizer, G. Y., & Petty, R. E. (2005). How we conceptualize our attitudes matters: The effects of valence framing on the resistance of political attitudes. Political Psychology, 26, 553–568.

Blickle, G. (2003). Some outcomes of pressure, ingratiation, and rational persuasion used with peers in the workplace. Journal of Applied Social Psychology, 33, 3, 648–666.

Bohner, G., & Wanke, M. (2004). Attitudes and attitude change. Hove: Psychology Press.

Briñol, P., & Petty, R. E. (2006). Fundamental processes leading to attitude change: Implications for cancer prevention communications. Journal of Communication, 56, 81–104.

Bukalski, S. (2006). Podatność młodzieży na oddziaływania grup kultowych [Vulnerability of the youth to influence of cult groups]. Szczecin: Wydawnictwo Naukowe Uniwersytetu Szczecińskiego.

Cialdini, R. B. (1993). Influence: Science and practice. New York: Harper Collins.

Cialdini, R. B. (2004). The science of persuasion. Scientific American Special Edition, 14, 1, 70–77.

Cialdini, R. B., & Petty, R. E. (1981). Anticipatory opinion effects. In R. E. Petty, T. M. Ostrom, & T. C. Brock (Eds.), Cognitive responses in persuasion (pp. 217–235). Hillsdale, NJ: Erlbaum.

Cialdini, R. B., & Sagarin, B. J. (2005). Principles of interpersonal influence. In T. C. Brock & M. C. Green (Eds.), Persuasion: Psychological insights and perspectives (pp. 143–169). Thousand Oaks: Sage.

Crano, W. D., & Prislin, R. (2006). Attitudes and persuasion. Annual Review of Psychology, 57, 345–374.

Dotson, M. J., & Hyatt, E. M. (2000). Religious symbols as peripheral cues in advertising: A replication of the elaboration likelihood model. Journal of Business Research, 48, 1, 63–68.

Eagly, A. H., & Chaiken, S. (1998). Attitude structure and function. In D. T. Gilbert, S. T. Fiske, & G. Lindzey (Eds.), The handbook of social psychology (Vol. 1, pp. 269–322). New York: McGraw-Hill.

Erwin, P. (2001). Attitudes and persuasion. Hove: Psychology Press.

Gordon, R. A. (1996). Impact of ingratiation on judgments and evaluations: A meta-analytic investigation. Journal of Personality and Social Psychology, 71, 1, 54–70.

Hargie, O., & Dickson, D. (2004). Skilled interpersonal communication. London and New York: Routledge.

Hogg, M. A., & Vaughan, G. M. (2005). Social psychology. Harlow: Pearson.

Kardes, F. R. (2005). The psychology of advertising. In T. C. Brock & M. C. Green (Eds.), Persuasion: Psychological insights and perspectives (pp. 281–303). Thousand Oaks, CA: Sage.

Kiesler, C., & Kiesler, S. (1964). Role of forewarning in persuasion communication. Journal of Abnormal and Social Psychology, 68, 547–549.

Krok, D. (2005). Perswazja w przekazie religijno-moralnym [Persuasion in religious and moral communication]. Opole: Redakcja Naukowa WT UO.

Kropveld, M., & Pelland, M. -A. (2006). The cult phenomenon. Québec: Info-Cult Publisher.

Lalich, J. (2004). Bounded choice: True believers and charismatic cults. California: University of California Press.

Langone, M. D. (2001). Cults, psychological manipulation, and society: International perspectives – an overview. Cultic Studies Journal, 18, 1–12.

Lavine, H., & Snyder, M. (1996). Cognitive processing and the functional matching effect in persuasion: The mediating role of subjective perceptions of message quality. Journal of Experimental Social Psychology, 32, 580–604.

Levin, D. K., Nichols, D. R., & Johnson, B. T. (2000). Involvement and persuasion: Attitude functions for the motivated processor. In G. Maio & J. Olson (Eds.), Why we evaluate: Functions of attitudes (pp. 163–194). Mahwah, NJ: Lawrence Erlbaum.

Marsh, K. L., & Julka, D. L. (2000). A motivational approach to experimental tests of attitude functions theory. In G. Maio & J. Olson (Eds.), Why we evaluate: Functions of attitudes (pp. 271–294). Mahwah, NJ: Lawrence Erlbaum.

Nowakowski, P. (1999). Sekty. Co każdy powinien wiedzieć [Sects. What does everyone need to know]. Tychy: Maternus Media.

Petty, R. E., & Cacioppo, J. T. (1986). The Elaboration Likelihood Model of persuasion. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 19, pp. 123–205). New York: Academic Press.

Petty, R. E., & Cacioppo, J. T. (1979). Effects of forewarning of persuasive intent and involvement on cognitive responses and persuasion. Personality and Social Psychology Bulletin, 5, 173–176.

Petty, R. E., Cacioppo, J. T., Strathman, A. J., & Priester, J. R. (2005). To think or not to think: Exploring two routes to persuasion. In T. C. Brock & M. C. Green (Eds.), Persuasion: Psychological insights and perspectives (pp. 81–116). Thousand Oaks, CA: Sage.

Petty, R. E., Rucker, D., Bizer, G., & Cacioppo, J. T. (2004). The elaboration likelihood model. In J. S. Seiter & G. H. Gass (Eds.), Perspectives on persuasion, social influence and compliance gaining (pp. 65–89). Boston: Allyn and Bacon.

Petty, R. E., Wheeler, S. C., & Bizer, G. Y. (2000). Matching effects in persuasion: An elaboration likelihood analysis. In G. Maio & J. Olson (Eds.), Why we evaluate: Functions of attitudes (pp. 133–162). Mahwah, NJ: Lawrence Erlbaum.

Rucker, D. D., & Petty, R. E. (2006). Increasing the effectiveness of communications to consumers: Recommendations based on the Elaboration Likelihood and attitude certainty perspectives. Journal of Public Policy and Marketing, 25, 1, 39–52.

Wood, W., & Quinn, J. M. (2003). Forewarned and forearmed? Two meta-analytic syntheses of forewarning of influence appeals. Psychological Bulletin, 129, 1, 119–138.

Zablocki, B. (1997). The blacklisting of a concept: The strange history of the brainwashing conjecture in the sociology of religion. Nova Religio: The Journal of Alternative and Emergent Religions, 1, 1, 96–121.

Zuwerink-Jacks, J., & Devine P. G. (2000). Attitude importance, forewarning of message content, and resistance to persuasion. Basic and Applied Social Psychology, 22, 19–29.
About the Author

Dariusz Krok holds an M.A. in theology and a Ph.D. in psychology. He received his undergraduate, graduate, and postgraduate education in psychology at the Catholic University of Lublin, and in theology at the Opole University, Poland. He is currently working as Assistant Professor at the Opole University. His primary areas of research cover the domains of psychology of religion and social psychology; within these areas, he has conducted research analyzing the processes and implications of persuasion and attitude change. He is author or co-editor of the following books: Perswazja w przekazie religijno-moralnym [Persuasion in religious and moral communication], Język przekazu religijnego [Language of religious communication], and Psychologiczny wymiar cierpienia [Psychological dimension of suffering]. A great deal of his current work explores the role of religious beliefs and the relations between religiousness and personality. He has also worked with ex-members of cultic groups, providing counseling and psychological support.


Cultic Studies Review, Vol. 8, No. 1, 2009


[1] Two of Abgrall‘s propositions are that the cultural and religious strategy and the ideological strategy are very similar and are used simultaneously in practice; it was therefore decided to combine both approaches into one strategy, entitled the cultural-religious strategy, that encompasses both types.