6 Attitudes
This chapter examines social influences on attitudes. We define attitudes and
then discuss how they are measured and when they are related to behavior.
Then we consider two methods of changing attitudes. First, we look at source,
message, and audience factors that influence persuasion through the media of
communication. Second, we consider theories and research showing that
people often change their attitudes as a consequence of their own actions.
The Study of Attitudes
• How Attitudes Are Measured
• How Attitudes Are Formed
• The Link Between Attitudes and Behavior
Persuasion by Communication
• Two Routes to Persuasion
• The Source
• The Message
• The Audience
• Culture and Persuasion
Persuasion by Our Own Actions
• Role Playing: All the World's a Stage
• Cognitive Dissonance Theory: The Classic Version
• Cognitive Dissonance Theory: A New Look
• Alternative Routes to Self-Persuasion
• Ethical Dissonance
• Cultural Influences on Cognitive Dissonance
Changing Attitudes
Review
ISIS. Taxes. Same-sex marriage. The death penalty. Gun control. Israelis and
Palestinians. Abortion. Climate change. Immigration. Anyone who has followed recent
events in the United States—or anywhere else in the world, for that matter—knows how
passionately people feel about the issues of the day. Attitudes and the mechanisms of
attitude change, or persuasion, are a vital part of human social life. This chapter addresses
three sets of questions: (1) What is an attitude, how can it be measured, and what is its link
to behavior? (2) What kinds of persuasive messages lead people to change their attitudes?
(3) Why do we often change our attitudes as a result of our own actions?
The Study of Attitudes
Are you a Democrat, Republican, or Independent? Should marijuana be legalized? Would you
rather listen to alternative rock, country-western, or hip-hop? Do you prefer drinking Coke or
Pepsi, water, or fruit juice? Do you have an iPhone or an Android? Should terrorism be
contained by war or conciliation?
As these questions suggest, each of us has positive and negative reactions to various persons,
objects, and ideas. These reactions are called attitudes. Skim the chapters in this book, and you'll
see just how pervasive attitudes are. You'll see, for example, that self-esteem is an attitude we
hold about ourselves, that attraction is a positive attitude toward another person, and that
prejudice is a negative attitude often directed against certain groups. Indeed, the study of
attitudes—what they are, where they come from, how they can be measured, what causes them to
change, and how they interact with behavior—is central to the whole field of social psychology
(Bohner & Dickel, 2011; Crano & Prislin, 2014; Maio & Haddock, 2015; Perloff, 2010).
attitude A positive, negative, or mixed reaction to a person, object, or idea.
Putting COMMON SENSE to the Test
Circle Your Answer
T F Researchers can tell if someone has a positive or negative attitude by measuring
physiological arousal.
T F In reacting to persuasive communications, people are influenced more by superficial
images than by logical arguments.
T F People are most easily persuaded by commercial messages that are presented without their
awareness.
T F The more money you pay people to tell a lie, the more they will come to believe it.
T F People often come to like what they suffer for.
An attitude is a positive, negative, or mixed evaluation of an object that is expressed at some
level of intensity—nothing more, nothing less. Like, love, dislike, hate, admire, and detest are the
kinds of words that people use to describe their attitudes. It's important to realize that attitudes
cannot simply be represented along a single continuum ranging from wholly positive to wholly
negative—as you might expect if attitudes were like the volume button on a remote control unit
or the lever on a thermostat that raises or lowers temperature. Rather, as depicted in • Figure 6.1,
our attitudes can vary in strength along both positive and negative dimensions. In other words,
we can react to something with positive affect, with negative affect, with ambivalence (mixed
emotions), or with apathy and indifference (Cacioppo et al., 1997). Some people more than
others are troubled by this type of inconsistency (Newby-Clark et al., 2002). In fact, at times
people have both positive and negative reactions to the same attitude object without feeling
conflict because they are conscious of one reaction but not the other. Someone who is welcoming
toward racial minorities but harbors unconscious prejudice is a case in point (Wilson et al.,
2000).
Each and every one of us routinely forms positive and negative evaluations of the people,
places, objects, and ideas we encounter. We like some things but not others. This attitude
formation process is often quick, automatic, and “implicit”—much like a reflex action (Bargh et
al., 1996; De Houwer, 2014; Ferguson, 2007).
• FIGURE 6.1 Four Possible Reactions to Attitude Objects
As shown, people evaluate objects along both positive and negative dimensions. As a result, our
attitudes can be positive, negative, ambivalent, or indifferent.
You might assume that a person's attitude represents a unique relation between that person and
a specific attitude object. There are two ways, however, in which our attitudes reveal a lot about
us as individuals. First, people differ in terms of their tendency in general to like or dislike
things. Consider an array of very different and unrelated attitude objects: How do you feel about
bicycles? What about crossword puzzles, camping, Japan, taxes, politics, architecture, chess,
statistics, Hulu, Netflix, religion, and bottled water? Curious about whether people have
general tendencies to like or dislike things (tendencies they called dispositional attitudes), Justin
Hepler and Dolores Albarracín (2013) found that when they asked research participants to rate
how much they liked or disliked a long list of unrelated things, some individuals on average
tended to report positive attitudes; others on average were negative (also see Eschleman et al.,
2015).
A second way in which our attitudes reveal something about us as individuals is that people
differ not only in whether they tend to like or dislike things but also in
how quickly and how strongly they react. Think about yourself. Do you form opinions easily?
Do you tend to have strong likes and dislikes? Or do you tend to react in more guarded, less
effusive ways? Individuals who describe themselves as high rather than low in the need for
evaluation are more likely to view their daily experiences in highly judgmental terms and they
are more opinionated—positive and negative—on a whole range of social, moral, and political
issues (Bizer et al., 2004; Jarvis & Petty, 1996).
Before we examine the elusive science of attitude measurement, let's stop for a moment and
ponder this question: Why do human beings bother to form and have attitudes? Does forming a
positive or negative judgment of people, objects, and ideas serve any useful purpose? Over the
years, researchers have found that attitudes serve important functions—such as enabling us to
judge quickly and without much thought whether something we first encounter is good or bad,
helpful or hurtful, and to be sought or avoided (Maio & Olson, 2000). The downside is that
having preexisting attitudes can lead us to be closed-minded, bias how we interpret new
information, and make us more resistant to change. For example, Russell Fazio and others (2000)
found that people who were focused on their positive or negative attitudes toward computerized
faces, compared to those who were not, were later slower to notice when the faces were
“morphed” and no longer the same.
▪ How Attitudes Are Measured
In 1928, Louis Thurstone published an article entitled “Attitudes Can Be Measured.” What
Thurstone failed to fully anticipate, however, is that attitude measurement is tricky business. At
one point, several years ago, over 500 different methods were available to determine an
individual's attitudes (Fishbein & Ajzen, 1972).
Self-Report Measures The easiest way to assess a person's attitude about something is to ask.
All over the world, public opinions are recorded on a range of issues in politics, the economy,
health care, foreign affairs, science and technology, sports, entertainment, religion, and lifestyles.
Simply by asking people, public opinion surveys conducted by Harris, Gallup, Pew Research
Center, and other polling organizations have recently revealed that 33% of Americans will never
consider buying or leasing a self-driving car; 82% believe that childhood vaccinations should be
mandatory; 74% would delete themselves from Internet search results if they could; 70% do not
"feel engaged or inspired at their jobs"; and 88% trust online reviews of products and services as
much as personal recommendations. Still other surveys have shown that Americans prefer to
watch football over baseball and like to eat chocolate ice cream more than vanilla and other
flavors; that more than half say they surf the Internet while watching TV; that one in five adults
has a tattoo; and that Christmas is America's favorite holiday, followed by Thanksgiving and
Halloween.
As in this January 2015 unity rally in Paris, where millions gathered to pay respect to the
victims of a terrorist attack on the magazine Charlie Hebdo, people can be very passionate about
the attitudes they hold. Their message here: "Not Afraid." On November 13, 2015, not quite a
year later, Paris was once again rocked by a cluster of heinous terrorist attacks.
Self-report measures are direct and straightforward. But attitudes are sometimes too complex to
be measured by a single question. As you may recall from Chapter 2, one problem recognized
by public opinion pollsters is that responses to attitude questions can be influenced by their
wording, the order and context in which they are asked, and other extraneous factors (Sudman et
al., 2010; Tourangeau et al., 2000). In one survey, the National Opinion Research Center asked
hundreds of Americans if the U.S. government spent too little money on “assistance to the poor”
and 65% said yes. Yet when the same question was asked using the word "welfare" instead,
only 20% said the government spent too little (Schneiderman, 2008). In a second survey,
researchers asked more than 2,000 registered voters about their belief in the phenomenon of
“global warming” or “climate change.” Democrats uniformly endorsed the proposition at a high
rate, but the number of Republicans who did so increased from 44% when asked about global
warming to 60% when asked about climate change (Schuldt et al., 2011).
To view new public opinion poll results online, you can
visit: http://www.harrisinteractive.com/, http://www.gallup.com/home.aspx,
and http://www.pewresearch.org/
Recognizing the shortcomings of single-question measures, survey researchers have developed
more sophisticated methods (Fowler, 2014). Often single questions are replaced by multiple-item questionnaires known as attitude scales (Robinson et al., 1991, 1998). Attitude scales
come in different forms, perhaps the most popular being the Likert Scale, named after its
inventor, Rensis Likert (1932). In this technique, respondents are presented with a list of
statements about an attitude object and are asked to indicate on a multiple-point scale how
strongly they agree or disagree with each statement. Each respondent's total attitude score is
derived by summing responses to all the items. However, regardless of whether attitudes are
measured by one question or a full-blown scale, the results should be taken with caution. All
self-report measures assume that people honestly express their true opinions. Sometimes this
assumption is reasonable and correct, but often it is not. Wanting to make a good impression,
people are often reluctant to admit to their failures, vices, weaknesses, unpopular opinions, and
prejudices.
attitude scale A multiple-item questionnaire designed to measure a person's attitude toward
some object.
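The Likert summing procedure described above can be sketched in a few lines. This is a hypothetical illustration, not a standardized instrument: the 5-point format, the particular responses, and the reverse-keyed items are all assumptions for the example. (Reverse keying, flipping items worded against the attitude object, is a common refinement so that a high total always means a more favorable attitude.)

```python
# Minimal sketch of Likert-scale scoring (hypothetical 5-point items).
# Reverse-keyed items are flipped so that a high total always indicates
# a more favorable attitude toward the object.

def likert_score(responses, reverse_keyed=(), points=5):
    """Sum item responses (1..points), flipping reverse-keyed items."""
    total = 0
    for i, r in enumerate(responses):
        if not 1 <= r <= points:
            raise ValueError(f"item {i}: response {r} outside 1..{points}")
        total += (points + 1 - r) if i in reverse_keyed else r
    return total

# Five statements about an attitude object; items 1 and 3 are worded
# negatively, so agreement there indicates an unfavorable attitude.
print(likert_score([5, 2, 4, 1, 5], reverse_keyed={1, 3}))  # 23
```

A respondent's total is then interpreted relative to the scale's possible range (here, 5 to 25), with the caveats about honest self-report noted above.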
One approach to this problem is to increase the accuracy of self-report measures. To get
respondents to answer attitude questions more truthfully, researchers have sometimes used
the bogus pipeline, an elaborate mechanical device that supposedly records our true feelings
physiologically, like a lie-detector test. Not wanting to get caught in a lie, respondents tend to
answer attitude questions more honestly, and with less positive spin, when they think that any
deception would be exposed by the bogus pipeline (Jones & Sigall, 1971; Roese & Jamieson,
1993). In one study, people were more likely to admit to drinking too much, using cocaine,
having frequent oral sex, and not exercising enough when the bogus pipeline was used than when
it was not (Tourangeau et al., 1997). In another study, adolescents were more likely to admit to
smoking when the bogus pipeline was used than when it was not (Adams et al., 2008).
bogus pipeline A phony lie-detector device that is sometimes used to get respondents to give
truthful answers to sensitive questions.
Pollsters and attitude researchers are well aware that self-reports cannot always be trusted,
especially on sensitive topics.
Covert Measures A second general approach to the self-report problem is to collect indirect,
covert measures of attitudes that cannot be controlled. One possibility in this regard is to use
observable behavior such as facial expressions, tone of voice, and body language. In one study,
Gary Wells and Richard Petty (1980) secretly videotaped college students as they listened to a
speech and noticed that when the speaker took a position that the students agreed with (that
tuition costs should be lowered), most made vertical head movements. But when the speaker
took a contrary position (that tuition costs should be raised), head movements were in a
horizontal direction. Without realizing it, the students had signaled their attitudes by nodding and
shaking their heads.
• FIGURE 6.2 The Facial EMG: A Covert Measure of
Attitudes?
The facial EMG makes it possible to detect differences between positive and negative attitudes.
Notice the major facial muscles and recording sites for electrodes. When people hear a message
with which they agree rather than disagree, there is a relative increase in EMG activity in the
depressor and zygomatic muscles but a relative decrease in corrugator and frontalis muscles.
These changes cannot be seen with the naked eye.
Although behavior provides clues, it is far from perfect as a measure of attitudes. Sometimes
we nod our heads because we agree; at other times, we nod to be polite. The problem is that
people monitor their overt behavior just as they monitor self-reports. But what about internal
physiological reactions that are difficult if not impossible to control? Does the body betray how
we feel? In the past, researchers tried to divine attitudes from involuntary physical reactions such
as perspiration, heart rate, and pupil dilation. The result, however, was always the same:
Measures of arousal reveal the intensity of one's attitude toward an object but not whether that
attitude itself is positive or negative. On the physiological record, love and hate look very much
the same (Petty & Cacioppo, 1983).
Although physiological arousal measures cannot distinguish between positive and negative
attitudes, some interesting alternatives have been discovered. One is the facial electromyograph
(EMG). As shown in • Figure 6.2, certain muscles in the face contract when we are happy and
different facial muscles contract when we are sad. Some of the muscular changes cannot be seen
with the naked eye, however, so the facial EMG is used. To determine whether the EMG can be
used to measure the affect associated with attitudes, John Cacioppo and Richard Petty (1981)
recorded facial muscle activity of college students as they listened to a message with which they
agreed or disagreed. The agreeable message increased activity in the cheek muscles—the facial
pattern that is characteristic of happiness. The disagreeable message sparked activity in the
forehead and brow area—the facial patterns that are associated with sadness and distress. Outside
observers who later watched the participants were unable to see these subtle changes.
Apparently, the muscles in the human face reveal smiles, frowns, feelings of disgust, and other
reactions to attitude objects that might otherwise be hidden from view (Cacioppo et al.,
1986; Tassinary & Cacioppo, 1992).
facial electromyograph (EMG) An electronic instrument that records facial muscle activity
associated with emotions and attitudes.
From a social neuroscience perspective, electrical activity in the brain may also assist in the
measurement of attitudes. In 1929, Hans Berger invented a machine that could detect, amplify, and
record “waves” of electrical activity in the brain using electrodes pasted to the surface of the
scalp. The instrument is called an electroencephalograph, or EEG, and the information it
provides takes the form of line tracings called brain waves. Based on an earlier discovery that
certain patterns of electrical brain activity are triggered by exposure to stimuli that are novel or
unexpected, Cacioppo and others (1993) had participants list 10 items they liked and 10 they did
not like within various object categories (fruits, sports, movies, universities, etc.). Later, these
participants were brought into the laboratory, wired to an EEG, and presented with a list of
category words that depicted objects they liked and disliked. The result: Brain-wave patterns that
are normally triggered by inconsistency increased more when a disliked stimulus appeared after
a string of positive items or when a liked stimulus was shown after a string of negative items
than when either stimulus evoked the same attitude as the items that preceded it.
Today, social psychologists are also starting to use new forms of brain imaging in the
measurement of attitudes. In one study, researchers used fMRI to record brain activity in
participants as they read names of famous—and infamous—figures such as John F. Kennedy,
Bill Cosby, and Adolf Hitler. As the names were read, the researchers observed greater
activity in participants' amygdala, a brain structure associated with emotion, regardless of whether or not
participants were asked to evaluate the famous figures (Cunningham et al., 2003). In a study
focused on political attitudes, other researchers used fMRI to record brain activity in opinionated
men during the 2004 presidential election as they listened to positive and negative statements
about their preferred candidate. Although brain areas associated with cognitive reasoning were
unaffected during these presentations, activity increased in areas typically associated with
emotion (Westen et al., 2006). These studies suggest that people react automatically to positive
and negative attitude objects and that these attitudes may be measurable by electrical activity in
the brain.
Researchers can tell if someone has a positive or negative attitude by measuring
physiological arousal.
FALSE
The Implicit Association Test (IAT) When it comes to covert measurement, one particularly
interesting development is based on the notion that each of us has all sorts of implicit
attitudes that we cannot self-report in questionnaires because we are not even aware of having
these attitudes (Fazio & Olson, 2003). To measure these unconscious attitudes, a number of
indirect methods have been developed (De Houwer et al., 2009; Nosek et al., 2011; Payne &
Lundberg, 2014). The most popular is the Implicit Association Test (IAT) created by Anthony
Greenwald, Mahzarin Banaji, Brian Nosek, and others. As we saw in Chapter 5, the IAT
measures the sheer speed—in fractions of a second—with which people associate pairs of
concepts (Greenwald et al., 1998). To see how it works, visit the IAT website by logging onto
Project Implicit at https://implicit.harvard.edu/implicit/ or installing a free IAT app onto your
Android or iPhone. On the first page, you will read that “Project Implicit investigates thoughts
and feelings that exist outside of conscious awareness or conscious control.”
implicit attitude An attitude, such as prejudice, that one is not aware of having.
Implicit Association Test (IAT) A covert measure of unconscious attitudes derived from the
speed at which people respond to pairings of concepts—such as black or white with good or bad.
To take a test that measures your implicit racial attitudes, you go through a series of stages.
First, you are asked to categorize black or white faces as quickly as you can, for example, by
pressing a left-hand key in response to a black face and a right-hand key for a white face. Next,
you are asked to categorize a set of words, for example, by pressing a left-hand key for positive
words (love, laughter, friend) and a right-hand key for negative words (war, failure, evil). Once
you become familiar with the categorization task, the test combines faces and words. You may
be asked, for example, to press the left-hand key if you see a black face or positive word and a
right-hand key for a white face or negative word. Then, in the fourth stage, the opposite pairings
are presented—black or negative, white or positive. Black and white faces are then interspersed
in a quick sequence of trials, each time paired with a positive or negative word. In rapid-fire
succession, you have to press one key or another in response to stimulus pairs such as black-wonderful, black-failure, white-love, black-laughter, white-evil, white-awful, black-war,
and white-joy. As you work through the list, you may find that some pairings are harder and take
longer to respond to than others. In general, people are quicker to respond when liked faces are
paired with positive words and disliked faces are paired with negative words than the other way
around. Using the IAT, your implicit attitudes about African Americans can thus be detected by
the speed with which you respond to black-bad/white-good pairings relative to black-good/white-bad pairings. The test takes only 10–15 minutes to complete. When you're done, you receive the
results of your test and an explanation of what it means (see • Figure 6.3).
• FIGURE 6.3 The Implicit Association Test
Through a sequence of tasks, the IAT measures implicit racial attitudes toward, for example,
African Americans, by measuring how quickly people respond to black-bad/white-good word
pairings relative to black-good/white-bad pairings. Most white Americans are quicker to respond
to the first type of pairings than to the second, which suggests that they do not as readily
connect black-good and white-bad.
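The logic of latency-based scoring can be sketched as follows. This toy example is not Project Implicit's published D-score algorithm, which also trims extreme latencies, penalizes errors, and computes statistics block by block; the latencies here are invented. It illustrates only the core idea: responses that are slower on "incompatible" pairings than on "compatible" ones yield a larger positive score.

```python
# Simplified sketch of IAT-style scoring: an implicit preference is
# inferred from how much slower responses are in the "incompatible"
# block (e.g., black+good / white+bad) than in the "compatible" block
# (black+bad / white+good), scaled by overall response variability.
from statistics import mean, stdev

def iat_d(compatible_ms, incompatible_ms):
    """D-like effect size: mean latency difference / pooled sample SD."""
    pooled_sd = stdev(compatible_ms + incompatible_ms)
    return (mean(incompatible_ms) - mean(compatible_ms)) / pooled_sd

compatible = [620, 580, 650, 600, 640]    # invented latencies, in ms
incompatible = [780, 820, 760, 800, 840]  # invented latencies, in ms
print(round(iat_d(compatible, incompatible), 2))  # 1.82
```

A score near zero would indicate no latency difference between the two pairings; the millisecond-scale differences the text describes are exactly what this kind of computation makes visible.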
From 1998 to the present, visitors to the IAT website have completed millions of tests. In
questionnaires, interviews, public opinion polls, and Internet surveys, people tend not to
express stereotypes, prejudices, or other unpopular attitudes. Yet on the IAT, respondents have
exhibited a marked implicit preference for self over other, white over black, young over old,
straight over gay, able over disabled, thin over obese, and the stereotype that links males with
careers and females with family (Greenwald et al., 2003; Nosek et al., 2002). Because more and
more researchers are using these kinds of indirect measures, social psychologists who study
attitudes find themselves in the midst of a debate over what IAT scores mean, how the implicit
attitudes revealed in the IAT are formed and then changed, how well these attitudes predict or
influence societally important behaviors, and how they differ from the more explicit attitudes
that we consciously hold and report (Blanton et al., 2009; Gawronski & Bodenhausen,
2006; Greenwald et al., 2015; Oswald et al., 2015; Petty et al., 2009).
Do implicit attitudes matter? Do millisecond differences in response times on a computerized
test really predict behavior in real-world settings of consequence? And what does it mean when
one's implicit and explicit attitudes clash? The importance of these questions cannot be
overstated. If the IAT reveals unconscious prejudices that people do not self-report, should
individuals be scrutinized in the laboratory for hidden motives underlying potentially unlawful
behaviors—as when a police officer shoots a black suspect, fearing that he or she is armed; as
when an employer hires a male applicant over a female applicant, citing his credentials as
opposed to an act of discrimination; or as when a jury chooses to convict a Latino defendant on
the basis of ambiguous evidence?
Eager to use the IAT for predicting social behaviors of consequence, some researchers have
speculated about the relevance of implicit attitudes in the domains of law (Lane et al., 2007) and
politics (Gawronski et al., 2015). But is their speculation justified? Some researchers are critical
of claims concerning the predictive validity of the IAT, citing the need for more and stronger
behavioral evidence (Blanton et al., 2009; Oswald et al., 2015). Based on a meta-analysis of
122 IAT studies involving 15,000 participants, Greenwald and others (2009) conceded that
people's implicit attitudes are generally less predictive of behavior than their explicit attitudes.
They also found, however, that IAT measures are better when it comes to socially sensitive
topics for which people often conceal or distort their self-reports. In a poignant illustration of this
point, one research team administered to a large group of psychiatric patients an IAT that
measured their implicit associations between self and suicide. Over the next six months, patients
appearing in the emergency room because of a suicide attempt had a stronger implicit association
between self and suicide than those who appeared with other types of psychiatric emergencies
(Nock et al., 2010).
▪ How Attitudes Are Formed
How did you become liberal or conservative in your political values? Why do you favor or
oppose same-sex marriage? What draws you toward or away from organized religion?
Are Attitudes Inherited? One hypothesis, first advanced by Abraham Tesser (1993), is that
strong likes and dislikes are rooted in our genetic makeup. Research shows that on some issues
the attitudes of identical twins are more similar than those of fraternal twins and that twins raised
apart are as similar to each other as those who are raised in the same home. This pattern of
evidence suggests that people may be predisposed to hold certain attitudes. Indeed, Tesser found
that when asked about attitudes for which there seems to be a predisposition (such as attitudes
toward sexual promiscuity, religion, and the death penalty), research participants were quicker to
respond and less likely to alter their views toward social norms.
Tesser speculated that individuals are disposed to hold certain strong attitudes as a result of
inborn physical, sensory, and cognitive skills, temperament, and personality traits. Other twin
studies, too, as well as complex comparisons within extended families, have supported the notion
that people differ in their attitudes toward a range of social and political issues in part because of
genetically rooted differences in their biological makeup (Hatemi et al., 2010; Olson et al.,
2001). There are different ways in which a person's inborn tendencies may influence social and
political attitudes. In one study, for example, researchers brought 40 adults with strong political
views into the lab for testing and found that those who physiologically were highly reactive to
sudden noise and other unpleasant stimuli were more likely to favor capital punishment, the right
to bear arms, defense spending, and other policies seen as protective against domestic and
foreign threats (Oxley et al., 2008).
Are Attitudes Learned? Whatever dispositions nature provides to us, our most cherished
attitudes often form as a result of our exposure to attitude objects; our history of rewards and
punishments; the attitudes that our parents, friends, and enemies express; the social and cultural
context in which we live; and other types of experiences. In a classic naturalistic study, Theodore
Newcomb (1943) surveyed the political attitudes of students at Bennington College in Vermont.
At the time, Bennington was a women's college that drew its students from conservative and
mostly affluent families. Once there, however, the students encountered professors and older
peers who held more liberal views. Newcomb found that as the women moved from their first
year to graduation, they became progressively more liberal. (In the 1936 presidential election,
62% of first-year Bennington students preferred the Republican Landon to the Democrat
Roosevelt, compared to only 43% of sophomores and 15% of juniors and seniors.) This link
between cultural environment and attitudes is particularly evident in the current political
landscape of America—a “house divided” into red states and blue states by geography, culture,
and ideology (Seyle & Newman, 2006).
Chances are, these identical twins have more in common than being truck driving partners.
Research suggests that people may also be genetically predisposed to hold certain attitudes.
Clearly, attitudes are formed through basic processes of learning. For example, a number of
studies have shown that people can form strong positive and negative attitudes toward neutral
objects that somehow are linked to emotionally charged stimuli. At the start of the twentieth
century, Russian physiologist Ivan Pavlov (1927) discovered that dogs would naturally and
reflexively salivate in response to food in the mouth. He then discovered that by repeatedly
ringing a bell—a neutral stimulus—before the food was placed in the mouth, the dog would
eventually start to salivate at the sound of the bell itself. This process by which organisms learn
to associate a once neutral stimulus with an inherently positive or negative response is a basic
and powerful form of learning. In fact, Pavlov found that once a dog was conditioned to salivate
to one tone, it would "generalize" by responding to other sounds that were similar but not
identical.
It is now clear that this form of learning can help to explain the development of social attitudes.
In a classic first study, college students were presented with a list of adjectives that indicate
nationality (German, Swedish, Dutch, Italian, French, and Greek), each of which was repeatedly
presented with words that were known to have very pleasant (happy, gift, sacred) or unpleasant
(bitter, ugly, failure) connotations. When the participants later evaluated the nationalities by
name, they were more positive in their ratings of those that had been paired with pleasant words
than with unpleasant words (Staats & Staats, 1958). And as discovered by Pavlov nearly a
hundred years ago, research shows that when an attitude is changed toward one object, attitudes
toward similar and related objects are often changed as well (Glaser et al., 2015).
More recent studies of evaluative conditioning have shown that implicit and explicit attitudes
toward neutral objects can form by their association with positive and negative stimuli, even in
people who are not conscious of this association (Hofmann et al., 2010; Olson & Fazio,
2001; Sweldens et al., 2014; Walther et al., 2011). That's why political leaders all over the
world wrap themselves in a national flag to derive a benefit from positive associations, while
advertisers strategically pair their products with sexy models, uplifting music, celebrities,
nostalgic images, and other positive emotional symbols. In a series of laboratory studies,
researchers found that people came to prefer brands of a consumer product that were paired with
humorous ads more than those that were associated with nonhumorous ads (Strick et al., 2009).
evaluative conditioning The process by which we form an attitude toward a neutral stimulus
because of its association with a positive or negative person, place, or thing.
▪ The Link Between Attitudes and Behavior
People take for granted the notion that attitudes influence behavior. We assume that voters'
opinions of opposing candidates predict the decisions they will make on Election Day, that
consumers' preferences for one product over its competitors will influence the purchases they
make, and that feelings of prejudice will trigger overt acts of discrimination. Yet as sensible as
these assumptions seem, the link between attitudes and behavior has proved far from perfect.
Sociologist Richard LaPiere (1934) was the first to notice that attitudes and behavior don't
always go hand in hand. In the 1930s, during the height of the Great Depression, LaPiere took a
young Chinese American couple on a three-month, 10,000-mile car trip, visiting 250 restaurants,
campgrounds, and hotels across the United States. Although prejudice against Asians was
widespread at the time, the couple was refused service only once. Yet when LaPiere wrote back
to the places they had visited and asked if they would accept Chinese patrons, more than 90% of
those who returned an answer said they would not. Self-reported attitudes did not correspond
with behavior.
This study was provocative but seriously flawed. LaPiere measured attitudes several months
after his trip, and during that time the attitudes may have changed. He also did not know whether
those who responded to his letter were the same people who had greeted the couple in person. It
was even possible that the Chinese couple were served wherever they went because they were
accompanied by LaPiere—or because businesses were desperate during hard economic times.
Despite these limitations, LaPiere's study was the first of many to reveal a lack of
correspondence between attitudes and behavior. In 1969, Allan Wicker reviewed the available
research and concluded that attitudes and behavior are correlated only weakly, if at all. Sobered
by this conclusion, researchers were puzzled: Could it be that the votes we cast do not follow
from our political opinions, that consumers' purchases are not based on their attitudes toward a
product, or that discrimination is not related to underlying prejudice? Is the study of attitudes
useless to those interested in human social behavior? Not at all. During subsequent years,
researchers went on to identify the conditions under which attitudes and behavior are correlated.
Thus, when Stephen Kraus (1995) meta-analyzed all of this research, he concluded that
“attitudes significantly and substantially predict future behavior” (p. 58). In fact, he calculated
that there would have to be 60,983 new studies reporting a zero correlation before this
conclusion would have to be revised. Based on a meta-analysis of 41 additional studies, Laura
Glasman and Dolores Albarracín (2006) went on to identify some of the conditions under which
attitudes most clearly predict future behavior.
Attitudes in Context One important condition is the level of correspondence, or similarity,
between attitude measures and behavior. Perhaps the reason that LaPiere (1934) did not find a
correlation between self-reported prejudice and discrimination was that he had asked proprietors
about Asians in general but then observed their actions toward only one couple. To predict a
single act of discrimination, he should have measured people's more specific attitudes toward a
young, well-dressed, attractive Chinese couple accompanied by an American professor.
Icek Ajzen and Martin Fishbein (1977) analyzed more than 100 studies and found that attitudes
correlate with behavior only when attitude measures closely match the behavior in question.
Illustrating the point, Andrew Davidson and James Jaccard (1979) tried to use attitudes to predict
whether women would use birth control pills within the next two years. Attitudes were measured
in a series of questions ranging from very general (“How do you feel about birth control?”) to
very specific (“How do you feel about using birth control pills during the next two years?”). The
more specific the initial attitude question was, the better it predicted the behavior. Other
researchers have replicated this finding as well (Kraus, 1995).
The link between our feelings and our actions should also be placed within a broader context.
Attitudes are one determinant of social behavior, but there are other determinants as well. This
limitation formed the basis for Fishbein's (1980) theory of reasoned action, which Ajzen (1991)
later expanded into his theory of planned behavior. According to these theories, our attitudes
influence our behavior through a process of deliberate decision making, and their impact is
limited in four respects (see Figure 6.4).
theory of planned behavior The theory that attitudes toward a specific behavior combine with
subjective norms and perceived control to influence a person's actions.
• FIGURE 6.4 Theory of Planned Behavior
According to the theory of planned behavior, attitudes toward a specific behavior combine with
subjective norms and perceived behavior control to influence a person's intentions. These
intentions, in turn, guide but do not completely determine behavior. This theory places the link
between attitudes and behavior within a broader context.
First, as just described, behavior is influenced less by general attitudes than by attitudes toward
a specific behavior. Second, behavior is influenced not only by attitudes but also by subjective
norms—our beliefs about what others think we should do. As we'll see in Chapter 7, social
pressures to conform often lead us to behave in ways that are at odds with our inner convictions.
Third, according to Ajzen, attitudes give rise to behavior only when we perceive the behavior to
be within our control. To the extent that people lack confidence in their ability to engage in some
behavior, they are unlikely to form an intention to do so. Fourth, although attitudes (along with
subjective norms and perceived control) contribute to an intention to behave in a particular
manner, people often do not or cannot follow through on their intentions.
A good deal of research now supports the theories of reasoned action and planned behavior
(Ajzen & Fishbein, 2005). Indeed, this general approach, which places the link between
attitudes and behaviors within a broader context, has successfully been used to predict a wide
range of practical behaviors—such as using condoms, obeying speed limits, washing hands and
other food safety habits, donating blood, complying with medical regimens, and reducing risky
sexual behaviors (Albarracín et al., 2001; Conner et al., 2013; Elliott et al., 2003; Milton &
Mullan, 2012; Rich et al., 2015; Tyson et al., 2014).
Strength of the Attitude According to the theories of reasoned action and planned behavior,
specific attitudes combine with social factors to produce behavior. Sometimes attitudes have
more influence on behavior than do the other factors; sometimes they have less influence. In
large part, it depends on the importance, or strength, of the attitude. Each of us has some views
that are nearer and dearer to the heart than others. Computer jocks become attached to PCs or to
Macs, religious fundamentalists care deeply about issues pertaining to life and death, and
political activists have fiery passions for one political party or policy over others. In each case,
the attitude is held with great confidence and is difficult to change (Petty & Krosnick, 1995).
Why are some attitudes stronger than others? David Boninger and others (1995) identified
three psychological factors that consistently seem to distinguish between our strongest and
weakest attitudes. These investigators asked people to reflect on their views toward defense
spending, gun control, the legalization of marijuana, abortion rights, and other issues. They
found that the attitudes people held most passionately were those that concerned issues that (1)
directly affected their own self-interests; (2) related to deeply held philosophical, political, and
religious values; and (3) were of concern to their close friends, family, and social ingroups. This
last, highly social, point is important. Research shows that when people are surrounded by others
who are like-minded, the attitudes they hold are stronger and more resistant to change (Visser &
Mirabile, 2004).
Several factors indicate the strength of an attitude and its link to behavior. One is that people
tend to behave in ways that are consistent with their attitudes when they are well informed. For
example, college students were asked which of two candidates they preferred in an upcoming
local election for mayor. Those who were well informed about the campaign issues were later the most likely
to actually vote for their favored candidate (Davidson et al., 1985). In another study, college
students were questioned about their views on various environmental issues and later were asked
to take action—to sign petitions, participate in a recycling project, and so on. Again, the more
informed the students were, the more consistent their attitudes about the environment were with
their behavior (Kallgren & Wood, 1986).
The strength of an attitude is indicated not only by the amount of information on which it is
based but also by how that information was acquired. Research shows that attitudes are more
stable and more predictive of behavior when they are born of direct personal experience than
when based on indirect, secondhand information. In a series of experiments, for example, Russell
Fazio and Mark Zanna (1981) introduced two groups of participants to a set of puzzles. One
group worked on sample puzzles; the other group merely watched someone else work on them.
All participants were then asked to rate their interest in the puzzles (attitude) and were given an
opportunity to spend time on them (behavior). As it turned out, attitudes and behaviors were
more consistent among participants who had previously sampled the puzzles.
Third, an attitude can be strengthened, ironically, by an attack against it from a persuasive
message. According to Zakary Tormala and Richard Petty (2002), people hold attitudes with
varying degrees of certainty, and they become more confident in their positions after they
successfully resist changing that attitude in response to a persuasive communication. In one
study, researchers confronted university students with an unpopular proposal to add senior
comprehensive exams as a graduation requirement. Each student read a pro-exam argument that
was described as strong or weak, after which they were asked to write down counterarguments
and indicate their attitude toward the policy. The result: Students who continued to oppose the
policy despite reading what they thought to be a strong argument became even more certain of
their opinion.
Additional studies have shown that this effect depends on how satisfied people are with their
own resistance. When people resist a strong message and believe that they have done so in a
compelling way, they become more certain of their attitude and more likely to form a behavioral
intention that is consistent with it. When people resist a persuasive message “by the skin of their
teeth,” however, and see their own counterarguments as weak, they become less certain of their
initial attitude and more vulnerable to subsequent attack (Tormala et al., 2006). Even if a
person's belief in his or her own thoughtful response is incorrect, it can influence the strength of
the attitude in question (Barden & Petty, 2008).
A fourth key factor is that strong attitudes are highly accessible to awareness, which means that
they are quickly and easily brought to mind (Fazio, 1990). To return to our earlier examples,
computer jocks think often about their computer preferences, and political activists think often
about their allegiances to political parties. It turns out that many attitudes—not just those we feel
strongly about—easily pop to mind at the mere sight, or even just the mention, of an attitude
object (Bargh et al., 1992). When this happens, the attitude can trigger behavior in a quick,
spontaneous way or by leading us to think carefully about how we feel and how to respond
(Fazio & Towles-Schwen, 1999).
Attitudes in a Cultural Context Bo Kyung Park and colleagues (2013) tell this story: “In the
hallways of a local library in Korea, it is not unusual to see individuals in front of a vending
machine, wavering between two beverage options for a while, eventually making a choice by
doing ‘eeny, meeny, miny, moe.’ Thus a vending machine in the library has the unique option,
‘Random.’ If you press this option, the machine will give you Coke or Sprite randomly! Such an
idea may sound hilarious or absurd to North Americans. However, this vending machine
anecdote nicely illustrates a cultural difference regarding preference and choice” (pp. 106-107).
In Western cultures that value independence, it is common to see our attitudes as a part of who
we are, embodying our values, tastes, preferences, and personalities. Making choices is an
exercise of our attitudes and preferences. From that perspective, it is natural to expect that our
likes and dislikes will remain relatively consistent over time and predictive of behavior. In many
East Asian cultures, however, where independence, choice, and personal preference are less
highly valued, a person's attitude may well not show this level of consistency. Indeed, Hila
Riemer and colleagues (2014) have observed that whereas Western views of attitudes are often
person-centric in these ways, in other parts of the world attitudes depend more on contextual
factors such as social norms, others' expectations, roles, and obligations.
To the extent that norms can change over time and across situations, people in non-Western
cultures may well exhibit less consistency in their attitudes. Research by Wilken and others
(2011) supports this hypothesis. In one study, American and Japanese participants were asked to
report on their favorite musical artists, TV shows, restaurants, and other preferences—and to
estimate how long these preferences had lasted. Overall, the Japanese participants reported
liking their favorites for a shorter period of time than Americans did. In a second study,
participants evaluated how trendy various items were, such as the iPod, Harry Potter books, and
the Wii. When questioned about those same items one year later, American participants stayed
relatively consistent in their evaluations; Japanese participants reported significant changes in
their attitudes.
If attitudes in Western cultures are consistent parts of a person, which we carry with us over
time and across situations, it stands to reason that our attitudes and behavior would be highly
correlated. But does it also stand to reason that attitudes would be less predictive of behavior in
non-Western cultures, where context matters as well? In one study, North American and Indian
participants rated how much they liked everyday items like watches, shoes, and shirts. Later
when they had to choose one item from a set of four, Indians were less likely than Americans to
choose the watch, shoes, or shirts they said they liked the best (Savani et al., 2008). For Indian
participants, their choices presumably depended on factors other than personal preferences. Is the
link between attitudes and behavior stronger in some cultures than others? It is possible. At this
point, however, more research is needed to test this specific proposition.
To sum up: Research on the link between people's attitudes and behavior leads to an important
conclusion. Our evaluations of an object do not always determine our actions because other
factors must be taken into account. However, when attitudes are strong and specific to a
behavior, at least in Western cultures, the effects are beyond dispute. Under these conditions,
voting is influenced by political opinions, consumer purchasing is affected by product attitudes,
and racial discrimination is rooted in feelings of prejudice. Attitudes can be important
determinants of behavior. The question is, how can attitudes be changed?
Persuasion by Communication
On a day-to-day basis, we are all involved in the process of changing attitudes. On TV, on
Facebook pages and blogs, in pop-up ads, magazines, and billboards, advertisers flood us with ad
campaigns designed to sell cars, soft drinks, credit cards, sneakers, prescription drugs, airlines,
new movies, and travel destinations. Likewise, politicians make speeches, run commercials, pass
out bumper stickers, and kiss babies to win votes. Attitude change is sought whenever parents
socialize their children, scientists advance theories and seek funding, religious groups seek
converts, financial analysts recommend stocks, or trial lawyers argue cases to a jury. Some
appeals work; others do not. Some are soft and subtle; others are hard and blatant. Some serve
the public interest, whereas others serve commercial interests. The point is, there is nothing
inherently evil or virtuous about changing attitudes—a process known as persuasion. We do it
all the time.
persuasion The process by which attitudes are changed.
If you wanted to change someone's attitude on an issue, you'd probably try to do it by making a
persuasive communication. Appeals made in person, over the Internet, or through the mass media
rely on the spoken word, the written word, and the image or video that is worth a thousand
words. What determines whether an appeal succeeds or fails? To understand why certain
approaches are effective whereas others are not, social psychologists have long sought to
understand how and why persuasive communications work. For that, we need a road map of the
persuasion process.
▪ Two Routes to Persuasion
It's a familiar scene in American politics: Every four years, two or more presidential candidates
launch extensive—and expensive—campaigns for office. In a way, if you've seen one election,
you've seen them all. The names and dates may change, but over and over again, opposing
candidates accuse each other of ducking substantive issues and turning the election into a
high-stakes, money-driven, mud-slinging popularity contest.
True or not, these criticisms show that politicians are keenly aware that they can win votes
through two different methods. They can stick to policy, issues, and rational argumentation using
the power of words, or they can base their appeals on other grounds. Interestingly, these "other
grounds" may well determine who wins an election. In The Political Brain, Drew Westen (2007)
presents a wealth of research evidence indicating that in the marketplace of politics, emotions
trump reason. Drawing on a combination of laboratory experiments and public opinion polls, other
political psychologists agree (Brader, 2006; Neuman et al., 2007). Outside the realm of politics
too, influence can be quick and automatic. In Split-Second Persuasion, Kevin Dutton (2010)
described how Buddhist monks, magicians, advertisers, con artists, hostage negotiators, and
other “super-persuaders” use simplicity, empathy, an air of self-confidence, and other disarming
tactics to effect instant persuasion.
To account for the two alternative approaches to persuasion, Richard Petty and John Cacioppo
(1986) proposed a dual-process model of persuasion. This model assumes that we do not always
process communications the same way. When people think hard and critically about the contents
of a message, they are said to take a central route to persuasion and are influenced by the
strength and quality of the arguments. When people do not think hard or critically about the
contents of a message but focus instead on other cues, they take a peripheral route to
persuasion. As we'll see, the route taken depends on whether one is willing and able to
scrutinize the information contained in the message itself. Over the years, this model has
provided an important framework for understanding the factors that elicit persuasion (Petty &
Wegener, 1998).
central route to persuasion The process by which a person thinks carefully about a
communication and is influenced by the strength of its arguments.
peripheral route to persuasion The process by which a person does not think carefully about a
communication and is influenced instead by superficial cues.
In U.S. presidential politics, candidates try to win votes by addressing the issues, as in debates
and speeches delivered from a podium (the central route) or through the use of banners, balloons,
music, and other theatrics (the peripheral route).
The Central Route In the first systematic attempt to study persuasion, Carl Hovland and
colleagues (1949, 1953) started the Yale Communication and Attitude Change Program. They
proposed that for a persuasive message to have influence, the recipients of that message must
learn its contents and be motivated to accept it. According to this view, people can be persuaded
only by an argument they attend to, comprehend, and retain in memory for later use. Regardless
of whether the message takes the form of a live personal appeal, a newspaper editorial, a Sunday
sermon, a TV commercial, or a pop-up window on a website, these basic requirements remain the
same.
A few years later, William McGuire (1969) reiterated the information-processing steps
necessary for persuasion and, like the Yale group before him, distinguished between the learning,
or reception, of a message (a necessary first step) and its later acceptance. In fact, McGuire
(1968) used this distinction to explain the surprising finding that a recipient's self-esteem and
intelligence are unrelated to persuasion. In McGuire's analysis, these characteristics have
opposite effects on reception and acceptance. People who are smart or high in self-esteem are
better able to learn a message, but are less likely to accept its call for a change in attitude. People
who are less smart or low in self-esteem are more willing to accept the message, but they may
have trouble learning its contents. Overall, then, neither group is generally more vulnerable to
persuasion than the other—a prediction that is supported by a good deal of research (Rhodes &
Wood, 1992).
Anthony Greenwald (1968) and others then argued that persuasion requires a third,
intermediate step: elaboration. To illustrate, imagine you are offered a job and your prospective
employer tries to convince you over lunch to accept. You listen closely, learn the terms of the
offer, and understand what it means. But if it's a really important decision, your head will spin
with questions as you weigh all the pros and cons and contemplate the implications: What would
it cost to move? Is there potential for advancement? Am I better off staying where I am? When
confronted with personally significant messages, we don't listen merely to collect information;
we think about that information. When this happens, the message is effective to the extent that it
leads us to focus on favorable rather than unfavorable thoughts.
elaboration The process of thinking about and scrutinizing the arguments contained in a
persuasive communication.
These theories of attitude change all share the assumption that the recipients of persuasive
appeals are attentive, active, critical, and thoughtful of every word spoken. This assumption is
correct—some of the time. When it is and when people consider a message carefully, their
reaction to it depends on the strength of its contents. In these instances, messages have greater
impact when they are easily learned rather than difficult, when they are memorable rather than
forgettable, and when they stimulate a good deal of favorable rather than unfavorable
elaboration. Ultimately, strong arguments are persuasive and weak arguments are not.
On the central route to persuasion, the process is eminently rational. It's important to note,
however, that thinking hard and carefully about a persuasive message does not guarantee that the
process is objective or that it necessarily promotes truth seeking. At times, each of us prefers to
hold a particular attitude and becomes biased in the way we process information (Petty &
Wegener, 1998). Among college students who were politically conservative or liberal, the
tendency to agree with a social welfare plan was influenced more—rapidly, strongly, and
persistently—by whether it was said to have the support of Democrats or Republicans than by
the logical merits of the policy itself (Cohen, 2003; Smith et al., 2012). Similarly, college
students were less likely to be persuaded by a proposed tuition hike to fund campus
improvements when the increase would take effect in one year, thus raising the personal stakes,
than by a proposal to raise tuition in eight years (Darke & Chaiken, 2005).
There is one additional complicating factor to consider. Several years ago, Petty and his
colleagues (2002) proposed the self-validation hypothesis: people not only "elaborate" on a
persuasive communication with positive or negative attitude-relevant thoughts; they also
assess the validity of those thoughts. Thoughts we hold with high confidence have
a strong impact on our attitudes, as predicted by the dual-process model of persuasion; thoughts
we hold with low confidence do not. This means that
various aspects of a persuasive communication—such as whether the source is a knowledgeable
expert, and whether the position he or she advocates is one with which we agree or disagree—
can affect the confidence we have in our own thoughts and, in turn, our attitudes (see Briñol &
Petty, 2009).
The Peripheral Route “The receptive ability of the masses is very limited, their understanding
small; on the other hand, they have a great power of forgetting.” The author of this cynical
assessment of human nature was Adolf Hitler (1933, p. 77). Believing that human beings are
incompetent processors of information, Hitler relied in his propaganda on the use of slogans,
uniforms, swastika-covered flags, a special salute, and other symbols. For Hitler, “meetings were
not just occasions to make speeches; they were carefully planned theatrical productions in which
settings, lighting, background music, and the timing of entrances were devised to maximize the
emotional fervor of an audience” (Qualter, 1962, p. 112).
Do these ploys work? Can the masses be so handily manipulated? History
shows that they can. Audiences are not always thoughtful. Sometimes people do not follow the
central route to persuasion but instead take a shortcut through the peripheral route. Rather than
try to learn about a message and think through the issues, they respond with little effort on the
basis of superficial peripheral cues.
On the peripheral route to persuasion, people will often evaluate a communication by using
simple-minded heuristics, or rules of thumb (Chaiken, 1987; Chen & Chaiken, 1999). If a
communicator has a good reputation, speaks fluently, or writes well, we tend to assume that his
or her message must be correct. And when a speaker has a reputation for being honest, people
think less critically about the contents of his or her communication (Priester & Petty, 1995).
Likewise, we assume that a message must be correct if it contains a long litany of arguments or
statistics or an impressive list of supporting experts, if it's familiar, if it elicits cheers from an
audience, or if the speaker seems to argue against his or her own interests. In some cases, people
will change their attitudes simply because they know that an argument has majority support
(Giner-Sorolla & Chaiken, 1997).
On the mindless peripheral route, people are also influenced by a host of factors that are not
relevant to attitudes—such as cues from their own body movements. Increasingly, social
psychologists are coming to appreciate the extent to which human thought is embodied—that the
way we think and feel about things is influenced by the physical position, orientation, and
movements of our bodies (Lakoff & Johnson, 1999; Niedenthal et al., 2005).
Several studies illustrate attitude embodiment effects. In one study, participants were coaxed
into nodding their heads up and down (as if saying yes) or shaking their heads from side to side
(as if saying no) while listening via headphones to an editorial, presumably to test whether the
headphones could endure the physical activity. Those coaxed into nodding later agreed more
with the arguments than those coaxed into shaking their heads from side to side (Wells & Petty,
1980). In other studies, participants viewed graphic symbols or word-like stimuli (swrtel, primet)
while using an exercise bar to either stretch their arms out (which mimics what we do to push
something away) or flex their arms in (which we do to bring something closer). Participants later
judged these stimuli to be more pleasant when they were associated with the flexing of the arm
than when they were associated with the stretching-out motion (Cacioppo et al., 1993; Priester
et al., 1996). Even our attitudes toward consumer products can be influenced by bodily
sensations. In one study, for example, participants evaluated the appearance of vases, flowers,
and other products placed at a distance as more appealing when they stood on a soft, comfortable
carpet than on a hard tile floor (Meyers-Levy et al., 2010).
Route Selection Thanks to Petty and Cacioppo's (1986) two-track distinction between the
central and peripheral routes, it is easy to understand why the persuasion process seems so
logical on some occasions yet so illogical on others—why voters may select candidates
according to issues or images, why juries may base their verdicts on evidence or a defendant's
appearance, and why consumers may base their purchases on marketing reports or on product
images. The process that is engaged depends on whether the recipients of a persuasive message
have the ability and the motivation to take the central route or whether they rely on peripheral
cues instead.
To understand the conditions that lead people to process information on one route or the other,
it's helpful to view persuasive communication as the outcome of three factors: a source (who),
a message (says what and in what context), and an audience (to whom). Each of these factors
steers a recipient's approach to the communication. If a source speaks clearly, if the message is
important, if there is a bright, captive, and involved audience that cares deeply about the issue
and has time to absorb the information, then audience members will be willing and able to take
the effortful central route. But if the source speaks at a rate too fast to comprehend, if the
message is trivial or too complex to process, or if audience members are distracted, pressed for
time, or uninterested, then the less strenuous peripheral route is taken.
• FIGURE 6.5 Two Routes to Persuasion
Based on aspects of the source, message, and audience, recipients of a communication take either
a central or peripheral route to persuasion. On the central route, people are influenced by strong
arguments and evidence. On the peripheral route, persuasion is based more on heuristics and
other superficial cues. This two-process model helps explain how persuasion can seem logical on
some occasions and illogical on others.
In reacting to persuasive communications, people are influenced more by superficial
images than by logical arguments.
FALSE
• Figure 6.5 presents a road map of persuasive communication. In the next three sections, we
will follow this map from the input factors (source, message, and audience) through the central
or peripheral processing routes to reach the final destination: persuasion.
▪ The Source
Golfer Tiger Woods is a living legend, one of the most gifted athletes of our time. Until recently,
he was also paid millions of dollars per year, more than just about anyone else, to endorse Nike,
American Express, and other products. Woods was considered a highly effective spokesman
until 2009 when various extramarital affairs were exposed, causing the breakup of his marriage
and the demise of his championship caliber golf game. He has struggled since that time and is no
longer the top-ranked golfer in the world or the most highly sought spokesperson. What does the
story of Tiger Woods tell us about source effects in persuasion? More specifically, what makes
some communicators in general more effective than others? As we'll see, there are two key
attributes: credibility and likability.
Credibility Imagine you are waiting in line in a supermarket and you catch a glimpse of this
large boldfaced headline: “Doctors Discover Cure for AIDS!” As your eye wanders across the
front page, you discover that you are reading a supermarket tabloid that features aliens from
another planet. What would you think? In contrast, imagine that you are reading through
scientific periodicals in a library when you come across a similar article, but this time it appears
in the New England Journal of Medicine. Now what would you think?
Chances are, you'd react with more excitement to the medical journal than to the tabloid, even
though both sources report the same news item. In a study conducted during the Cold War era of
the 1950s, American participants read a speech that advocated for the development of nuclear
submarines. The speech elicited more agreement when it was attributed to an eminent American
physicist than when the source was said to be the Soviet government-controlled newspaper
(Hovland & Weiss, 1951). Likewise, when participants read a lecture favoring more lenient
treatment of juvenile offenders, they changed their attitudes more when they thought the speaker
was a judge than when they believed the speaker was a convicted drug dealer (Kelman &
Hovland, 1953). Recent research confirms that high-credibility sources are generally more
persuasive than low-credibility sources (Pornpitakpan, 2004).
Why are some sources more credible than others? Why were the medical journal, the physicist,
and the judge more credible than the tabloid, government-controlled newspaper, and drug dealer?
For communicators to be seen as credible, they must have two characteristics: competence and
trustworthiness. Competence refers to a speaker's ability. People who are knowledgeable, smart,
or well spoken or who have impressive credentials are persuasive by virtue of their expertise
(Hass, 1981). Experts can have a disarming effect on us. We assume they know what they're
talking about. So when they speak, we listen. And when they take a position, often we yield.
Indeed, research shows that people pay attention more closely to experts than to nonexperts and
scrutinize their arguments more carefully (Tobin & Raymundo, 2009).
The impact of experts on our attention, confidence, and attitudes is not simple or uniform. The
effect depends on how we feel about the attitude they advocate—say, on politically charged
issues such as gun control, immigration policy, or climate change. As suggested by the self-
validation hypothesis described earlier, a highly credible source who argues for a position we
tend to favor (yay!) bolsters our confidence and existing attitude—more than someone less
credible would. In this case, there is less need to scrutinize the expert than the supportive but
questionable nonexpert. But a highly credible source who advocates for a position we oppose
(yikes!) poses a real threat to our confidence and existing attitude—more than someone less
credible. In this instance, we need to scrutinize the expert more than the nonexpert. Research is
consistent with this nuanced hypothesis: People scrutinize nonexperts more than experts when
those sources advocate a position we agree with, but they scrutinize experts more than
nonexperts when those sources advocate a position we oppose (Clark et al., 2012; Clark &
Evans, 2014; Clark & Wegner, 2013).
▴ TABLE 6.1 Who Do You Trust?
Please tell me how you would rate the honesty and ethical standards of people in these different
fields—very high, high, average, low, or very low?
Occupation                    % Very high/High
Nurses                        80
Medical doctors               65
Pharmacists                   65
Police officers               48
Clergy                        46
Bankers                       23
Lawyers                       21
Business executives           17
Advertising practitioners     10
Car salespeople               8
Members of Congress           7
In December of 2014, a Gallup poll was conducted to determine the level of honesty attributed to
people from various occupational groups. Indicated alongside are the percentages of respondents
who rated each group as “high” or “very high” in honesty.
Expertise is only one aspect of credibility. To be credible, sources must also
be trustworthy—that is, they must be seen as willing to report their knowledge truthfully and
without compromise. What determines whether we trust a communicator? To some extent, we
make these judgments on the basis of stereotypes. In 2014, in an update to a survey conducted
many times over the years, the Gallup Organization asked 1,000 Americans to rate how honest
people are from various occupational categories. As shown in Table 6.1, nurses topped the list as
the most trusted occupational group. Car salespeople, members of Congress, and advertisers were
the least trusted.
At an October 2014 press conference at Tsinghua University in Beijing, Facebook co-founder
Mark Zuckerberg stunned the crowd by speaking and answering questions in Mandarin. In his
effort to bring Facebook to China, “Zuck” may have enhanced his persuasive appeal by
appearing more similar and, hence, more likeable.
In judging the credibility of a source, common sense arms us with a simple rule of caution:
Beware of people who have something to gain from successful persuasion. If a speaker has been
paid off, has an ax to grind, or is simply telling us what we want to hear, we suspect some degree
of bias. This rule sheds light on a classic dilemma in advertising concerning the value of
celebrity spokespersons: The more products a celebrity endorses, the less trustworthy he or she
appears to consumers (Tripp et al., 1994). In the courtroom, the same rule of caution can be
used to evaluate witnesses. In one study, research participants served as jurors in a mock trial
involving a man who claimed that his exposure to an industrial chemical at work had caused him
to contract cancer. Testifying in support of this claim was a biochemist paid either $4,800 or $75
for his expert testimony. You might think that jurors would be more impressed by the scientist
who commanded the higher fee. Yet the highly paid expert was perceived to be a “hired gun”
and, as a result, was less believable and less persuasive (Cooper & Neuhaus, 2000).
The self-interest rule has other interesting implications. One is that people are impressed by
others who take unpopular stands or argue against their own interests. When research
participants read a political speech accusing a large corporation of polluting a local river, those
who thought the speechmaker was a pro-environment candidate addressing a staunch
environmentalist group perceived him to be biased, whereas those who thought he was a
pro-business candidate talking to company supporters assumed he was sincere (Eagly et al., 1978).
Trust is also established by speakers who do not purposely try to change our views. Thus,
people are influenced more when they think that they are accidentally overhearing a
communication than when they receive a sales pitch clearly intended for their ears (Walster &
Festinger, 1962). That's why advertisers sometimes use the “overheard communicator” trick, in
which the source appears to tell a buddy about a new product that works. As if eavesdropping on
a personal conversation, viewers assume that what one friend says to another can be trusted. The
self-interest rule also has great relevance in law, which is why people are far more likely to
believe a crime suspect's admissions of guilt than his or her denials (Kassin, 2012).
Likability More than anything else, the celebrity star power of Tiger Woods was based on his
athletic prowess, his popularity, and his winning smile. Before the revelations that ended Woods'
marriage and derailed his game, he was seen as a likable person. But does that quality necessarily
enhance someone's impact as a communicator? Yes. In his classic bestseller, How to Win
Friends and Influence People, Dale Carnegie (1936) said that being liked and being persuasive
go hand in hand. The question is, what makes a communicator likable? As we'll see in Chapter
9, two factors that spark attraction are similarity and physical attractiveness.
A study by Diane Mackie and others (1990) illustrates the persuasive power of similarity.
Students enrolled at the University of California, Santa Barbara, read a strong or a weak speech
that argued against continued use of the SATs in college admissions. Half the participants were
led to believe that the speech was written by a fellow UCSB student; the other half thought the
author was a student from the University of New Hampshire. Very few participants were
persuaded by the weak arguments. In contrast, many of those who read the strong message did
change their attitudes, but only when they believed it was given by a fellow UCSB student.
Just as source similarity can spark persuasion, dissimilarity can have the opposite inhibiting
effect. In a study of people's taste in music, Clayton Hilmert and others (2006) introduced
participants to a confederate who seemed to like the same or different kinds of music, such as
rock, pop, country, or classical. Others did not meet a confederate. When later asked to rate a
particular song, participants were positively influenced by the similar confederate's opinion and
negatively influenced by the dissimilar confederate's opinion. In fact, although the effect is more
potent when the points of similarity seem relevant to the attitude in question (Berscheid, 1966),
the participants in this study were also more or less persuaded by a confederate whose
similarities or differences were wholly unrelated to music—for example, when the confederate
had similar or different interests in shopping, world politics, museums, foods, or social media
websites.
When he was just out of high school, basketball star LeBron James (left) signed a multimillion-
dollar contract with Nike. Also paid millions, music star Taylor Swift is surrounded by kittens in
this TV commercial for Diet Coke (top right) and Oscar-winning actor Bradley Cooper is the
new, and first-ever, face of Haagen-Dazs ice cream (bottom right). Can celebrities sell products?
Targeting the peripheral route to persuasion, the advertising industry seems to think so.
The effect of source similarity on persuasion has obvious implications for those who wish to
exert influence. We're all similar to one another in some respects. We might agree in politics,
share a common friend, have similar tastes in food, or enjoy spending summers on the same
beach. If aware of the social benefits of similarity and the social costs of dissimilarity, the astute
communicator can use common bonds to enhance his or her impact on an audience.
Advertising practices presuppose that beauty is also persuasive. After all, online, magazine,
and TV ads routinely feature models who are young, tall, and slender (for women) or
muscular (for men) and who sport hard bodies, glowing complexions, and radiant
smiles. Sure, these models can turn heads, you may think, but can they change minds? In a study
that addressed this question, Shelly Chaiken (1979) had male and female college students
approach others on campus. They introduced themselves as members of an organization that
wanted the university to stop serving meat during breakfast and lunch. In each case, these student
assistants gave reasons for the position and then asked respondents to sign a petition. The result:
Attractive sources were able to get 41% of respondents to sign the petition, whereas those who
were less attractive succeeded only 32% of the time. Additional research has shown that
attractive male and female salespersons elicit more positive attitudes and purchasing intentions
from customers than less attractive salespersons, even when they are up front about their desire
to make a sale (Reinhard et al., 2006).
Advertisers are so convinced that beauty sells products that they pay millions of dollars for
supermodels to appear in their ads. Shown here, supermodel Kate Moss appears in an ad for
Italian fashion house Versace.
When What You Say Is More Important Than Who You Are To this point, it must seem as if
the source of a persuasive communication is more important than the communication itself. Is
this true? Certainly there are enough real-life examples—as when books used to skyrocket to the
top of the best-seller list because Oprah Winfrey recommended them. The advertising industry
has long debated the value of high-priced celebrity endorsements. David Ogilvy (1985), who was
called “the king of advertising,” used to say that celebrities are not effective because viewers
know they've been bought and paid for. Ogilvy was not alone in his skepticism. Still, many
advertisers scramble furiously to sign famous entertainers and athletes. From Tiger Woods to
Tina Fey, Rihanna, LeBron James, Peyton Manning, Taylor Swift, Scarlett Johansson, Bono,
Danica Patrick, Beyonce Knowles, Katy Perry, Jay Z, and Bradley Cooper, TV commercials
regularly feature a parade of stars. The bigger the star, they say, the more valuable the
testimonial.
Compared with the contents of a message, does the source really make the big difference that
advertisers pay for? Are we so impressed by the expert, so enamored of the physical talent, and
so drawn to the charming face that we embrace whatever they have to say? And are we so
scornful of ordinary or unattractive people that their presentations fall on deaf ears? In light of
what is known about the central and peripheral routes to persuasion, the answer to these
questions is “it depends.”
First, a recipient's level of involvement plays an important role. When a message has personal
relevance to your life, you pay attention to the source and think critically about the message, the
arguments, and the implications. When a message does not have personal relevance, however,
you may take the source at face value and spend little time scrutinizing the information. In a
classic study, Richard Petty and others (1981) had students listen to a speaker who proposed that
all seniors should be required to take comprehensive exams in order to graduate. Three aspects of
the communication situation were varied. First, participants were led to believe that the speaker
was either an education professor at Princeton University or a high school student. Second,
participants heard either well-reasoned arguments and hard evidence or a weak message based
only on anecdotes and personal opinion. Third, participants were told either that the proposed
exams might be used the following year (Uh oh, that means me!) or that they would not take
effect for another 10 years (Who cares, I'll be long gone by then!).
• FIGURE 6.6 Source Versus Message: The Role of Audience
Involvement
People who were high or low in their personal involvement heard a strong or weak message from
an expert or nonexpert. For high-involvement participants (left), persuasion was based on the
strength of arguments, not on source expertise. For low-involvement participants (right),
persuasion was based more on the source than on the arguments. Source characteristics have
more impact on those who don't care enough to take the central route.
As predicted, personal involvement determined the relative impact of the expertise of the
source and the quality of speech. Among participants who would not be affected by the proposed
change, attitudes were based largely on the speaker's credibility: The professor was persuasive;
the high school student was not. Among participants who thought that the proposed change
would affect them directly, attitudes were based on the quality of the speaker's proposal. Strong
arguments were persuasive; weak arguments were not. As depicted in • Figure 6.6, people
followed the source rather than the message under low levels of involvement, illustrating the
peripheral route to persuasion. But message factors did outweigh source characteristics under
high levels of involvement, when participants cared enough to take the central route to
persuasion. Likewise, research has shown that the tilt toward likable and attractive
communicators is reduced when recipients take the central route (Chaiken, 1980).
There is a second limit to source effects. It is often said that time heals all wounds. Well, time
may also heal the effects of a bad reputation. Hovland and Weiss (1951) varied communicator
credibility (for example, the physicist versus the Soviet-controlled newspaper) and found that the
change had a large and immediate effect on persuasion. But when they measured attitudes again
four weeks later, the effect had vanished. Over time, the attitude change produced by the
high-credibility source had decreased and the change caused by the low-credibility source had
increased. This finding of a delayed persuasive impact of a low-credibility communicator is
called the sleeper effect.
sleeper effect A delayed increase in the persuasive impact of a noncredible source.
To explain this unforeseen result, the Hovland research group proposed the discounting cue
hypothesis. According to this hypothesis, people immediately discount the arguments made by
noncredible communicators, but over time, they dissociate what was said from who said it. In
other words, we tend to remember the message but forget the source (Pratkanis et al., 1988). To
examine the role of memory in this process, Kelman and Hovland (1953) reminded a group of
participants of the source's identity before reassessing their attitudes. If the sleeper effect was
caused by forgetting, they reasoned, then it could be eliminated through reinstatement of the link
between the source and the message. As shown in • Figure 6.7, they were right. When attitudes
were measured after three weeks, participants who were not reminded of the source showed the
usual sleeper effect. Yet those who did receive a source reminder did not. For these latter
participants, the effects of high and low credibility endured. Recent studies by cognitive
psychologists have confirmed that over time, people “forget” the connection between
information and its source (Underwood & Pezdek, 1998).
• FIGURE 6.7 The Sleeper Effect
In Experiment 1, participants changed their immediate attitudes more in response to a message
from a high-credibility source than from a low-credibility source. When attitudes were measured
again after three weeks, the high-credibility source had lost impact and the low-credibility source
had gained impact—the sleeper effect. In Experiment 2, the sleeper effect disappeared when
participants were reminded of the source.
The sleeper effect generated a good deal of controversy. There was never a doubt that credible
communicators lose some impact over time. But researchers had a harder time finding evidence
for delayed persuasion by noncredible sources. Exasperated at one point by their own failures to
obtain this result, Paulette Gillig and Anthony Greenwald (1974) wondered, “Is it time to lay the
sleeper effect to rest?” As it turned out, the answer was no. More recent research showed that the
sleeper effect is reliable provided that participants do not learn who the source is until after they
have received the original message (Greenwald et al., 1986; Kumkale & Albarracín, 2004).
“The truth is always the strongest argument.”
—Sophocles
To appreciate the importance of timing, imagine that you're searching for music online when
you come across what appears to be a review of a new CD. Before you begin reading, however,
you notice in the fine print that this so-called review is really an advertisement. Aware that you
can't always trust what you read, you skim the ad and reject it. Now imagine the same situation,
except that you read the entire ad before realizing what it is. Again you reject it. But notice the
difference. This time, you have read the message with an open mind. You may still reject it, but
after a few weeks, the information will have sunk in to influence your evaluation of the music.
This scenario illustrates the sleeper effect.
▪ The Message
On the peripheral route to persuasion, audiences are influenced heavily, maybe too heavily, by
various source characteristics. But when people care about an issue, the strength of a message
determines its impact. On the central route to persuasion, what matters most is whether a
scientist's theory is supported by the data or whether a company has a sound product. Keep in
mind, however, that the target of a persuasive appeal comes to know a message only through the
medium of communication—what a person has to say and how that person says it.
Informational Strategies Communicators often struggle with how to structure and present an
argument to maximize its impact. Should a message be long and crammed with facts or short and
to the point? Is it better to present a highly partisan, one-sided message or to take a more
balanced, two-sided approach? How should arguments be ordered—from strongest to weakest or
the other way around? These are the kinds of questions often studied by persuasion researchers—
including those interested in marketing, advertising, and consumer behavior.
Often the most effective strategy to use will depend on whether members of the audience
process the message on the central or the peripheral route. Consider the length of a
communication. When people process a message lazily, with their eyes and ears half-closed, they
often fall back on a simple heuristic: The longer a message, the more valid it must be. In this
case, a large number of words gives the superficial appearance of factual support regardless of
the quality of the arguments (Petty & Cacioppo, 1984; Wood et al., 1985). Thus, as David
Ogilvy (1985) concluded from his years of advertising experience, “The more facts you tell, the
more you sell” (p. 88).
When people process a communication carefully, however, length is a two-edged sword. If a
message is long because it contains lots of supporting information, then longer does mean better.
The more supportive arguments you can offer or the more sources you can find to speak on your
behalf, the more persuasive your appeal will be (Harkins & Petty, 1981). But if the added
arguments are weak or if the new sources are redundant, then an alert audience will not be fooled
by length alone. When adding to the length of a message dilutes its quality, an appeal might
well lose impact (Friedrich et al., 1996; Harkins & Petty, 1987).
When two opposing sides try to persuade the same audience, order of presentation becomes a
relevant factor as well. During the summer of 2012, before the November presidential election,
the Republicans held their national convention a few days before the incumbent Democrats held
theirs. These events were watched on television by millions of voters. Do you think the order in
which they were scheduled gave one party an advantage? If you believe that information that is
presented first has more impact, you'd predict a primacy effect (advantage to the Republicans). If
you believe that the information presented last has the edge, you'd predict a recency
effect (advantage to the Democrats).
There are good reasons for both predictions. On the one hand, first impressions are important.
On the other hand, memory fades over time, and often people recall only the last argument they
hear before making a decision. In light of these contrasting predictions, Norman Miller and
Donald Campbell (1959) searched for the “missing link” that would determine the relative
effects of primacy and recency. They discovered that the missing link is time. In a study of jury
simulations, they had people (1) read a summary of the plaintiff's case; (2) read a summary of the
defendant's case; and (3) make a decision. The researchers varied how much time separated the
two messages and then how much time elapsed between the second message and the decisions.
When participants read the second message right after the first and then waited a whole week
before reporting their opinion, a primacy effect prevailed and the side that came first was
favored. Both messages faded equally from memory, so only the greater impact of first
impressions was left. When participants made a decision immediately after the second message
but a full week after the first, however, there was a recency effect. The second argument was
fresher in memory, thus favoring the side that went last. Using these results as a guideline, let's
revisit our original question: What is the impact on Election Day of how national conventions are
scheduled? Think for a moment about the placement and timing of these events. The answer
appears in Table 6.2.
Research on the sleeper effect shows that people often remember the message but forget the
source.
Message Discrepancy Persuasion is a process of changing attitudes. This objective is not easy to
achieve. As a general rule, people are motivated to defend their opinions and attitudes, which
they do, in part, through selective exposure to information that supports their views (Hart et al.,
2009). Given an opportunity to advocate for attitude change, communicators confront what is
perhaps the most critical strategic question: How extreme a position should they take?
How discrepant should a message be from the audience's existing position in order to have the
greatest impact? Common sense suggests two opposite answers. One approach is to take an
extreme position in the hope that the more change you advocate, the more you will get. Another
approach is to exercise caution and not push for too much change so that the audience will not
reject the message outright. Which approach seems more effective? Imagine trying to convert
your politically conservative friends into liberals or the other way around. Would you stake out a
radical position in order to move them toward the center or would you preach moderation so as
not to be cast aside?
Research shows that communicators should adopt the second, more cautious approach. To be
sure, some discrepancy is needed to produce a change in attitude. But the relationship to
persuasion can be pictured as an upside-down U with the most change being produced at
moderate amounts of discrepancy (Bochner & Insko, 1966). A study by Kari Edwards and
Edward Smith (1996) helps explain why taking a more extreme position is counterproductive.
These investigators first measured people's attitudes on a number of hot social issues: for
example, whether lesbian and gay couples should adopt children, whether employers should give
example, whether lesbian and gay couples should adopt children, whether employers should give
preference in hiring to minorities, and whether the death penalty should be abolished. Several
weeks later, they asked the same people to read, think about, and rate arguments that were either
consistent or inconsistent with their own prior attitudes. The result: When given arguments to
read that preached attitudes that were discrepant from their own, the participants spent more time
scrutinizing the material and judged the arguments to be weak. Clearly, people tend to refute and
reject persuasive messages they don't agree with. In fact, the more personally important an issue
is to us, the more stubborn and resistant to change we become (Zuwerink & Devine, 1996).
▴ TABLE 6.2 Effects of Presentation Order and Timing on
Persuasion
A study by Miller and Campbell (1959) demonstrated the effect of presentation order and the
timing of opposing arguments on persuasion. As applied to our example, the Democratic and
Republican conventions resemble the fourth row of this table. From these results, it seems that
the scheduling of such events is fair, promoting neither primacy nor recency.
Conditions                                                  Results
1. Message 1 → Message 2 → One week → Decision              Primacy
2. Message 1 → One week → Message 2 → Decision              Recency
3. Message 1 → Message 2 → Decision                         None
4. Message 1 → One week → Message 2 → One week → Decision   None
Fear Appeals Many trial lawyers say that to win cases they have to appeal to jurors through the
heart rather than through the mind. The evidence is important, they admit, but what matters most
is whether the jury reacts to their client with anger, disgust, sympathy, or sadness. Of course,
very few messages are entirely based on rational argument or on emotion.
Fear is a particularly primitive and powerful emotion, serving as an early warning system that
signals danger. Neuroscience research shows that fear is aroused instantly in response to pain,
stimulation from noxious substances, or threat, enabling us to respond quickly without having to
stop to think about it (LeDoux, 1996). The use of fear-based appeals to change attitudes is
common. Certain religious cults have used scare tactics to indoctrinate new members. So do
public health organizations that graphically portray the damage done to those who smoke
cigarettes, eat too much junk food, drink too much alcohol, text while driving, or have
unprotected sex.
Negative campaigning in American politics is prevalent, though the effects on voters are not
clear (Lau & Rovner, 2009). From the large number of attack ads that flood the political scene,
it would certainly seem that candidates, their consultants, and Super PACS strongly believe in
the power of attacking their opponents by arousing fear about the consequences of voting for
them. Presidential campaign ads are more negative than ever. Yet the most hard-hitting and
controversial ad of all time was a TV commercial that aired just once, on September 7, 1964. In an ad to
reelect Democratic president Lyndon Johnson, who was running against Republican Barry
Goldwater, a young girl pictured in a field counted to 10 as she picked the petals off a daisy. As
she reached 9, an adult voice broke in to count down from 10 to 0, followed by a blinding
nuclear explosion and this message: “Vote for President Johnson on November 3. The stakes are
too high for you to stay home.”
The effects of fear arousal in politics are evident. Guided by Terror Management Theory
(Greenberg et al., 1997; Pyszczynski et al., 2004; see Chapter 3) and the prediction that a
deeply rooted fear of death motivates people to rally around their leaders as a way to ward off
anxiety, Mark Landau and others (2004) found that college students expressed more support for
then-President George W. Bush and his policies when they were reminded of their own mortality
or subliminally exposed to images of 9/11 than when they were not. This result is not limited to
the laboratory. Analyzing patterns of government-issued terror warnings and Gallup polls, Robb
Willer (2004) found that increased terror alerts were predictably followed by increases in
presidential approval ratings.
Is fear similarly effective for commercial purposes? What about using fear to promote health
and safety? If you're interested in public service advertising, visit the website of the Ad Council,
the organization that created Smokey Bear (“Only you can prevent forest fires”) and the crash
test dummies (“Don't be a dummy—buckle up”). In recent years, the Ad Council has run
campaigns on a range of issues such as the dangers of using steroids, flu prevention, AIDS
prevention, cyber bullying, and the online sexploitation of youth. To get people to change
behavior in these domains, is it better to arouse a little nervousness or a full-blown anxiety
attack? To answer this question, social psychologists over the years have compared
communications that vary in the levels of fear they arouse. In the first such study, Irving Janis
and Seymour Feshbach (1953) found that high levels of fear did not generate increased
agreement with a persuasive communication. Since then, however, research has shown that
appeals that arouse high levels of fear can be highly effective (de Hoog et al., 2007).
Public health organizations often use fear, or scare tactics, to change health-related attitudes and
behavior. In 2014, a public service ad aired on the dangers of texting while driving. In what
starts out as a fun car ride with friends, the driver takes her eyes off the road to text. All of a
sudden her car is T-boned by a truck, rolls over and over in slow motion, and lands with a silent
thud. In this spot aimed at teenagers, the theme is: “U drive. U text. U pay.”
Fear arousal increases the incentive to change for those who do not actively resist it, but its
ultimate impact depends on the strength of the arguments and on whether the message also
contains clear and reassuring advice on how to cope with the threatened danger (Keller,
1999; Leventhal, 1970; Rogers, 1983). This last point is important. Without specific instructions
on how to cope, people feel helpless; they panic and tune out the message. In one study, for
example, participants with a chronic fear of cancer were less likely than others to detect the
logical errors in a message that called for regular cancer checkups (Jepson & Chaiken, 1990).
When clear instructions are included, however, high dosages of fear can be effective. In the past,
research had shown that antismoking films elicit more negative attitudes toward cigarettes when
they show gory lung-cancer operations than when they show charts filled with dry statistics
(Leventhal et al., 1967) and that films about driving safety are more effective when they show
bloody accident victims than when they show plastic crash test dummies (Rogers & Mewborn,
1976). In a meta-analysis of 105 studies, however, Natascha de Hoog and others (2007) found
that communications that arouse fear do not have to be gruesome to be effective. The more
personally vulnerable people feel about a threatened outcome, the more attentive they are to the
message and the more likely they are to follow its recommendations.
Positive Emotions
It's interesting that just as fear helps induce a change in attitude, so does
positive emotion. In one study, people were more likely to agree with a series of controversial
arguments when they snacked on peanuts and soda than when they did not eat (Janis et al.,
1965). In another study, participants liked a television commercial more when it was embedded
within a program that was upbeat rather than sad (Mathur & Chattopadhyay, 1991). Research
shows that people are “soft touches” when they're in a good mood. Depending on the situation,
food, drinks, a soft reclining chair, warm and tender memories, a success experience,
breathtaking scenery, laughter, and good music can lull us into a positive emotional state,
leaving us ripe for persuasion (Schwarz et al., 1991).