
The Psychology of State-Sponsored Disinformation Campaigns and Implications for Public Diplomacy

In: The Hague Journal of Diplomacy
Authors:
Erik C. Nisbet, School of Communication and Political Science (by courtesy), and Eurasian Security and Governance Program, Mershon Center for International Security Studies, The Ohio State University, Columbus, OH 43201, United States

and
Olga Kamenchuk, Eurasian Security and Governance Program, Mershon Center for International Security Studies, The Ohio State University, Columbus, OH 43201, United States


Summary

Policy discourse about disinformation focuses heavily on the technological dimensions of state-sponsored disinformation campaigns. Unfortunately, this myopic focus on technology has led to insufficient attention being paid to the underlying human factors driving the success of state-sponsored disinformation campaigns. Academic research on disinformation strongly suggests that belief in false or misleading information is driven more by individual emotional and cognitive responses — amplified by macro social, political and cultural trends — than by specific information technologies. Thus, attention given to countering the distribution and promulgation of disinformation through specific technological platforms, at the expense of understanding the human factors at play, hampers the ability of public diplomacy efforts to counter it. This article addresses this lacuna by reviewing the underlying psychology of three common types of state-sponsored disinformation campaigns and identifying lessons for designing effective public diplomacy counter-strategies in the future.

Introduction

Since the Russian Federation’s takeover of Crimea, and now the investigation of Russian disinformation operations targeting the 2016 US elections, understanding the spread and influence of state-sponsored disinformation — the purposeful spread of false or misleading information — has become a highly salient issue for public diplomacy scholars and practitioners alike. Although there has been a great deal of policy attention on the technical, network and organizational dynamics of disinformation campaigns (such as bots and algorithms), the social–psychological mechanisms by which disinformation strategies influence public opinion and perceptions have not received the same level of scrutiny. This technologically deterministic approach creates misperceptions about the role of information technology in promulgating belief in disinformation that may ‘distract’ from designing effective counter-strategies.1 In order to design effective strategies and develop new technologies for countering disinformation, we thus first need to understand how disinformation campaigns take advantage of our human vulnerabilities to false or misleading information.2

This article attempts to fill this lacuna by reviewing the psychological mechanisms underlying common forms of disinformation campaigns and the implications for designing public diplomacy efforts to counter their influence. The three forms of disinformation campaigns discussed — identity grievance, information gaslighting and incidental exposure — are general archetypes of disinformation operations that underlie a range of state-sponsored disinformation efforts targeting foreign audiences. By unpacking the psychological dimensions of each, we gain a better understanding of how to design corrective messaging, technology and other public diplomacy efforts to counter their influence. The article concludes with a discussion on how public diplomacy may best take advantage of technology to counter state-sponsored disinformation campaigns and promote resilience among target populations.

Identity-Grievance Disinformation Campaigns

‘Identity-grievance’ campaigns, which focus on activating polarized social identities (political, ethnic, national, racial and religious, etc.), exploiting real or perceived political, economic, religious or cultural wrongs, and/or leveraging low institutional trust, are one of the most common forms of state-sponsored disinformation operations. Primarily negative in tone and valence, examples range from Russian Facebook election advertisements in the United States that inflamed racial resentment, to China’s disinformation campaign undermining political trust and amplifying partisan discord in Taiwan, to Bahrain’s divisive disinformation promoting conflict between Shi’a and Sunni Muslims.3 These campaigns’ success has been amplified by wide-ranging, concurrent political, social and cultural trends (including populism, economic inequality, migration, low social capital and extreme political polarization) affecting most of the globe.4

Identity-grievance disinformation campaigns capitalize on two interrelated psychological mechanisms: motivated reasoning; and affective polarization. Motivated reasoning is the desire to avoid dissonant information or beliefs and the affective distress that accompanies them — a desire that leads individuals to respond to information and arguments in biased ways through processes of selective exposure, attention, comprehension and recall.5 For instance, audiences seek out ideologically consistent information while ignoring, misinterpreting, counter-arguing and/or derogating ideologically incongruent information.6 In this context, emotional desires precede and bias cognitive processing in ways that result in wishful thinking, on the one hand, and stereotyping on the other.7 In sum, our political evaluations, judgements and thinking are mostly a subsequent ‘rationalization’ of how we feel.8

Beliefs are a foundation of identity.9 Claims threatening closely held social identities (political, ethnic, racial and/or religious, etc.) elicit negative emotional responses and colour assessments of new information.10 Likewise, pre-existing negative feelings (such as dislike and distrust) towards targets of disinformation will increase an audience’s readiness to believe disinformation, while positive feelings increase resistance to it.11

Moreover, to protect a positive image of their self- and social identity, people are often inclined to resist corrective messages that contradict motivated beliefs, by ignoring them or misinterpreting their meaning, arguing against their content and/or derogating their source.12 New information may also make people angry, and anger has been found to shut down further processing, short-circuit the search for new information and/or strengthen belief in the disinformation.13

While motivated reasoning may drive belief in disinformation and resistance to correction, our differentiated feelings about groups of people who are different from ourselves are also a key component of identity-grievance disinformation campaigns. Affective polarization refers to the idea that political messaging has the power to amplify the salience of closely held social identities, while increasing prejudice towards ‘out-groups’.14 Group members become ‘polarized’ in how negatively they feel about the other group — with greater negative affect (anger, dislike or disgust) towards the ‘out-group’ increasing the likelihood of adopting and promoting disinformation about it, especially among those motivated to cement ‘in-group’ membership.15

In sum, identity-grievance campaigns take advantage of our emotional responses to information in order to persuade, reinforce polarization, promote sharing of disinformation and resist its correction. The largest study to date of online disinformation analysed the spread of 126,000 verified true and false news stories on Twitter between 2006 and 2017. False information ‘diffused significantly farther, faster, deeper and more broadly’ than accurate information — propelled by emotional reactions such as fear and disgust.16 It is thus no surprise that a highly emotional disinformation narrative describing a three-year-old boy crucified by the Ukrainian army is endorsed by a sizeable portion of the targeted population.17

Strategies for Countering Identity-Grievance Campaigns

If emotional biases make us vulnerable to disinformation campaigns that play off our identities and grievances, what types of anti-disinformation strategies should public diplomacy practitioners focus on developing in the forthcoming years? Identity-affirmation is one message strategy employed to counter disinformation that is tied to closely held social identities.18 This strategy positively affirms the identity or values of the audience upon which disinformation preys. Such messaging reduces biased processing and promotes positive emotional responses, while at the same time presenting corrective information. This approach, for example, is widely used in efforts to counter disinformation campaigns about climate change and has been suggested as a means to counter Russia’s disinformation war on Ukraine.19 One possible message strategy, for instance, would be to affirm Russian nationalist identity, while at the same time providing information or claims about the economic costs of Russia’s aggressive policies and intervention in the region.

Alternatively, one could also use a shared identities/values approach that reduces affective polarization between groups.20 By priming a common ‘in-group’ identity — for example, national identity over partisan identity — this type of messaging reduces negative affective polarization towards the ‘out-group’ and thus lessens vulnerability to disinformation. The caveat is that social identities are commonly defined in opposition to an ‘imagined other’.21 Thus, by priming a common ‘in-group’ identity to reduce polarization between two groups — for example, national identity trumping partisan polarization — one may inadvertently create affective polarization towards a third group.22 In addition, identities themselves may be contested, with heterogeneous meaning among group members, further complicating the communication of shared values that reduce negative feelings.

The other component of identity-grievance disinformation campaigns — that is, grievance — may be addressed by solutions-efficacy messaging.23 Research has shown that messages focusing on effective solutions to grievances and promoting the self-efficacy of audiences to enact corrective changes induce positive emotions such as hope, while reducing fear and anger. These positive emotions increase the likelihood of corrective information being accepted and likewise reduce audiences’ vulnerabilities to disinformation.

Efficacy messages have been a successful part of Current Time, the new digital platform of the United States Agency for Global Media that targets Russian-speaking audiences. One of its programmes, ‘Unknown Russia’, has been widely shared on YouTube and highlights local problems and the actions that ordinary citizens take to address them. This form of solutions-efficacy messaging, optimized for social media platforms and digital sharing, may be a means to create resilience to disinformation among target audiences — as well as to build a platform for disseminating corrective messaging.

Information Gaslighting

A second major emerging disinformation strategy is ‘information gaslighting’, or the rapid proliferation of false or misleading information online through social media platforms, blogs, fake news, online comments, advertising and/or online news outlets. The primary goal of this information flooding is not overt persuasion but ‘strategic distraction’, sowing uncertainty among its targets.24 Examples include Russia’s disinformation about the so-called ‘little green men’ who rapidly took control of Crimea in 2014, or China’s large-scale social media operation that fabricates millions of false or misleading posts in order to demobilize its citizens.25

This widespread pollution of the information environment with false or misleading information substantially increases audiences’ difficulty in discerning truth from fiction. A successful information gaslighting campaign: 1) makes this task highly costly, if not impossible, for audiences; and 2) creates a sense of loss of control over the information environment.26 Under these conditions, information gaslighting may lead to a feeling of ‘learned helplessness’ among audiences, which subsequently conditions their vulnerability and openness to disinformation.

The premise of learned helplessness is that individuals who experience prolonged exposure to difficult, uncontrollable situations, which they can neither avoid nor alleviate, will ‘learn’ to accept the situation as a given.27 The individual eventually no longer attempts to fix or avoid an aversive situation because they believe there is no relationship between action and outcome. Furthermore, learned helplessness may be induced vicariously by watching others experience uncontrollable adverse situations.28

In this sense, information gaslighting produces a specific form of informational learned helplessness (ILH) that leads to cognitive exhaustion, low motivation and anxiety when audiences process news and information.29 This opens up audiences to persuasion across several dimensions. First, ILH leads audiences to be less deliberative when evaluating messages and more likely to focus on peripheral cues in the message, rather than the credibility, or strength, of message arguments.30 In addition, anxiety increases vulnerability to negatively framed messages and reduces reliance on pre-existing attitudes or beliefs (such as partisanship) when processing messages.31 The cognitive exhaustion associated with ILH may also reduce reactance and counter-arguing against counter-attitudinal messages.32 Lack of control, such as that associated with learned helplessness, has also been shown to increase belief in conspiracy theories, as individuals are motivated to create structure by seeing illusory patterns and interconnections.33

Those most vulnerable to ILH are individuals who have low self-esteem, experience depression or anxiety, have a pessimistic explanatory style, or make incorrect attributions regarding their failure.34 Furthermore, individuals who make an internal attribution for their failure to discern false from true messages (for example, ‘I am not good enough’) may experience greater, more constant ILH than individuals who make an external attribution for that failure (such as being targeted by a disinformation campaign).

Strategies to Counter Gaslighting Disinformation Campaigns

Unfortunately, as information gaslighting is a relatively new disinformation tactic, there is not much research examining what may counter the ILH that stems from it. Research on learned helplessness more generally, however, suggests that the most effective means for public diplomacy campaigns to counter ILH is to provide self-affirmation to audiences on their ability to discern accurate from inaccurate information. This self-affirmation has two important dimensions: failure attribution; and self-efficacy. For instance, if audiences believe that the reason for their inability to discern and evaluate false or misleading information accurately is internal to themselves, then a public diplomacy campaign may focus on building self-confidence among targeted populations that they have the ability and skills to tell truth from fiction. Such a campaign should also simultaneously highlight the external factors and conditions that negatively influence their information environment, paired with suggested solutions that build their self-efficacy for changing such conditions.

A second strategy is ‘information-discernment education’ as a means to improve information literacy.35 Information literacy intervention programmes positively affect the cognitive and emotional states of audiences and increase their ability to evaluate information critically.36 This is the approach being taken by the Italian government to counter disinformation, which has created a special curriculum on identifying online disinformation for high-school students.37 IREX also conducted a similar programme in Ukraine on behalf of the Canadian government in 2015 and 2016.38 IREX’s ‘Learn to Discern’ media literacy programme directly trained 15,000 youth and adults in Ukraine on basic media literacy skills and had secondary impacts on another 90,000. Participants in the programme were 25 per cent more likely to say that they check multiple news sources and 13 per cent more likely to correctly identify and critically analyse disinformation.

Incidental Exposure to Disinformation

State-sponsored disinformation campaigns may simply be banal, in that they focus on increasing foreign audiences’ everyday incidental exposure to false or misleading information through international state-controlled broadcast media, online news websites, blogs, or social media. Both Russia and China in recent years have substantially increased the reach of their international broadcasting that targets foreign audiences, while simultaneously developing new online news portals, such as Russia’s Sputnik.39 Iran has also launched several international broadcasting channels and online news websites, targeting foreign audiences in multiple languages, as platforms for disinformation.40

Incidental exposure to disinformation does not necessarily rely on direct exposure to state-sponsored media platforms. For example, analysis shows that most major American news outlets used information in their reporting that was originally generated by Twitter bot accounts controlled by Russia’s online disinformation organization, the Internet Research Agency.41 In turn, this banal exposure to disinformation may lead unmotivated audiences to accept and internalize the disinformation. This exposure to, and acceptance of, disinformation from low-credibility sources is abetted by a widespread decline of confidence in credible media organizations in many democratic countries, leading audiences to rely on alternative information sources.42

Beliefs are not always tied to deeply held views, and individuals do not always have deep commitments to identities or grievances. Instead, common information-processing shortcuts, or heuristics, upon which we rely influence which beliefs we adopt.43 For example, audiences are likely to accept disinformation, even disinformation with low believability, to which they are more frequently exposed. Frequent exposure increases the fluency, familiarity and accessibility of disinformation in memory.44 This accessibility and familiarity also potentially creates ‘sleeper effects’, where audiences recall and believe the false information but do not remember the credibility of the original source.45 Disinformation, especially in simple narrative formats, may also fill explanatory gaps for audiences struggling to make sense of a complex world and ‘feel’ their way through complex situations.46

Moreover, cognitive biases also make disinformation difficult to correct. Informing audiences, for instance, that their beliefs are incorrect and that they need to update them may trigger cognitive and affective reactance, leading them to hold on to those beliefs more firmly.47 The ‘continued influence effect’ is another bias that may lead individuals to maintain a false belief, despite accepting that it is based upon false information.48 This continued influence of disinformation is primarily because of our inherent need to maintain easily accessible mental cause-and-effect explanations (models) that help us to make sense of a complex world. Without new information or alternative explanations that fill the ‘causal gap’, the false belief will persist and continue to shape thinking.49

In sum, audiences need not be emotionally motivated by identity or grievance, or cognitively exhausted, to believe disinformation. Rather, simple repeated exposure to disinformation, even from low-credibility sources, may lead large segments of the population to assimilate false or misleading information because of common cognitive biases in how we process information. Once assimilated, these same cognitive biases make disinformation, and the attitudes that flow from it, difficult to dislodge from the minds of audiences.

Strategies for Countering Incidental Exposure to Disinformation

Among unmotivated audiences, the same cognitive biases and heuristics that lead to assimilating disinformation in the first place can also be used to good effect to counter it. One strategy is repeated exposure to corrective information that supplants the disinformation in the minds of targets.50 Two key elements of this strategy, however, are: avoiding any repetition of the false or misleading information and focusing solely on the correct information in communications with audiences; and providing audiences with a pre-warning that they may be targets of disinformation through incidental exposure.51

A second strategy is communicating corrective information in clear, easily digestible and compelling formats that increase ‘information fluency’, are more easily accessible in memory and are less likely to induce reactance.52 For example, one study found that presenting corrective messages in graphic formats was highly effective in correcting false beliefs about issues ranging from the troop surge in Iraq to climate change.53

Creating and presenting alternative narratives or explanations is a strategy that targets audiences vulnerable to the aforementioned ‘continued influence’ effect.54 The goal is to provide an alternative narrative that ‘fills the gap’ left in audiences’ mental models when correcting disinformation. This alternative narrative needs to be heavily repeated, without reinforcing the original disinformation, to ensure that it is assimilated. In general, using narrative approaches to corrective messaging will reduce counter-arguing and reactance and be more persuasive than other message formats.55

A fourth strategy is to promote more careful judgement and deliberation by audiences about the information that they consume.56 In fact, some scholars argue that vulnerability to disinformation is more a function of a lack of deliberation than of wilful ignorance.57 Thus, increasing audiences’ motivation to engage in analytical thinking and increasing their scepticism about low-credibility news sources reduces their susceptibility to disinformation.

A strategy related to increasing deliberation is training and education for audiences to promote news media literacy (NML). NML differs from classical media literacy training in that it focuses on enhancing ‘the knowledge, skills and beliefs needed to understand how news content is produced, how people make consumption choices, and how to evaluate the quality and veracity of news content’.58 This approach is not limited to classroom settings, and NML should be deployed as part of public communication campaigns designed to educate mass audiences on how to be more deliberative, sophisticated and aware news consumers.

Future Public Diplomacy Technology to Counter Disinformation

In sum, we have reviewed three common types of state-sponsored disinformation campaigns and the psychological mechanisms that explain their influence on target audiences. By understanding the motivated and unmotivated psychological biases that make disinformation so successful, we gain a better understanding of how to employ strategic communication and technology to counter it. In many cases, existing public diplomacy campaigns and efforts already include many of these suggested strategies and elements.

However, additional strategic thinking is required on how to leverage and align future public diplomacy campaigns against state-sponsored disinformation. These disparate efforts, furthermore, should be used in combination with each other. Although the disinformation campaign archetypes that we have discussed are presented as distinct strategies, in practice they are often combined to target a wide range of audiences and build off each other, such as by combining identity-grievance and information-gaslighting campaigns to flood the information environment with a high volume of emotionally laden disinformation. While some individuals would be motivated by anger, dislike and disgust to accept, and act upon, the disinformation, others may simply be cognitively exhausted and give up attempting to tell truth from fiction. In this situation, multiple counter-strategies should be employed — for instance, combining identity-affirming messaging with self-affirmation, or solutions-efficacy with self-efficacy, in combination with inoculation messaging, etc.

Furthermore, although technology use may contribute to disinformation campaigns, it does not preclude using our knowledge about the psychology of disinformation to inform the development of technological tools for countering disinformation. This approach of combining psychology with technology to counter disinformation has been labelled ‘technocognition’.59 From a public diplomacy perspective, this aligns with calls for public diplomacy to do a better job of incorporating digital engagement tools, and the affordances they offer, into educational diplomacy.60

For example, Kelly Garrett argues that technological developments such as social media have amplified the potential for disinformation campaigns to take advantage of affective polarization and use ‘emotional extremity for strategic effect’.61 Garrett argues that a possible solution is to promote the development of emotional dampening tools, such as new online deliberative platforms or apps for discussing controversial topics among vulnerable audiences. Likewise, he also suggests creating tools to promote civility in online comment spaces and news media platforms and/or designing user-facing filters that target the emotional content of online messages.

Technology can help audiences deal with information gaslighting campaigns and navigate a flood of disinformation that makes them feel helpless. Examples would be online or social-media games such as Post-Facto, Bad News, or The News Hero that teach online fact-checking skills or the basic design principles of disinformation campaigns. Playing Bad News, a game created by the Dutch group DROG, had a ‘positive effect on students’ ability to recognize and resist fake news and propaganda’.62 Research based on the theory of inoculation has shown that this strategy of exposing audiences to how disinformation works increases their ability, and importantly their active motivation, to identify and reject disinformation.63 Likewise, technology can teach audiences news-media literacy and promote analytical thinking as a means to counter incidental exposure to disinformation. The new partnership between FactCheck.org and iCivics, an educational non-profit, is an example. They are collaborating to deploy an educational game that teaches secondary school students and adults news-media literacy skills, allowing them to evaluate news sources and the information they provide more critically.64

Beyond tools to dampen emotional online content and educational games, technology can enhance an important tool of educational public diplomacy: professional training and study tours. Successfully countering state-sponsored campaigns and making the public resilient to their influence requires training very large numbers of foreign independent journalists and media organizations, opinion leaders, civil-society organizations and allied diplomats around the world on how to identify, verify and counter disinformation. However, even the largest state-sponsored cultural and educational exchange programmes have at best only a few thousand participants globally each year.

The public diplomacy community thus needs to rethink how best to use its scarce resources for these programmes to take advantage of online learning technology and virtual exchanges.65 The Police Professionalization Exchange Program sponsored by the US State Department and administered by the educational exchange non-profit Global Ties US is one such example.66 A significant component of this programme is providing dozens of hours of professional training on law-enforcement best practices for thousands of police officers. This highlights the potential for large-scale virtual ‘exchanges’ and online training platforms to provide a much greater reach to large numbers of foreign professionals and opinion leaders — especially those outside of capital regions and/or in local communities. In terms of disinformation, this provides an opportunity to devise online training programmes provided in native languages on countering disinformation for foreign journalists, civil-society organizations, opinion leaders and allied diplomats — professional training that is sorely needed in many countries.67

Conclusion

Although public diplomacy corrective message campaigns and technology development are necessary, they may not be sufficient to counter state-sponsored disinformation campaigns targeting foreign or domestic audiences. Public diplomacy also has to address the political and social conditions that allow disinformation to flourish, such as the loss of confidence in democratic institutions and the rise of anti-democratic movements in established and emerging democracies.68 The future of public diplomacy thus needs to go beyond day-to-day efforts to counter disinformation campaigns and develop new online tools. It also needs to reinvest in programmes that address the root causes of audiences believing and sharing disinformation: the classic public diplomacy programmes that strengthen democratic institutions, further human rights, fight inequality and promote self-efficacy among audiences to make positive change. This also includes investment in more theory-driven formative and summative evaluation research that draws upon the available social–psychological and communication scholarship on misinformation and false beliefs and applies it to public diplomacy contexts. These are long-term goals and outcomes for public diplomacy, but they are equally important for effectively countering the influence of state-sponsored disinformation on foreign and domestic audiences.

Erik C. Nisbet

is Associate Professor at the Ohio State University (OSU) School of Communication and Political Science (by courtesy). He is a faculty affiliate with the OSU Mershon Center for International Security Studies, where he is Co-Director of the Eurasian Security and Governance programme. Nisbet is also a non-residential faculty fellow at the University of Southern California’s Center on Public Diplomacy.

Olga Kamenchuk

is a Research Associate at the Ohio State University’s Mershon Center for International Security Studies and Co-Director of Mershon’s Eurasian Security and Governance programme. Kamenchuk is also Associate Professor (clinical) in the OSU School of Communication and Slavic and Eastern European Languages and Cultures (by courtesy).

1

R.K. Garrett, ‘The “Echo Chamber” Distraction: Disinformation Campaigns are the Problem, Not Audience Fragmentation’, Journal of Applied Research in Memory and Cognition, vol. 6, no. 4 (2017), pp. 370-376.

2

S. Lewandowsky, U.K.H. Ecker and J. Cook, ‘Beyond Misinformation: Understanding and Coping with the “Post-Truth” Era’, Journal of Applied Research in Memory and Cognition, vol. 6, no. 4 (2017), pp. 353-369; and S. Lewandowsky, U.K.H. Ecker, C.M. Seifert, N. Schwarz and J. Cook, ‘Misinformation and its Correction: Continued Influence and Successful Debiasing’, Psychological Science in the Public Interest, vol. 13, no. 3 (2012), pp. 106-131.

3

M.J. Cole, ‘Will China’s Disinformation War Destabilize Taiwan?’, The National Interest (30 July 2017), retrieved from http://nationalinterest.org/feature/will-chinas-disinformation-war-destabilize-taiwan-21708; N. Penzenstadler, B. Heath and J. Guynn, ‘We Read Every One of the 3,517 Facebook Ads Bought by Russians: Here’s What We Found’, USA Today (11 May 2018), retrieved from https://www.usatoday.com/story/news/2018/05/11/what-we-found-facebook-ads-russians-accused-election-meddling/602319002/; S. Kelly, M. Truong, N. Shahbaz, M. Earp and J. White, Freedom on the Net, 2017: Manipulating Social Media to Undermine Democracy, Freedom on the Net project (Washington, DC: Freedom House, November 2017), retrieved from https://freedomhouse.org/sites/default/files/FOTN_2017_Final.pdf.

4

Lewandowsky, Ecker and Cook, ‘Beyond Misinformation’.

5

M. Lodge and C.S. Taber, The Rationalizing Voter (Cambridge: Cambridge University Press, 2013).

6

Lewandowsky, Ecker, Seifert, Schwarz and Cook, ‘Misinformation and its Correction’; and Lodge and Taber, The Rationalizing Voter.

7

J. Haidt, The Righteous Mind: Why Good People Are Divided by Politics and Religion (New York, NY: Vintage, 2013); N. Haslam, ‘Dehumanization: An Integrative Review’, Personality and Social Psychology Review, vol. 10, no. 3 (2006), pp. 252-264; and D. Kahneman, Thinking, Fast and Slow (New York, NY: Farrar, Straus and Giroux, 2011).

8

R.K. Garrett and M. Jeong, ‘From Partisan Media to Misperception: Affective Polarization as Mediator’, paper presented at the Annual meeting of the International Communication Association, San Diego (2017); and Lodge and Taber, The Rationalizing Voter.

9

G.L. Cohen, J. Aronson and C.M. Steele, ‘When Beliefs Yield to Evidence: Reducing Biased Evaluation by Affirming the Self’, Personality and Social Psychology Bulletin, vol. 26, no. 9 (2000), pp. 1151-1164.

10

Lodge and Taber, The Rationalizing Voter.

11

D.J. Flynn, B. Nyhan and J. Reifler, ‘The Nature and Origins of Misperceptions: Understanding False and Unsupported Beliefs about Politics’, Advances in Political Psychology, vol. 38, suppl. 1 (2017), pp. 127-150.

12

B.T. Johnson and A.H. Eagly, ‘Effects of Involvement on Persuasion: A Meta-Analysis’, Psychological Bulletin, vol. 106, no. 2 (September 1989), pp. 290-314; Lewandowsky et al., ‘Misinformation and its Correction’; and E.C. Nisbet, K. Cooper and R.K. Garrett, ‘The Partisan Brain: How Dissonant Science Messages Lead Conservatives and Liberals to (Dis)trust Science’, Annals of the American Academy of Political and Social Science, vol. 658, no. 1 (2015), pp. 36-66.

13

E. Halperin, A.G. Russell, C.S. Dweck and J.J. Gross, ‘Anger, Hatred, and the Quest for Peace: Anger can be Constructive in the Absence of Hatred’, Journal of Conflict Resolution, vol. 55 (2011), pp. 274-291; P.S. Hart and E.C. Nisbet, ‘Boomerang Effects in Science Communication: How Motivated Reasoning and Identity Cues Amplify Opinion Polarization about Climate Mitigation Policies’, Communication Research, vol. 39, no. 6 (2012), pp. 701-723; and B. Weeks, ‘Emotions, Partisanship, Misperceptions: How Anger and Anxiety Moderate the Effect of Partisan Bias on Susceptibility to Political Misinformation’, Journal of Communication, vol. 65, no. 4 (2015), pp. 699-719.

14

S. Iyengar, G. Sood and Y. Lelkes, ‘Affect, Not Ideology: A Social Identity Perspective on Polarization’, Public Opinion Quarterly, vol. 76, no. 3 (2012), pp. 405-431, doi:10.1093/poq/nfs038; and M. Wojcieszak and R.K. Garrett ‘Social Identity, Selective Exposure, and Affective Polarization: How Priming National Identity Shapes Attitudes toward Immigrants via News Selection’, Human Communication Research, vol. 44, no. 3 (2018), pp. 247-273, doi: 10.1093/hcr/hqx010.

15

D.M. Kahan, ‘Ideology, Motivated Reasoning, and Cognitive Reflection: An Experimental Study’, Judgment and Decision Making, vol. 8 (2013), pp. 407-424; and Garrett and Jeong, ‘From Partisan Media to Misperception’.

16

S. Vosoughi, D. Roy and S. Aral, ‘The Spread of True and False News Online’, Science, vol. 359 (2018), pp. 1146-1151.

17

A. Nemtsova, ‘There’s No Evidence the Ukrainian Army Crucified a Child in Slovyansk’, The Daily Beast (15 July 2014), retrieved from https://www.thedailybeast.com/theres-no-evidence-the-ukrainian-army-crucified-a-child-in-slovyansk.

18

D.M. Kahan, ‘Fixing the Communications Failure’, Nature, vol. 463 (2010), pp. 296-297; and Lewandowsky, Ecker, Seifert, Schwarz and Cook, ‘Misinformation and its Correction’.

19

R.K. Garrett, ‘Strategies for Countering False Information and Beliefs about Climate Change’, in M.C. Nisbet, M. Schafer, E. Markowitz, S. Ho, S. O’Neill and J. Thaker (eds), Oxford Research Encyclopedia of Climate Science (Oxford: Oxford University Press, 2018); Lewandowsky, Ecker, Seifert, Schwarz and Cook, ‘Misinformation and its Correction’; E. Stoycheff and E.C. Nisbet, ‘Priming the Costs of Conflict? Russian Public Opinion about the 2014 Crimean Conflict’, International Journal of Public Opinion Research, vol. 4, no. 1 (2017), pp. 657-675.

20

M.S. Levendusky, ‘Americans, Not Partisans: Can Priming American National Identity Reduce Affective Polarization?’, Journal of Politics, vol. 80, no. 1 (2017), pp. 59-69.

21

J.R. Bowen, ‘Anti-Americanism as Schemas and Diacritics in France and Indonesia’, in P. Katzenstein and R. Keohane (eds), Anti-Americanisms in World Politics (Ithaca, NY: Cornell University Press, 2006), pp. 227-250.

22

Wojcieszak and Garrett, ‘Social Identity, Selective Exposure, and Affective Polarization’.

23

A. Curry, N.J. Stroud and S. McGregor, Solutions Journalism and News Engagement (Austin, TX: University of Texas at Austin Center for Media Engagement, 2016), retrieved from https://mediaengagement.org/research/solutions-journalism-news-engagement; L. Feldman and P.S. Hart, ‘Is There Any Hope? How Climate Change News Imagery and Text Influence Audience Emotions and Support for Climate Mitigation Policies’, Risk Analysis, vol. 38, no. 3 (2018), pp. 585-602; K. McIntyre, ‘Solutions Journalism: The Effects of Including Solution Information in News Stories about Social Problems’, Journalism Practice (online, 14 December 2017).

24

G. King, J. Pan and M.E. Roberts, ‘How the Chinese Government Fabricates Social Media Posts for Strategic Distraction, Not Engaged Argument’, American Political Science Review, vol. 111, no. 3 (2017), pp. 484-501; C. Paul and M. Matthews, The Russian ‘Firehose of Falsehood’ Propaganda Model: Why it Might Work and Options to Counter It (Santa Monica, CA: RAND Corporation, 2016), retrieved from https://www.rand.org/pubs/perspectives/PE198.html.

25

King, Pan and Roberts, ‘How the Chinese Government Fabricates Social Media Posts for Strategic Distraction, Not Engaged Argument’; S. Pifer, Watch Out for Little Green Men (Washington, DC: Brookings Institute, 7 July 2014), retrieved from https://www.brookings.edu/opinions/watch-out-for-little-green-men/.

26

King, Pan and Roberts, ‘How the Chinese Government Fabricates Social Media Posts for Strategic Distraction, Not Engaged Argument’.

27

S.F. Maier and M.E.P. Seligman, ‘Learned Helplessness: Theory and Evidence’, Journal of Experimental Psychology: General, vol. 105, no. 1 (1976), pp. 3-46.

28

I. Brown and D.K. Inouye, ‘Learned Helplessness through Modeling: The Role of Perceived Similarity in Competence’, Journal of Personality and Social Psychology, vol. 36, no. 8 (1978), pp. 900-908.

29

L.Y. Abramson, M.E. Seligman and J.D. Teasdale, ‘Learned Helplessness in Humans: Critique and Reformulation’, Journal of Abnormal Psychology, vol. 87, no. 1 (1978), pp. 49-74; Maier and Seligman, ‘Learned Helplessness’; G. Sedek and M. Kofta, ‘When Cognitive Exertion Does Not Yield Cognitive Gain: Toward an Informational Explanation of Learned Helplessness’, Journal of Personality and Social Psychology, vol. 58, no. 4 (1990), pp. 729-743; Y. Amichai-Hamburger, M. Mikulincer and N. Zalts, ‘The Effects of Learned Helplessness on Processing of a Persuasive Message’, Current Psychology: Developmental, Learning, Personality, Social, vol. 22, no. 1 (2003), pp. 37-46; M. Gasiorowska, ‘The Effects of Learned Helplessness and Message Framing on the Processing of Verbal and Visual Information in Advertisements’, in T. Marek, W. Karwowski, M. Frankowicz, J. Kantola and P. Zgaga (eds), Human Factors of a Global Society: A System of Systems Perspective (Boca Raton, FL: CRC Press, 2014), pp. 379-394.

30

Amichai-Hamburger, Mikulincer and Zalts, ‘The Effects of Learned Helplessness on Processing of a Persuasive Message’; Gasiorowska, ‘The Effects of Learned Helplessness and Message Framing on the Processing of Verbal and Visual Information in Advertisements’.

31

Gasiorowska, ‘The Effects of Learned Helplessness and Message Framing on the Processing of Verbal and Visual Information in Advertisements’; Weeks, ‘Emotions, Partisanship, Misperceptions’; T. Brader, ‘Striking a Responsive Chord: How Political Ads Motivate and Persuade Voters by Appealing to Emotions’, American Journal of Political Science, vol. 49 (2005), pp. 388-405, doi:10.1111/j.0092-5853.2005.00130.x; T. Brader, ‘The Political Relevance of Emotions: “Reassessing” Revisited’, Political Psychology, vol. 32 (2011), pp. 337-346, doi:10.1111/j.1467-9221.2010.00803.x.

32

J.P. Dillard and Shen Lijiang, ‘On the Nature of Reactance and its Role in Persuasive Health Communication’, Communication Monographs, vol. 72, no. 2 (2005), pp. 144-168; J.Z. Jacks and K.A. Cameron, ‘Strategies for Resisting Persuasion’, Basic and Applied Social Psychology, vol. 25, no. 2 (2003), pp. 145-161.

33

J.A. Whitson and A.D. Galinsky, ‘Lacking Control Increases Illusory Pattern Perception’, Science, vol. 322, no. 3 (2008), pp. 115-117.

34

C. Peterson, S.F. Maier and M.E.P. Seligman, Learned Helplessness: A Theory for the Age of Personal Control (New York, NY: Oxford University Press, 1995).

35

S. Lewandowsky, U.K.H. Ecker and J. Cook, ‘Beyond Misinformation: Understanding and Coping with the “Post-Truth” Era’, Journal of Applied Research in Memory and Cognition, vol. 6, no. 4 (2017), pp. 353-369.

36

G. Walton and M. Hepworth, ‘A Longitudinal Study of Changes in Learners’ Cognitive States During and Following an Information Literacy Teaching Intervention’, Journal of Documentation, vol. 67 (2011), pp. 449-479; E. Vraga and L. Boyd, ‘Leveraging Institutions, Educators, and Networks to Correct Misinformation: A Commentary on Lewandowsky, Ecker, and Cook’, Journal of Applied Research in Memory and Cognition, vol. 6, no. 4 (2017), pp. 382-388.

37

J. Horowitz, ‘In Italian Schools, Reading, Writing and Recognizing Fake News’, New York Times (18 October 2017), p. A1, retrieved from https://www.nytimes.com/2017/10/18/world/europe/italy-fake-news.html.

38

E. Murrock, J. Amulya, M. Druckman and T. Liubyva, Winning the War on State-sponsored Propaganda: Gains in the Ability to Detect Disinformation a Year and a Half after Completing a Ukrainian News Media Literacy Program (Washington, DC: IREX, 2018), retrieved from https://www.irex.org/sites/default/files/node/resource/impact-study-media-literacy-ukraine.pdf.

39

G. Rawnsley, ‘To Know Us Is To Love Us: Public Diplomacy and International Broadcasting in Contemporary Russia and China’, Politics, vol. 35, nos. 3/4 (2015), pp. 273-286; I. Yablokov, ‘Conspiracy Theories as Russian Public Diplomacy Tool: The Case of Russia Today (RT)’, Politics, vol. 35, nos. 3/4 (2015), pp. 301-315.

40

E. Wastnidge, ‘The Modalities of Iranian Soft Power: From Cultural Diplomacy to Soft War’, Politics, vol. 35, nos. 3/4 (2017), pp. 364-377.

41

J. Lukito and C. Wells, ‘Most Major Outlets have Used Russian Tweets as Sources for Partisan Opinion: Study’, Columbia Journalism Review (8 March 2018), retrieved from https://www.cjr.org/analysis/tweets-russia-news.php.

42

W.L. Bennett and S. Livingston, ‘The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions’, European Journal of Communication, vol. 33, no. 2 (2018), pp. 122-139.

43

R. McDermott, Political Psychology in International Relations (Ann Arbor, MI: University of Michigan Press, 2004); Y.Y.I. Vertzberger, The World in Their Minds: Information Processing, Cognition, and Perception in Foreign Policy Decision-making (Stanford, CA: Stanford University Press, 1990).

44

G. Pennycook, T.D. Cannon and D.G. Rand, ‘Prior Exposure Increases Perceived Accuracy of Fake News’, Journal of Experimental Psychology: General (forthcoming 2019); M.S. Ayers and L.M. Reder, ‘A Theoretical Review of the Misinformation Effect: Predictions from an Activation-based Memory Model’, Psychonomic Bulletin & Review, vol. 5, no. 1 (1998), pp. 1-21; and N. Schwarz, J. Sanna, I. Skurnik and C. Yoon, ‘Metacognitive Experiences and the Intricacies of Setting People Straight: Implications for Debiasing and Public Information Campaigns’, in M.P. Zanna (ed.), Advances in Experimental Social Psychology, vol. 39 (Cambridge, MA: Academic Press, 2007), pp. 127-161.

45

G.T. Kumkale and D. Albarracin, ‘The Sleeper Effect in Persuasion: A Meta-Analytic Review’, Psychological Bulletin, vol. 130, no. 1 (2004), pp. 143-172.

46

M.F. Dahlstrom, ‘Using Narratives and Storytelling to Communicate Science with Non-Expert Audiences’, Proceedings of the National Academy of Sciences, vol. 111, suppl. 4 (2014), pp. 13614-13620; R.K. Garrett, E.C. Nisbet and E. Lynch, ‘Undermining the Corrective Effects of Media-based Political Fact Checking? The Role of Contextual Cues and Naïve Theory’, Journal of Communication, vol. 63, no. 4 (2013), pp. 617-637.

47

S.S. Brehm and J.W. Brehm, Psychological Reactance: A Theory of Freedom and Control (New York, NY: Academic Press, 2013).

48

C.M. Seifert, ‘The Continued Influence of Misinformation in Memory: What Makes a Correction Effective?’, in H.R. Brian (ed.), Psychology of Learning and Motivation, vol. 41 (Cambridge, MA: Academic Press, 2002), pp. 265-292.

49

Garrett, ‘Strategies for Countering False Information and Beliefs about Climate Change’.

50

Schwarz, Sanna, Skurnik and Yoon, ‘Metacognitive Experiences and the Intricacies of Setting People Straight’.

51

Lewandowsky, Ecker, Seifert, Schwarz and Cook, ‘Misinformation and its Correction’.

52

P. Winkielman, D.E. Huber, L. Kavanagh and N. Schwarz, ‘Fluency of Consistency: When Thoughts Fit Nicely and Flow Smoothly’, in B. Gawronski and F. Strack (eds), Cognitive Consistency: A Unifying Concept in Social Psychology (New York, NY: Guilford Press, 2012), pp. 89-111.

53

B. Nyhan and J. Reifler, ‘The Roles of Information Deficits and Identity Threat in the Prevalence of Misperceptions’, Journal of Elections, Public Opinion and Parties (online 6 May 2018).

54

Lewandowsky, Ecker, Seifert, Schwarz and Cook, ‘Misinformation and its Correction’; and Seifert, ‘The Continued Influence of Misinformation in Memory’.

55

Dahlstrom, ‘Using Narratives and Storytelling to Communicate Science with Non-Expert Audiences’.

56

R.K. Herrmann and J.K. Choi, ‘From Prediction to Learning: Opening Experts’ Minds to Unfolding History’, International Security, vol. 31 (2007), pp. 132-161; Lewandowsky, Ecker, Seifert, Schwarz and Cook, ‘Misinformation and its Correction’; P. Tetlock, Expert Political Judgment: How Good Is It? How Can We Know? (Princeton, NJ: Princeton University Press, 2005).

57

G. Pennycook and D.G. Rand, ‘Susceptibility to Partisan Fake News is Explained More by a Lack of Deliberation than by Willful Ignorance’, SSRN Working Paper (19 April 2018), retrieved from http://dx.doi.org/10.2139/ssrn.3165567.

58

Vraga and Boyd, ‘Leveraging Institutions, Educators, and Networks to Correct Misinformation’, p. 383.

59

Lewandowsky, Ecker and Cook, ‘Beyond Misinformation’.

60

C. Hayden, ‘Technology Platforms for Public Diplomacy: Affordances for Education’, in J. Mathews-Aydinli (ed.), International Education Exchanges and Intercultural Understanding (New York, NY: Palgrave Macmillan, 2017).

61

Garrett, ‘The “Echo Chamber” Distraction’, p. 373.

62

J. Roozenbeek and S. van der Linden, ‘The Fake News Game: Actively Inoculating Against the Risk of Misinformation’, Journal of Risk Research (online 2018), retrieved from https://www.tandfonline.com/doi/full/10.1080/13669877.2018.1443491.

63

J. Cook, S. Lewandowsky and U.K.H. Ecker, ‘Neutralizing Misinformation through Inoculation: Exposing Misleading Argumentation Techniques Reduces their Influence’, PLoS ONE, vol. 12, no. 5 (2017), e0175799, http://dx.doi.org/10.1371/journal.pone.0175799; Roozenbeek and van der Linden, ‘The Fake News Game’.

64

See ‘APPC and iCivics Team Up on Game to Teach Media Literacy’, retrieved from https://www.annenbergpublicpolicycenter.org/appc-partners-with-icivics-to-create-game-to-teach-media-literacy/.

65

Hayden, ‘Technology Platforms for Public Diplomacy’.

67

IREX, ‘Media Sustainability Index: The Development of Sustainable Independent Media in Europe and Eurasia’ (Washington, DC: IREX, 2017), retrieved from https://www.irex.org/sites/default/files/pdf/media-sustainability-index-europe-eurasia-2017-full.pdf.

68

Bennett and Livingston, ‘The Disinformation Order’; Lewandowsky, Ecker and Cook, ‘Beyond Misinformation’.
