Digital Influence Engineering
This briefing summarises research on the science of influence, showing how various techniques are employed in digital communications environments. There is a particular focus upon the role of uncertain or contested ‘soft facts’ and the ways these are deployed to shape public attitudes and behaviour.
Influence science
The disruptive and transformative societal impacts of social media platforms, and how they influence the ways we think, feel and behave, are an increasingly important public policy issue. Reflecting a need to understand these developments, there has been a recent rapid growth in the science of influence.
Evidence and insights derived from a range of academic disciplines, including social psychology, political science, communication studies, behavioural economics, sociology and data science, are improving our understanding of how and why influence happens in digital spaces.
This briefing sets out how ‘digital influence engineering’ is conducted and some of the key techniques used when communicating high volumes of information to foster more extreme viewpoints, undermine institutions and mobilise people to act in socially unacceptable or criminal ways, both online and offline in the ‘real world’. It seeks to synthesise emergent state-of-the-art knowledge, with a particular focus upon the impact of ‘hard’ and ‘soft’ facts.
What makes a communication persuasive?
What makes a message persuasive to a digital, online audience shares many common features with communications and interactions that occur offline, yet these features can take on increased significance when they interact with the affordances of new technologies.
‘Real-world’ distinctions between the source of a message and its recipient are more blurred in a digital communication environment. So too are the boundaries between (1) information that represents a verifiable ‘truth’ and (2) information that, to varying degrees, either deliberately (‘disinformation’) or unintentionally (‘misinformation’) misleads.
We distinguish these forms of information as ‘hard’ and ‘soft’ facts:
- A ‘hard fact’ is valid and reliable, and often has credible backing. It is routinely communicated as a secure, objective standard of evidence. Examples might include official statistics or legislation.
- By contrast, a ‘soft fact’ involves imperfect knowledge whose status may be ambiguous, uncertain, contested and/or malleable. Soft facts typically thrive in an information vacuum. Circulating without being firmly verified or authenticated, soft facts help to ‘fill in the gaps’ of public understanding of an event or situation, thus meeting some emotional or intellectual need.
Species of soft fact
There are four conceptually distinct, but related, species of soft fact that are prevalent and thrive in the contemporary media ecology: rumours, conspiracy theories, propaganda and fake news.
Rumours
Rumours are statements or propositions for belief that possess topical or contemporary relevance (Allport & Postman, 1947).
Also referred to as ‘improvised news’, rumours allow people to construct a meaningful, shared interpretation of a situation they perceive to be inadequately defined (Shibutani, 1966). Rumours are spread through people’s collective efforts to make sense of a situation, hence they are inherently ‘open’ and provisional forms of knowledge.
The expression and spread of rumours are facilitated by the quick, informal information exchange characteristic of the digital information environment.
Conspiracy Theories
Compared with rumours, conspiracy theories are more traditional, recurring and repetitive accounts or stories.
They offer alternative ‘sensemaking’ explanations of reality pertaining to a historical event or practice. A conspiratorial explanation attributes a significant causal effect to the secretive and collaborative actions of a vast, powerful and insidious ‘other’ whose attempt to deceive or conceal their role is only now being revealed and countered.
A notorious, high-profile and long-lasting conspiracy theory concerns JFK’s assassination in 1963. That the theory that Lee Harvey Oswald did not act alone in killing Kennedy has endured for more than fifty years is testament to two key ingredients: (1) power – a presidential victim and a large pool of potential villains with immense reach and influence across the political spectrum; and (2) ‘proof’ – the claim that the government investigation at the time was overly hasty. It exemplifies just how difficult it can be to falsify a conspiracy theory with counter-evidence, given that official accounts are construed as part of the problem and the phenomenon itself is conceptually complex (Uscinski & Parent, 2013).
The internet’s effect on the spread and popularity of conspiracy theories is equivocal: it provides ‘echo-chambers’ that amplify conspiratorial viewpoints, but a good proportion of online activity is also devoted to debunking or mocking conspiracy theories.
Propaganda
Propaganda is a deliberate, systematic attempt to shape perceptions, manipulate thinking and direct behaviour to achieve a response furthering the propagandist’s intent (Jowett & O’Donnell, 1999).
In the digital world, computational algorithms and automated processes facilitate the purposeful distribution of misleading information over social networks. So-called ‘computational propaganda’ flourished during the 2016 US presidential election, with bots used to distribute political messages on social media (Bolsover & Howard, 2017).
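To illustrate the mechanics at a toy scale, the sketch below simulates how a small pool of bots that repost a message every round can keep it circulating long enough for ordinary human resharing to compound. All parameters (repost probability, audience reach, bot count) are hypothetical values chosen purely for illustration, not drawn from any real platform or study.

```python
import random

random.seed(7)

def simulate_reach(n_humans: int, n_bots: int, rounds: int) -> int:
    """Return how many humans have seen the message after `rounds` rounds.
    Bots repost every round; a human who sees the message for the first
    time reposts it once with a small probability."""
    repost_p = 0.05      # assumed chance a human passes the message on
    views_per_post = 3   # assumed number of humans reached by each post
    seen = set()
    posts = n_bots if n_bots else 1   # round 0: the message is seeded
    for _ in range(rounds):
        new_human_posts = 0
        for _ in range(posts):
            for viewer in random.sample(range(n_humans), views_per_post):
                if viewer not in seen:
                    seen.add(viewer)
                    if random.random() < repost_p:
                        new_human_posts += 1
        posts = new_human_posts + n_bots   # bots repost regardless of interest
    return len(seen)

print("organic seed only:   ", simulate_reach(10_000, n_bots=0, rounds=20))
print("50-bot amplification:", simulate_reach(10_000, n_bots=50, rounds=20))
```

In this toy model the organically seeded message typically dies out after a few rounds, whereas continuous automated reposting makes cumulative exposure grow steadily, even though no individual human behaves any differently.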
Fake news
Fake news can be construed as a fourth species of soft fact, albeit one that overlaps with the other three and is associated in contemporary public consciousness with the 2016 USA elections and the Trump presidency (Allcott & Gentzkow, 2017).
Attaching the label ‘fake news’ to mass media stories or narratives is a way of publicly contesting content that the source dislikes. Whilst the attribution is deliberate, it is not always malign in intent or politically motivated.
Fake news differs from ‘false news’, which can be objectively verified as a falsehood and so represents a ‘hard fact’, albeit one that Vosoughi et al. (2018) demonstrate travels farther, faster, deeper and more broadly than the truth.
I AM FASTER
To summarise the diverse evidence-base, spread across different academic disciplines, with their own terminologies and emphases, we developed the ‘I AM FASTER’ mnemonic.
This captures the unique qualities of the digital communication environment: quick, global, high-volume transfers of information to audiences who are themselves active agents in responding to, and interacting with, its content. Thus, new communication technologies and soft facts have fundamentally changed what and how we ‘know’, and how we are influenced.
The key dynamics are:
I – In-group (vs Out-group) Ideology
Soft facts are ‘sense-making’ tools for people sympathetic to, or aligned with, the belief systems of specific causes or groups.
Social media facilitates links between like-minded people with common interests or values (an ‘in-group’) who tend to consume soft facts supporting their pre-existing views, whilst actively rejecting those (of the ‘out-group’) that do not. Aided by the anonymity that online presence affords, digital communication environments help create and sustain collective identities and divisions (demographic, political and institutional), tunnelling users towards information and groups that confirm and reinforce their existing thinking.
By amplifying and simplifying the social processes of categorising others that occur offline, digital spaces can be highly attractive to vulnerable individuals looking for quick, clear answers and meaning.
Attempts may be made to dissuade group members from interacting online and offline with ‘the other’, promoting alienation from moderating influences on their thinking and behaviour. This is something McLuhan (1964) anticipated and labelled ‘re-tribalization’.
A – Audience Segmentation and Targeting
Selective exposure to soft facts happens online because audiences differ in their information consumption patterns (what they view, where and when). This may occur because of geography, pre-existing proclivities and social networks, but more controversially, it is reinforced by predictive algorithms which target audiences with information coherent with their views.
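A minimal sketch of this targeting logic is given below, assuming (purely for illustration) that users and stories are scored by hand on a few attitudinal topics. Real recommender systems infer such profiles from behavioural data at far larger scale, but the selective-exposure effect is the same: content that agrees with a profile scores higher and surfaces first.

```python
# Toy stance-based targeting. All users, stories and scores are hypothetical.

def dot(u, v):
    # similarity as a plain dot product of stance vectors
    return sum(a * b for a, b in zip(u, v))

# topic axes: (immigration stance, economic stance, trust in institutions)
users = {
    "user_a": (0.9, -0.2, -0.8),
    "user_b": (-0.7, 0.5, 0.6),
}
items = {
    "story_1": (0.8, 0.0, -0.9),   # aligns with user_a's profile
    "story_2": (-0.6, 0.4, 0.7),   # aligns with user_b's profile
    "story_3": (0.1, 0.1, 0.0),    # roughly neutral
}

for user, profile in users.items():
    # rank stories by agreement with the user's profile
    ranked = sorted(items, key=lambda i: dot(profile, items[i]), reverse=True)
    print(user, "sees first:", ranked[0])
```

Each user is shown the story most coherent with their existing views, so two users with opposed profiles receive opposed information streams from the same pool of content.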
The 2016 Trump online electoral campaign is an example of how information was purposively designed and delivered to promote voting behaviour among a pro-Trump audience and suppress voting among Trump’s opponents (Halpern, 2017).
Delivering ‘successful’ communication that changes what people do therefore involves understanding how different soft facts will resonate with different audiences.
M – Messenger and Message
Who presents a message (‘messenger’), along with its content (‘message’), affects the speed and spread of soft facts online.
Digital communications minimise ‘status cues’ associated with physical appearance and social hierarchy of the message source or messenger, hence social influence online can derive more from what is written or spoken. Verbal and non-verbal elements of ‘speech acts’ have been identified (Maynard & Benesch, 2016) that are important in expressing ideology, upholding a moral identity, or dehumanising the enemy. These include: metaphors and wording structure; use of praise; and coupling text and images.
F – Framing and Feedback
Responding quickly to ‘frame’ and provide meaning to a situation is important in shaping how people subsequently think and feel about an event.
When events are uncertain or unfolding, ‘early movers’ can respond rapidly and frame the definition and meaning of complex social phenomena. Messages are often framed in terms of potential ‘loss’ or ‘gain’, and ‘fear framing’ is a dominant technique used to scare people into changing what they think or how they act.
Alongside framing, ‘feedback’ mechanisms are built into many social media tools that can create an aura of consensus around a particular reaction or position. Margetts et al. (2016) found that ‘social communication’ mechanisms designed into many social media platforms have an especially pronounced influencing effect.
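Margetts et al. do not publish an algorithm for this effect, but the toy positive-feedback loop below (with made-up parameters) shows the basic mechanism: when each viewer can see a running tally of endorsements, early endorsements raise the chance of later ones, manufacturing an appearance of consensus.

```python
import random

random.seed(1)

def final_tally(show_count: bool, viewers: int = 1000) -> int:
    """Toy herding loop: each viewer endorses with a base probability,
    plus a boost proportional to the visible running tally."""
    base_p, boost = 0.02, 0.004   # assumed, purely illustrative values
    tally = 5                     # small seeded count of early endorsements
    for _ in range(viewers):
        p = base_p + (boost * tally if show_count else 0.0)
        if random.random() < min(p, 0.5):   # cap the per-viewer probability
            tally += 1
    return tally

print("tally hidden from viewers:", final_tally(show_count=False))
print("tally visible to viewers: ", final_tally(show_count=True))
```

With the tally hidden, the final count stays near the organic base rate; with it visible, the same audience can produce a count an order of magnitude higher.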
A – Authority or Automation
Authority bestowed on a messenger or message by its recipient can increase or decrease its persuasive appeal, depending on how far it is regarded as a trusted, legitimate source of information.
Public perceptions of authority may derive from power or expertise, or simply result from someone being in the ‘right place at the right time’. The latter represents a ‘democratising’ of authority online since anyone can potentially become a self-taught ‘expert’ of some authority.
Qualities associated with charismatic authority may be personally appealing to people (Hoffman, 2015), but authority may also be artificially constructed online because of bot activity boosting the frequency of a message and the likelihood of an individual encountering it.
People’s pre-existing attitudes to the authoritative source are important in shaping their willingness to engage with rumours, conspiracy theories or propaganda.
S – System 1 and 2
Two levels of cognitive effort and deliberation apply to processing a message or communication. System 1 processing, the ‘heart’ of what we do, is our immediate, automatic and seemingly effortless response or opinion. System 2, on the other hand, is much slower: deliberate, rational contemplation of an argument.
Communications can target one or other system; for example, playing on our gut reactions or conveying a sense of urgency speaks to System 1. Tension between Systems 1 and 2 can cause mistakes.
Behavioural scientists have evidenced how our judgements and decision-making commonly rely on a range of mental ‘short-cuts’ (heuristics) in the context of limited time and other competing demands (Kahneman, 2011).
T – Tension (and Timing)
Soft facts can minimise tension or else maximise divisions between groups in ways that contribute to social unrest in the real-world.
Online data show that soft facts are most likely to spread in the immediate aftermath of social crises or emergencies. Message timing is therefore critical for those seeking to influence online communities at crisis points.
On a more personal level, messages purposively timed to coincide with a ‘pain point’ in someone’s life, such as bereavement or debt, can have increased salience and resonance for them, and be highly persuasive at a time when they might be seeking answers or solutions (O’Neil, 2016).
E – Emotion
Communication evoking a particular emotion in its audience can amplify its persuasive appeal.
Storytelling, for example, can paint a rich, engaging and meaningful picture. If a message successfully evokes a prior experience or memory, compassion or a personal meaning, the recipient is more likely to act on gut instinct because it ‘feels right’. Other messages might try to provoke more negative feelings of fear, anxiety or a sense of urgency to encourage a quick, non-deliberative action.
Emotional impact cannot always be foreseen, however, and making people feel angry can have unanticipated effects on what they do next.
R – Reaction (Resonance and Reactance)
How people react to a communication can influence how far they are willing to change what they do, online and/or offline. Responses can be divided into two ‘R’s’: Resonance and Reactance.
Resonance accounts for how an emotionally-laden account can be constructed to chime with the message receiver, both psychologically and physiologically, to forge a personal meaning (Schwartz, 1973).
Reactance is commonly understood as a motivational reaction to a message that includes feelings of anger or frustration. Particularly associated with dogmatic, forceful messages that make a person feel that their freedom is being limited, reactance may lead them to resist persuasion and the desired behaviour (Brehm, 1966).
Why does this matter?
Insights for crime, terrorism and social control
Soft facts can spread rapidly online, especially when subject to the kinds of digital influence engineering techniques outlined above. When this happens, they can induce or amplify other social problems, including violence, tension and social division. As such, the information age poses multiple challenges to policing and security agendas, as more of social life, with its associated behaviours, conflicts and misdemeanours, moves online while retaining offline consequences in the real world.
The rapid pace of change in communication platforms, and in the formation, dispersion and capabilities of user communities, requires a broadening of our perspective on social control. Enforcement and management efforts must increasingly be targeted at the online spread of information and how it impacts on public attitudes, values, beliefs and behaviours. Soft facts play a key role in shaping public sentiment online and collective action offline.
Importantly, the rapidly developing science of influence is generating new evidence and insights about how and why the communication of soft facts can alter the ways people think, feel or act following a major crime or terrorist incident.
Only by better understanding online techniques of persuasion can we formulate evidence-based counter-responses to disrupt, defuse or resolve the often longstanding and pervasive messages that shape how people understand the contemporary world around them.
References and further information
- Allcott, H. and Gentzkow, M. 2017. Social Media and Fake News in the 2016 Election. Journal of Economic Perspectives 31(2), 211-236.
- Allport, G. W. and Postman, L. 1947. The Psychology of Rumor. New York: Henry Holt & Co.
- Bolsover, G. and Howard, P. 2017. Computational Propaganda and Political Big Data: Moving Toward a More Critical Research Agenda. Big Data 5(4).
- Brehm, J. 1966. A Theory of Psychological Reactance. New York: Academic Press.
- Halpern, S. 2017. How He Used Facebook to Win. The New York Review of Books, June 8.
- Hoffman, D. 2015. Quantifying and Qualifying Charisma: A Theoretical Framework for Measuring the Presence of Charismatic Authority in Terrorist Groups. Studies in Conflict and Terrorism 38, 710-733.
- Jowett, G. S. and O’Donnell, V. 1999. Propaganda and Persuasion. London: Sage Publications.
- Kahneman, D. 2011. Thinking, Fast and Slow. London: Penguin Books.
- Margetts, H., John, P., Hale, S. and Yasseri, T. 2016. Political Turbulence: How Social Media Shape Collective Action. Princeton: Princeton University Press.
- Maynard, J. L. and Benesch, S. 2016. Dangerous Speech and Dangerous Ideology: An Integrated Model for Monitoring and Prevention. Genocide Studies and Prevention: An International Journal 9(3), 70-95.
- McLuhan, M. 1964. Understanding Media: The Extensions of Man. New York: McGraw-Hill.
- O’Neil, C. 2016. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Crown.
- Schwartz, T. 1973. The Responsive Chord. New York: Anchor Press.
- Shibutani, T. 1966. Improvised News: A Sociological Study of Rumor. Oxford: Bobbs-Merrill.
- Uscinski, J. and Parent, J. 2013. Why So Many Americans Believe Kennedy Assassination Conspiracy Theories. The Washington Post.
- Vosoughi, S., Roy, D. and Aral, S. 2018. The Spread of True and False News Online. Science 359, 1146-1151.
This work was funded by the Centre for Research and Evidence on Security Threats (CREST) as part of the ‘Soft Facts and Digital Behavioural Influencing’ project.