Tweeting the Terror
22 May 2014
Study finds Twitter love for Lee Rigby stronger than hate for his killers
On the anniversary of the murder of Lee Rigby, researchers at Cardiff University have published the results of their analysis of social media activity in the immediate aftermath of the attack, finding that messages loaded with racial tension and hate were far less likely to spread than those infused with love.
Carried out by academics at Cardiff University's Collaborative Online Social Media Observatory (COSMOS), this is the first analysis of its kind to examine the social media reaction to a terrorist attack. The team collected around half a million tweets and statistically modelled how the public reacted.
This was the first terrorist attack in the UK to provoke a significant social media reaction. Within 20 minutes of the incident being reported to the police, eyewitnesses were using Twitter to spread information about the event as it unfolded, and disturbing images and video clips emerged online shortly afterwards. These snippets of information were rapidly diffused through the social media ecosystem by the act of 'retweeting'.
The researchers were particularly interested in identifying the tweets most likely to spread following this event. They focussed on the emotive content of messages, such as negative and positive sentiment and racial tension, as well as content-linking features within messages, such as hashtags and URLs.
They also examined whether the speed at which tweets were retweeted affected the eventual number of retweets, and whether Twitter users with more followers had more influence on the spread of messages.
The results were surprising and did not support the common perception that social media platforms are havens for those spreading hateful and socially disruptive content online. On the contrary, the COSMOS team found that messages containing positive sentiment, such as tweets of good wishes to the family of Lee Rigby, were statistically more likely to be retweeted frequently and to form large, long-lasting information flows. They also found that tweets containing high levels of racial tension, such as those spreading hateful content towards people of the Muslim faith, were statistically less likely to be retweeted.
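The kind of modelling described above can be illustrated with a toy sketch. The code below is not the COSMOS team's actual model; it uses entirely synthetic data and a simple Poisson count regression, fitted with NumPy by gradient ascent, to show how retweet counts could in principle be related to content features such as positive sentiment and racial tension. All feature names, coefficients and data here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: synthetic data, not the study's model.
rng = np.random.default_rng(42)

n = 2000
positive = rng.binomial(1, 0.5, n)   # tweet carries positive sentiment
tension = rng.binomial(1, 0.2, n)    # tweet carries racial tension
X = np.column_stack([np.ones(n), positive, tension])

# Synthetic "ground truth": positive sentiment boosts retweets,
# racial tension suppresses them (mirroring the study's finding).
true_beta = np.array([1.0, 0.8, -1.2])
y = rng.poisson(np.exp(X @ true_beta))

# Fit a Poisson regression by gradient ascent on the log-likelihood:
# gradient = X^T (y - exp(X beta))
beta = np.zeros(3)
for _ in range(5000):
    beta += 0.05 * X.T @ (y - np.exp(X @ beta)) / n

print(beta)  # roughly recovers the signs of true_beta
```

On data generated this way, the fitted coefficient on positive sentiment comes out positive and the one on racial tension negative, which is the shape of result the study reports for real tweets.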
Dr Pete Burnap, School of Computer Science and Informatics, commented:
"Social media has often been associated with the spread of malicious and antagonistic content that could pose a potential risk to community relations. We frequently hear about trolling and social media being used to harass members of the public or certain groups in society. However, this research provides some evidence that suggests it is actually the more positive and supportive messages that spread to a significant extent following events of this nature."
These findings are the first to indicate that social media platforms, in particular Twitter, may self-regulate, stemming the flow of negative and hateful information following terrorist attacks and similar events of national interest. The next phase of the research for the COSMOS team is to investigate whether and how social media users engage in counter-speech to stem the spread of negativity online.
Dr Matthew Williams, School of Social Sciences, said:
"Social scientists at Cardiff University have been conducting research into how people behave online for more than three decades. Some of this work on virtual communities has shown how self-regulation, or what criminologists have called responsibilization, is evident online. It seems plausible that this pattern of behaviour is present in social media networks."
The COSMOS team now plan to apply the same statistical model to several more events, including the Boston bombings, the coming out of Olympian Tom Daley on YouTube, the Paralympic opening ceremony and the online harassment of Caroline Criado-Perez.
The paper is entitled 'Tweeting the Terror: Modelling the Social Media Reaction to the Woolwich Terrorist Attack' and will be published in the international peer-reviewed journal Social Network Analysis and Mining in June 2014.
The work is being conducted as part of the COSMOS Economic and Social Research Council (ESRC) Google Data Analytics grant 'Hate Speech and Social Media: Understanding Users, Networks and Information Flows'.
In addition to Dr Pete Burnap and Dr Matthew Williams, this project involves Professor William Housley, Dr Adam Edwards and Dr Luke Sloan from the School of Social Sciences; Professor Omer Rana from the School of Computer Science and Informatics; Professor Rob Procter from the University of Warwick; and Dr Alex Voss from the University of St Andrews.
In the Media
Since its release, this story has been picked up by the local and international press: