The struggle between truth and lies on Twitter was evenly matched as the U.S. election campaign drew to a close, according to a major study on “computational propaganda”.
The Oxford Internet Institute analysis [in PDF form for download] showed that in the last 10 days of the campaign, misinformation on Twitter matched real information 1:1. It’s a significant piece of data to fuel the debate over the so-called weaponisation of news in the 2016 election race and the role played by platforms such as Twitter and Facebook.
“We have a new form of propaganda which we call computational propaganda. It is faster, more targeted and it’s personal,” Lisa-Maria Neudert, a researcher on the project, told MisinfoCon — a London event backed by the Mozilla Foundation, which supports the Firefox browser and other open-information initiatives.
U.S. Twitter users were sharing links to information and misinformation in equal amounts, the report found.
Neudert said the investigation showed propaganda is more targeted than ever before, and that it is created not only by humans but also by bots.
Executives from Twitter, Facebook and Google are due to appear before committees of the U.S. Senate and House this week to face questions over the extent to which Russia may have used their services to influence the election with the targeted distribution of misinformation.
That “weaponisation of information” is a known part of a Russian strategy of “hybrid war”. Information scientist Rand Waltzman of the RAND Corporation published a study this year noting that the chief of the Russian General Staff, General Valery Gerasimov, “observed that war is now conducted by a roughly 4:1 ratio of nonmilitary and military measures,” including information operations. In his study Waltzman said “Russia considers itself in a perpetual state of information warfare, while the West does not.”
Neudert told MisinfoCon that the Oxford data showed the amount of misinformation shared in the U.S. election was significantly higher than in recent French, UK and German elections. In those three elections the ratio of information to misinformation was 7:1, 4:1 and 4:1 respectively, compared with one-to-one in the U.S.
In other words, according to the Institute (a department of Oxford University), half of the news being shared on Twitter in the U.S. prior to the election was misinformation. This finding was based on over one million tweets containing links, from users whose location information confirmed they were in the U.S.
In comparison, misinformation made up only 12.5 percent of the news shared on Twitter in the recent French election, and 20 percent in the UK and German elections, she said.
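Those percentages follow directly from the ratios the researchers quote: a 7:1 ratio of information to misinformation means one link in eight is misinformation. A quick sketch of the arithmetic (the function name here is illustrative, not from the report):

```python
def misinformation_share(info: int, misinfo: int) -> float:
    """Misinformation as a percentage of all news links shared."""
    return 100 * misinfo / (info + misinfo)

# Ratios reported by the Oxford Internet Institute study.
ratios = {"U.S.": (1, 1), "France": (7, 1), "UK": (4, 1), "Germany": (4, 1)}

for country, (info, misinfo) in ratios.items():
    print(f"{country}: {misinformation_share(info, misinfo):.1f}% misinformation")
# U.S. 1:1 gives 50%, France 7:1 gives 12.5%, UK and Germany 4:1 give 20%.
```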
Neudert said she suspected that Americans were more susceptible to sharing misinformation because they were seeking news from sources other than the mainstream media which many deeply mistrusted. That was particularly true in highly contested “swing” states.
“Average levels of misinformation were higher in swing states than in uncontested states,” said the report, Social Media, News and Political Information during the US Election: Was Polarizing Content Concentrated in Swing States? “The proportion of professional news content being shared hit its lowest point the day before the election.”
Of the news shared on Twitter in the 10 days leading up to the 2016 U.S. election, only 20 percent of links shared with election-related hashtags came from professional news organizations. Links to content produced by government agencies, experts or political parties and candidates made up just 10 percent.
The number of links to ‘polarizing and conspiracy content’ – which consisted of Russian news stories, junk news and unverified or irrelevant links to WikiLeaks pages – was greater than the number of links to professionally published news, the report showed. West Virginia received the highest level of ‘polarizing and conspiracy content’, while North Dakota received the lowest.