T.J. Coles, postdoctoral researcher at Plymouth University’s Cognition Institute and author of a new book, The War on You, reflects on the mobilization of big data, propaganda and online chicanery in the run-up to November’s US presidential election.
President Donald J. Trump is threatening to stay in office if he loses the US election in November. Analysts expect the controversy to spill over into January 2021 and be fought in the Supreme Court, where, they fear, the Republican majority would rule in Trump’s favour and allow him to be sworn in for a second, undeserved term. Stacking the judiciary and refusing to accept election results are straight out of the autocrat’s playbook.
The Transition Integrity Project (TIP) consists of establishment figures from the so-called left and right alike, including warmongers like Bill Clinton’s former Chief of Staff John Podesta and the George W. Bush-era neocon William Kristol. Yet even TIP has been wargaming scenarios in which Trump loses but declines to hand power to the Democrats peacefully. Meanwhile, retired military officers have reminded the Joint Chiefs of Staff of their constitutional duty to forcibly remove Trump if he clearly loses but refuses to leave.
Trump-supporting militias and paramilitaries, many of which include far-right elements like the Proud Boys and the Boogaloo Boys, are preparing for civil war against Black Lives Matter and Antifa, which they falsely see as Democrat proxies. The paramilitaries are arguably being stirred up from many directions, including by Trump’s former advisor and War Room host Stephen K. Bannon, who is telling listeners that Trump’s COVID-19 infection is a plot by the Chinese Communist Party. Alex Jones of InfoWars had previously called for civil war and is now whipping up hysteria by claiming that the Democrat-supporting Deep State may have poisoned the President.
In the widely panned “debate” with his challenger, Joe Biden, Trump told the Proud Boys, among his armed supporters, to ‘stand back and stand by.’
Fake news alone did not create Trump or the current controversies, but it certainly helps him and fuels division. In my new book, The War on You, I document how big business and government use information, money, law, and psychology to influence your behaviour in their own interests. What follows is an excerpt on fake news and how bots can radicalize enough swing voters to influence elections in ways desired by right-wing factions.
Bots are getting so sophisticated that it is becoming increasingly difficult to tell which online comments come from real people and which from algorithms, and which sites are genuinely popular and which merely receive fake hits generated by bots. Internet-based ‘fake news’ has become a phenomenon in its own right. Bots push fake stories toward virality by circulating them among networks of fake accounts (‘sock puppets’) on social media, as the sketch below illustrates.
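To see how little machinery this requires, consider a minimal Python sketch of the amplification loop. Everything in it, from the account names to the size of the ring and the readership figure, is invented for illustration; it targets no real platform or API.

```python
import random

# A minimal sketch of coordinated amplification: a small ring of
# sock-puppet accounts reshares one story among themselves until its
# raw share count dwarfs its genuine readership. All names and numbers
# here are invented for illustration.
SOCK_PUPPETS = [f"bot_{i:02d}" for i in range(20)]
ROUNDS = 10

share_log = []
for _ in range(ROUNDS):
    for bot in SOCK_PUPPETS:
        # Each puppet reshares the story from another puppet in the
        # ring, manufacturing the appearance of organic spread.
        source = random.choice([b for b in SOCK_PUPPETS if b != bot])
        share_log.append((source, bot))

genuine_readers = 5  # assumed handful of real people who actually read it
print(f"{len(share_log)} shares from {len(SOCK_PUPPETS)} fake accounts, "
      f"versus roughly {genuine_readers} genuine readers")
```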
So, if few real people are reading the stories, what’s the harm? Well…
In 2011, a team at Texas A&M University created gibberish-spewing Twitter accounts. Their nonsense could not possibly have interested anyone, yet the accounts soon had thousands of followers, which the team found were themselves bots. In 2017, under a Pentagon grant, Shao and colleagues analysed 14 million tweets spreading 4,000 political messages during the 2016 US presidential campaign. They found that ‘[a]ccounts that actively spread misinformation are significantly more likely to be bots’.
Fake news, they say, includes ‘hoaxes, rumors, conspiracy theories, fabricated reports, click-bait headlines, and even satire’. Incentives include sending ‘traffic to fake news sites [which] is easily monetized through ads, but political motives can be equally or more powerful’. During the presidential campaign, the popularity profiles of fake news turned out to be indistinguishable from those of fact-checking articles. The authors note that, ‘for the most viral claims, much of the spreading activity originates from a small portion of accounts’. These so-called super-spreaders of fake news are likely to be ‘social bots that automatically post links to articles, retweet other accounts, or perform more sophisticated autonomous tasks’. Regional vote shares for then-presidential candidate Donald Trump did not match the geographical locations of (likely) bot accounts. Though unconfirmed, it is likely ‘that states most actively targeted by misinformation-spreading bots tended to have more surprising election results’.
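The concentration the authors describe is easy to picture with a toy example: if a few automated accounts each share a claim hundreds of times while real users share it once, the ‘virality’ is almost entirely manufactured. The figures below are invented for illustration, not data from the study.

```python
from collections import Counter

# Toy illustration of super-spreader concentration. All account names
# and share counts below are invented; they are not data from the study.
shares = (["@bot_alpha"] * 400 + ["@bot_beta"] * 350 + ["@bot_gamma"] * 150
          + [f"@user_{i}" for i in range(100)])  # 100 humans, one share each

counts = Counter(shares)
total = sum(counts.values())
top_three = sum(n for _, n in counts.most_common(3))
print(f"Top 3 of {len(counts)} accounts produced {top_three / total:.0%} of shares")
```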
Other researchers, Ratkiewicz and colleagues, argue that Twitter has a structural bias toward fake news because its (then) 140-character sound bites ‘are ready-made headline fodder for the 24-hour news cycle’, while Ferrara et al. write that bots can ‘engage in … complex types of interactions, such as entertaining conversation with other people, commenting on their posts, and answering their questions’. The authors go on to note that bots ‘can search the Web for information and media to fill their profiles, and post collected material at predetermined times, emulating the human temporal signature of content production and consumption’, timing their activity to spike when a human’s would.
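Emulating that ‘human temporal signature’ is trivial to automate. The sketch below shows how a bot might weight its posting times toward human waking hours; the hourly weights are an assumed activity profile, not measured data.

```python
import random

# A sketch of a bot mimicking a human diurnal rhythm: posts cluster in
# waking hours, with morning and evening peaks. The hourly weights are
# an assumed activity profile, not measured data.
HOURLY_WEIGHTS = [1, 1, 1, 1, 2, 4, 8, 12, 10, 8, 7, 7,
                  8, 8, 7, 7, 8, 10, 12, 14, 12, 8, 4, 2]  # index = hour of day

def schedule_posts(n_posts: int) -> list[str]:
    """Draw HH:MM posting times weighted by the assumed activity curve."""
    hours = random.choices(range(24), weights=HOURLY_WEIGHTS, k=n_posts)
    return sorted(f"{h:02d}:{random.randrange(60):02d}" for h in hours)

print(schedule_posts(5))  # e.g. ['07:41', '09:03', '18:17', '19:55', '21:28']
```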
So, how does this influence public opinion?
In 2012, scientists working for the Center for Tobacco Control Research and Education at the University of California, San Francisco, exploited nearly 700,000 Facebook users by making them participate in an experiment without their knowledge or consent. ‘The experiment manipulated the extent to which people … were exposed to emotional expressions in their News Feed’, says the research paper. The experiment ‘tested whether exposure to emotions led people to change their own posting behaviors’.
The two parallel experiments involved 1) reducing friends’ exposure to positive content and 2) reducing their exposure to negative content. ‘[F]or a person for whom 10% of posts containing positive content were omitted, an appropriate control would withhold 10% of 46.8% (i.e., 4.68%) of posts at random, compared with omitting only 2.24% of the News Feed in the negatively-reduced control’. The authors go on: ‘As a secondary measure, we tested for cross-emotional contagion in which the opposite emotion should be inversely affected: People in the positivity reduced condition should express increased negativity, whereas people in the negativity reduced condition should express increased positivity’.
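The arithmetic behind those control conditions is simple to reproduce. In the sketch below, the 46.8% positive base rate is quoted in the excerpt above, and the 22.4% negative base rate is inferred from the quoted 2.24% figure (2.24% is 10% of 22.4%):

```python
# Worked arithmetic for the control conditions quoted above. The 46.8%
# positive base rate appears in the quotation; the 22.4% negative base
# rate is inferred from the quoted 2.24% control figure.
positive_share = 0.468  # fraction of News Feed posts with positive content
negative_share = 0.224  # fraction with negative content (inferred)
omission_rate = 0.10    # 10% of the relevant emotional posts withheld

print(f"Positivity-reduced control omits {omission_rate * positive_share:.2%} of the feed")
print(f"Negativity-reduced control omits {omission_rate * negative_share:.2%} of the feed")
```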
The results concerning this so-called emotional contagion were statistically minuscule, with effect sizes as small as 0.001. But, as the authors point out, given ‘the massive scale of social networks such as Facebook, even small effects can have large aggregated consequences’. This, they theorize, equates ‘to hundreds of thousands of emotion expressions in status updates per day’ (a back-of-envelope version of that arithmetic follows below). Data players like Dominic Cummings, a friend to hedge funds and private equity firms who famously helped to engineer Brexit and is now advising—some say pulling the strings of—Boris Johnson, understand that elections and referenda can be won by using targeted propaganda to radicalize a small number of previously indifferent people, just large enough to have electoral impact.
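Here is that back-of-envelope calculation; the daily post volume is an assumed round number for illustration, not a figure from the study:

```python
# Back-of-envelope: a tiny per-post effect, multiplied across a
# platform-scale volume of posts, yields large absolute numbers.
# The daily volume is an assumed round figure, not data from the study.
daily_status_updates = 500_000_000  # assumed order of magnitude
effect_size = 0.001                 # effect size reported in the study

print(f"~{daily_status_updates * effect_size:,.0f} emotion expressions shifted per day")
```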
Promoted by influential Trump supporters, QAnon is a conspiracy theory holding that the President is secretly destroying elite child abuse rings. But QAnon could also be a cynical voter-mobilization tool. The Trump propaganda campaign gets potential voters to care about the Deep State, Democratic Party corruption, and supposed Chinese global domination; right-leaning non-voters who don’t care about these things are instead being mobilized to vote for Trump via QAnon promoters, who stoke their sense of moral outrage.
This goes to show that big data, hidden agendas and fake news can be weaponized by wealthy elites to motivate hitherto politically indifferent people to back causes that are ultimately against their and everyone else’s interests.
Image by Lorie Shaull, re-used under a Creative Commons Attribution-Share Alike 2.0 Generic licence, via Wikimedia Commons.