How astroturfing creates the illusion of authenticity and faux consensus.

By Brennen Mahon
10/07/2020 • 01:29 AM EST


When judging media's authenticity, we tend to trust organizations that appear legitimately committed to their message. This, however, may have little bearing on their relationship with the truth.[1] In fact, when political systems suffer a crisis of legitimacy, flagrant liars and norm-violators are often perceived as more authentic than truth-tellers,[2] which matters because the presumption of authenticity ultimately helps determine which media we come to trust.

Social media is the perfect environment in which to manufacture this illusion of authenticity, using a technique known as astroturfing: the practice of creating an illusion of widespread grassroots support for a policy or an individual where little such support exists. Astroturfing campaigns are effective only when their pages appear authentic, so astroturfers establish credibility with their target audience by recognizing its cultural experiences.[2] These pages groom their audience, then exploit that trust by inserting polarizing messages and disinformation into their followers' timelines.


Russian-controlled Twitter accounts: @BlackNewsOutlet and @USA_Gunslinger.

Once trust is established, it's easier to spread disinformation. During the 2018 U.S. midterm elections, Iranian and Russian astroturfing campaigns used Facebook to sow confusion and promote hyperpartisan narratives.[6] With social media, astroturfers are able to use bots and fake accounts to muddy the waters (bringing up irrelevant facts to confuse or complicate an issue that would otherwise be relatively simple and easy to understand) on a massive scale. These accounts traverse social media, spreading false or misleading information to quickly complicate seemingly simple issues, all while looking and acting like passionate users expressing genuine opinions.[7]

Astroturfers also manufacture Facebook pages designed to look as if they sprang up organically from real individuals dedicated to a cause. In 2016, the St. Petersburg-based Internet Research Agency (IRA) sowed further discord through #BlackLivesMatter astroturfing accounts that tweeted hateful anti-police messages.[8] Russian-controlled "right-wing" accounts followed suit, labeling the BLM movement communist and anti-American to further polarize both sides of the issue.[8] Our receptiveness to authenticity and the social literacy of Russian troll factories work together to shape public opinion and fuel further polarization and political extremism.


Comment posts are another way to create an illusion of authenticity and consensus. This post originated from a group using 27 unique IP addresses, all located in St. Petersburg, Russia. The accounts work in tandem, posting and boosting one another's posts (note the number of likes).


Astroturfing is also rooted in bandwagoning (creating social pressure to conform by promoting a sense of inevitable victory), in which an illusion of consensus is manufactured to promote conformity with what is believed to be popular opinion. During the 2016 election, Russian bot accounts posing as midwestern swing-voters targeted other swing-voters with fake news spammed across Instagram.[9] Coming from apparently like-minded individuals, these posts passed as genuine public opinion and were shared across social media despite their manufactured origin. Bots were also used to buy likes, further propelling inauthentic content into public view and creating social pressure to conform to a falsely manufactured consensus.

In the digital era, propaganda is more influential, more participatory, and more difficult to recognize. Manufactured authenticity exploits perceptions of trust in online communities to promote narratives and, in some cases, to normalize views people would otherwise oppose. Until platforms are willing to seriously confront the role such content plays in public discourse, astroturfing campaigns will remain a potent weapon for shaping public opinion.

References
1. George E. Newman, The Psychology of Authenticity. Review of General Psychology. Published: 2019.
2. O. Hahl, M. Kim, and E.W. Zuckerman, The Authentic Appeal of the Lying Demagogue: Proclaiming the Deeper Truth about Political Illegitimacy. American Sociological Review. Published: 2018.
5. "Exposing Russia's Effort to Sow Discord Online: The Internet Research Agency and Advertisements". U.S. House of Representatives Permanent Select Committee on Intelligence. Published: February 16, 2018.

6. "Facebook Identifies New Influence Operations Spanning Globe". The New York Times. Published: August 21, 2018.

7. "Cyborgs, Trolls and Bots: A Guide to Online Misinformation". Voice of America. Published: February 08, 2020.

8. A. Arif, L.G. Stewart, and K. Starbird, Acting the Part: Examining Information Operations Within #BlackLivesMatter Discourse. Proceedings of the ACM on Human-Computer Interaction. Published: 2018.