Jennifer Scott

One decisive consequence of the Jan. 6 storming of the US Capitol by Trump supporters was the permanent suspension of the @realDonaldTrump account on Twitter “due to the risk of further incitement of violence” (Twitter Blog, Jan. 8, 2021), as well as the temporary suspension of his social media accounts on Facebook, Instagram, and YouTube, among others.

Communications professionals know the power of words, especially words amplified by influential people on platforms with massive reach. We understand that checks and balances are at the heart of deploying words in responsible and accountable ways. We have our own code of ethics. We work with journalists who have their own code of ethics. Common among them is a commitment to honesty, accuracy, and truth.

Despite this ethical foundation, we have joined the rest of the world in scrambling to respond ethically to the meteoric rise of social media as a communications platform. In 15 short years, platforms like Twitter and Facebook have transformed from vehicles used by friends and family to keep in touch, to algorithmically driven marketing machines that harvest massive amounts of personal information and monetize engagement. They have profoundly impacted social discourse and have contributed to divisiveness and the propagation of misinformation.

A 2018 MIT study of Twitter (MIT News, March 8, 2018) found that false stories were 70 percent more likely to be retweeted (by people, not bots) than true stories, and that true stories took six times longer than false stories to reach the same number of people. Of all news categories, political falsehoods traveled the fastest and farthest. The researchers suggested that this virality of misinformation may arise because humans respond to false news with strong emotions, such as surprise and disgust, and this emotional “novelty” may drive accelerated sharing.

Social media platforms benefit financially from the kind of information that generates maximum engagement. The MIT study shows that false stories are significantly better at driving engagement than true stories. Misinformation can be monetized far more readily than truth.

On social media platforms, the virality of misinformation is also enhanced by the selective targeting of consumers through algorithms, which feed the most “relevant” content (including PR and advertising) to users based on interests harvested from their search histories. That is one thing if I am searching for a new vacuum cleaner and targeted ads appear in my feed. It is quite another when the content pushed to me includes misinformation selected to appeal to my political and ideological convictions.

Furthermore, as professional communicators know, confirmation bias is a powerful driver of selective attention. We have a strong preference to embrace content that reinforces our existing views, and to reject information that counters them. The algorithmic targeting of audiences with their preferred content on social media platforms reinforces confirmation bias.

It is becoming clear how human nature can combine with the business model of social media to allow users to inhabit increasingly partisan online worlds, reinforced by misinformation, where views are radicalized rather than tempered. This is an environment ripe for those who would sow discord and extremism, which is why suspending the president from social media was like extinguishing a match without doing much to drain away the giant pool of gasoline.

It’s up to us as communicators to recognize this situation for what it is and to reassess how we are using social media on behalf of our organizations and clients. We need to look carefully at the ways in which personal data (especially psychographics) are being harvested and used for the selective targeting of advertising or PR. We need to take a fresh look at the stories we tell to check that we are not inadvertently pandering to ideological confirmation bias. And we need to examine more diligently the situations where we might be tempted to prioritize “reach” over accuracy or “engagement” over balanced discourse. In this way, we can do our part to dilute that pool of gasoline and restore social media as a vibrant, diverse, and healthy communications environment.

****

Jennifer Scott is clinical assistant professor, PR and corporate communications at New York University's School of Professional Studies. She was previously managing director, thought leadership at Ogilvy.