
Why social media may not be so good for democracy

 


Courtesy of Gordon Hull, University of North Carolina – Charlotte


Some of the Facebook and Instagram ads used in the 2016 election, released by members of the U.S. House Intelligence Committee. AP Photo/Jon Elswick

Recent revelations about how Russian agents placed ads on Facebook in an attempt to influence the 2016 election raise a troubling question: Is Facebook bad for democracy?

As a scholar of the social and political implications of technology, I believe that the problem is not Facebook alone, but much larger: Social media is actively undermining some of the social conditions that have historically made democratic nation-states possible.

I understand that’s a huge claim, and I don’t expect anyone to believe it right away. But, considering that nearly half of all eligible voters received Russian-sponsored fake news on Facebook, it’s an argument that needs to be on the table.

How we create a shared reality

Let’s start with two concepts: an “imagined community” and a “filter bubble.”

The late political scientist Benedict Anderson famously argued that the modern nation-state is best understood as an “imagined community” partly enabled by the rise of mass media such as newspapers. What Anderson meant is that the sense of cohesion that citizens of modern nations felt with one another – the degree to which they could be considered part of a national community – was both artificial and facilitated by mass media.

Mass media is one way to create a shared community. Dave Crosby, CC BY-SA

Of course there are many things that enable nation-states like the U.S. to hold together. We all learn (more or less) the same national history in school, for example. Still, the average lobster fisherman in Maine doesn’t actually have that much in common with the average schoolteacher in South Dakota. But the mass media help them see themselves as part of something larger: that is, the “nation.”

Democratic polities depend on this shared sense of commonality. It enables what we call “national” policies – the idea that citizens see their interests as aligned on at least some issues. Legal scholar Cass Sunstein explains this idea by taking us back to the time when there were only three broadcast news outlets and they all said more or less the same thing. As Sunstein says, we have historically depended on these “general interest intermediaries” to frame and articulate our sense of shared reality.

Filter bubbles

The term “filter bubble” was coined by internet activist Eli Pariser, who used it in his 2011 book “The Filter Bubble” to characterize an internet phenomenon.

Legal scholar Lawrence Lessig, like Sunstein, had identified this phenomenon of group isolation on the internet as early as the late 1990s. Inside a filter bubble, individuals receive only the kinds of information that they have either preselected or, more ominously, that third parties have decided they want them to hear.

The targeted advertising behind Facebook’s newsfeed helps to create such filter bubbles. Advertising on Facebook works by determining its users’ interests, based on data it collects from their browsing, likes and so on. This is a very sophisticated operation.

Facebook does not disclose its algorithms. However, research led by Michael Kosinski, a psychologist and data scientist at Stanford University, demonstrated that automated analysis of people’s Facebook likes could identify their demographic information and basic political beliefs. Such targeting can also apparently be extremely precise. There is evidence, for example, that anti-Clinton ads from Russia were able to micro-target specific voters in Michigan.
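To make the general idea concrete, here is a minimal sketch of how a hidden trait such as political leaning can be predicted from a binary matrix of page likes. It uses synthetic data and an off-the-shelf logistic regression from scikit-learn; it illustrates the kind of technique involved, not Kosinski’s actual study and not Facebook’s systems.

```python
# Illustrative sketch only: synthetic "likes" data, not any real platform's data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_users, n_pages = 2000, 300

# Each user has a hidden leaning (0 or 1) that slightly shifts which pages they like.
leaning = rng.integers(0, 2, size=n_users)
base_rate = rng.uniform(0.02, 0.15, size=n_pages)      # baseline like rate per page
partisan_shift = rng.normal(0, 0.05, size=n_pages)     # page-specific partisan skew
like_prob = np.clip(base_rate + np.outer(leaning * 2 - 1, partisan_shift), 0, 1)
likes = rng.binomial(1, like_prob)                      # binary user-by-page like matrix

X_train, X_test, y_train, y_test = train_test_split(
    likes, leaning, test_size=0.25, random_state=0)

# A simple linear model over the like matrix recovers the hidden trait.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

Even this toy model recovers the hidden label well above chance, which is why like data are so valuable for targeting ads.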

Is Facebook creating filter bubbles? sitthiphong/Shutterstock.com

The problem is that inside a filter bubble, you never receive any news that you do not agree with. This poses two problems: First, there is never any independent verification of that news. Individuals who want independent confirmation will have to actively seek it out.

Second, psychologists have known for a long time about “confirmation bias,” the tendency of people to seek out only information they agree with. Confirmation bias also limits people’s ability to question information that confirms or upholds their beliefs.

Not only that, research at Yale University’s Cultural Cognition Project strongly suggests that people are inclined to interpret new evidence in light of beliefs associated with their social groups. This can tend to polarize those groups.

All of this means that if you are inclined to dislike President Donald Trump, any negative information on him is likely to further strengthen that belief. Conversely, you are likely to discredit or ignore pro-Trump information.

It is this pair of features of filter bubbles – preselection and confirmation bias – that fake news exploits with precision.

Creating polarized groups?

These features are also hardwired into the business model of social media like Facebook, which is predicated precisely on the idea that one can create a group of “friends” with whom one shares information. This group is largely insular, separated from other groups.

The software very carefully curates the transfer of information across these social networks and tries very hard to be the primary portal through which its users – about 2 billion of them – access the internet.

Facebook depends on advertising for its revenue, and that advertising can be readily exploited: A recent ProPublica investigation shows how easy it was to target Facebook ads to “Jew Haters.” More generally, the site also wants to keep users online, and it knows that it is able to manipulate the emotions of its users – who are happiest when they see things they agree with.

Is social media creating more polarization? Chinnapong/Shutterstock.com

As the Washington Post documents, it is precisely these features that were exploited by Russian ads. As a writer at Wired observed in an ominously prescient commentary immediately after the election, he never saw a pro-Trump post that had been shared over 1.5 million times – and neither did any of his liberal friends. They saw only liberal-leaning news on their social media feeds.

In this environment, a recent Pew Research Center survey should not come as a surprise. The survey shows that the American electorate is both deeply divided on partisan grounds, even on fundamental political issues, and is becoming more so.

All of this combines to mean that the world of social media tends to create small, deeply polarized groups of individuals who will tend to believe everything they hear, no matter how divorced from reality. The filter bubble sets us up to be vulnerable to polarizing fake news and to become more insular.

The end of the imagined community?

At this point, two-thirds of Americans get at least some of their news from social media outlets. This means that two-thirds of Americans get at least some of their news from highly curated and personalized black-box algorithms.

Facebook remains, by a significant margin, the most prevalent source of fake news. Not unlike forced, false confessions of witchcraft in the Middle Ages, these stories get repeated often enough that they could appear legitimate.

What we are witnessing, in other words, is the potential collapse of a significant part of the imagined community that is the American polity. Although the U.S. is also divided demographically and there are sharp demographic differences between regions within the country, partisan differences are dwarfing other divisions in society.

This is a recent trend: In the mid-1990s, partisan divisions were similar in size to demographic divisions. For example, then and now, women and men would be about the same modest distance apart on political questions, such as whether government should do more to help the poor. In the 1990s, this was also true for Democrats and Republicans. In other words, partisan divisions were no better than demographic factors at predicting people’s political views. Today, if you want to know someone’s political views, you would first want to find out their partisan affiliation.

The reality of social media

Jason Howie, CC BY

To be sure, it would be overly simplistic to lay all of this at the feet of social media. Certainly the structure of the American political system, which tends to polarize the political parties in primary elections, plays a major role. And it is true that plenty of us also still get news from other sources, outside of our Facebook filter bubbles.

But, I would argue that Facebook and social media offer an additional layer: Not only do they tend to create filter bubbles on their own, they offer a rich environment for those who want to increase polarization to do so.

Communities share and create social realities. In its current role, social media risks abetting a social reality where differing groups could disagree not only about what to do, but about what reality is.

Gordon Hull, Associate Professor of Philosophy, Director of Center for Professional and Applied Ethics, University of North Carolina – Charlotte

This article was originally published on The Conversation. Read the original article.
