Facebook Encourages Radicalization and Polarization

This is not what Mark Zuckerberg intended: instead of connecting people around the world, Facebook has become a source of polarization and angst, where people come to hear ideas similar to their own and still leave frustrated
Mark Zuckerberg. Not a media company, so no need for ethics? Photo by Max Morse for TechCrunch

Ever since the Internet entered our lives, it has fulfilled the vision of the global village. When Facebook, the largest social media network in the world, opened to the general public in 2006, it seemed to take this vision one step further: not only did it give people access to information from all over the world, it also brought the opportunity to communicate with different people from around the globe. For many, Facebook was seen as a promoter of pluralism, a platform that puts the spotlight on an array of diverse ideas, even the less popular ones. But much has changed since then, and a decade later, Facebook and pluralism are something of an oxymoron.

In June 2016, three American researchers confirmed what we already knew, or at least suspected, about social media. Their article presented quantitative evidence that social network users tend to promote their own beliefs and narratives and to form polarized groups that resist information incompatible with their worldviews. The study found that users belonging to different communities tended not to interact with one another, remaining connected only to friends whose views resembled their own.

In this way, closed communities centered on different narratives formed what the researchers called an "echo chamber": within these communities, users held similar views, and there was no discussion or dialogue between members of different communities.

Enjoy Your Peaceful Echo Chamber

According to the study, Internet users tended to search for information that strengthened their personal views and to reject opposing information. When false information is deliberately inserted into these "echo chambers," it is absorbed and appears reliable as long as it fits the initial narrative. Translated into everyday surfing on Facebook, this means that we hide, and even remove from our friends list, anyone whose views differ from our own. On paper we seem enlightened and liberal: some of our Facebook friends are on the opposite side of the political map, we have friends from all over the world, and we are open to hearing a variety of opinions, but only as long as those opinions are similar to ours.

During difficult times the issue was particularly striking. Operation Tzuk Eitan (Protective Edge), for example, proved that another war was taking place in the social networking arena. A study by Dr. Nicholas John, a lecturer in the Department of Communication and Journalism at the Hebrew University of Jerusalem, examined how about 1,000 Jewish Facebook users used the platform during the war and found that one in six unfollowed or "unfriended" a Facebook friend. 60% did so after the publication of content they did not agree with, and 52% because they encountered posts they defined as "offensive." It was also found that users with more extreme political views, on both the left and the right, tended to block or delete more friends than those with more neutral political views.

“Facebook mainly gives us the ability to forget that there are people who think differently than we do, and even when we already know that there are those who think differently, it does not really change our opinion,” explains Dr. John. “People marry people who are similar to them and connect with people similar to them. This is expressed on Facebook as well. While it is possible, through the Internet and Facebook, to understand how other cultures operate and how other people think, the user decides on his own whether others are acting and thinking in a way he deems right. I can be exposed to someone else, but that doesn’t mean I’ll accept him more openly.” In other words, theoretically we could share posts with a user from Afghanistan, but in practice, we don’t.

“The belief that technology has the potential to bring people closer together is an illusion,” John continues. “That is what they said about the telegraph in the 19th century: that people would better understand one another, and then wars would no longer exist. They said it about television, too, that children would be exposed to diverse characters and would in turn become more tolerant. In practice, it just doesn’t work that way. If we do expose ourselves to opinions that differ from our own, we do so only to strengthen our own. That is exactly why a left-wing Internet user goes to the Facebook page of the right-wing rapper ‘The Shadow’, to strengthen his own opinions, and that is also why a right-wing person will surf the “Haaretz” website.”
