Fake News for the Masses
Dr. David Levin, a lecturer at the School of Communications at the College of Management of the Open University, agrees. “We accept a press whose basis is very ‘Tel Avivish,’ that is more or less in the opposition, and when that’s reflected on Facebook, we’re sure it’s reality, it’s the truth,” he says. “But we do not see and cannot see many things, for example the zeal and motivation of right-wingers to go vote on election day.” A similar situation occurred in the last presidential elections in the United States, but there, false news entered the picture: among the many reports and publications regarding the candidates in the race for the White House, which spread quickly on social media networks, there were also quite a few false news items.
The day following President Donald Trump’s election, the anger at the fictional reports was clearly directed at those who gave them a platform – Google and Facebook – with the claim that they influenced the election results. It was Facebook, more than Google, that attracted the fire and was accused by some political commentators in the United States of winning votes for Trump as a result of false publications. For example, it was reported that the pope expressed support for Trump, and that the FBI agent who investigated the Hillary Clinton affair murdered his wife and committed suicide. It was also reported that Barack Obama and Clinton intended to act leniently toward illegal immigrants who vote for Democrats.
Although the responsible news bodies strongly denied the reports, commentators claimed that these stories had already shaped public opinion before the elections – especially in light of the fact that 44% of adults in the United States are exposed to the media via Facebook, and many consume news from a small number of sources, all with an agenda they agree with. Mark Zuckerberg, founder and CEO of Facebook, did not ignore this criticism.
Although he claimed that more than 99% of the information people see on the social network is true, and rejected the idea that false information influenced voters, he still announced this January that the popular news topics Facebook presents to its users will no longer be personally tailored to them.
“Facebook takes this issue very seriously, at least on the informational level. Zuckerberg published a manifesto in which he spoke of how important it is for Internet surfers to be exposed to diverse opinions, even those they may not agree with. He believes that Facebook should act on this. The question is what they will actually do about it,” says Dr. Levin.
Why Were We Surprised When Trump Was Elected?
According to Dr. Levin, personalized news has existed in one form or another since the early days of the Internet. “You know in advance that you aren’t buying the whole cow, but only a glass of milk,” he says. “Unlike a print newspaper, you have the possibility of selectivity within digital newspapers. What’s more, Facebook takes active initiative – inserting into our Newsfeed only articles that seem interesting to us. There’s this interference of an external agenda in what we see, and this is an intensification of what is happening on the regular sites.”
He explains that today, people are much more aware of bias in the traditional media and are also much more suspicious of the media. “People read today with much more criticism, but Facebook is different – it’s difficult for us to identify ‘fake news’ because through Facebook we read content that looks exactly the same as what we’re used to. And if they are identical in content to what we consume, it’s all good.”
In addition, Dr. Levin explains, “When you read a newspaper, there is preoccupation around it. It becomes an activity often accompanied by a sort of ceremony. We devote time to this and even sometimes read an article contrary to our own beliefs. On the other hand, surfing on Facebook is almost like flipping through channels on TV – we don’t devote much time to what we’re reading, and we do this all day, flipping through Facebook using our mobile phone as the remote. It’s much easier, then, to block opinions we don’t agree with and hide behind our own keyboards.”
The result: the creation of a number of small virtual communities whose members share similar opinions and identical world-views. Dr. Levin calls them “small spaces.” “This refers to short talk spurts that have no place in traditional media,” he explains. “Anyone who takes part in these small spaces and receives only what he receives from his friends on Facebook is certain that this is reality, and then he becomes confused. The newspapers in traditional media present a certain situation which is appropriate for people who think the same way they do, but they are not aware of the ‘small spaces’ in which the media is deciphered in an opposing manner.”
“The question is: who was surprised? Trump’s supporters are ‘Archie Bunkers’ – the American middle class, the manual workers who feel that the government treats everyone well except them. They are hardly represented in traditional media, and when they are, it is often in a negative way. Research shows that they probably have their own digital places where they accumulate power and legitimacy, and they are also assisted by Facebook. This doesn’t happen because of Facebook, but Facebook does give them a platform and new content that encourages their own perception and creates a discourse that we, in the traditional media, are not exposed to.”
But the greatest danger of this conceptual closure is not necessarily the surprise that may occur on the day of counting votes after elections. The real danger of the “echo chambers” effect is anti-democratic: deep social polarization and gaps that will grow among different populations. When we, with or without help from the Facebook algorithm, choose to filter out the messages we receive from our Facebook friends, we encourage situations of social radicalization – extreme rightists, extreme leftists, extreme vegans, extreme pacifists, and all the rest – all the result of a divided social network that provides everyone with the most comfortable and suitable living and thinking environment, to the point of disgust and fear of the other. Extremists have always existed, but when we reinforce our positions among a large group of people, we are strengthening these positions.
“The sociological implications of small and closed Facebook groups are definitely extremism and polarization,” says Dr. Omri Herzog, Head of the Culture, Creation, and Production Department at Sapir College. “If there is a collective agreement on a certain opinion, it will always be extreme. If I’m a fan of a soccer team and am at the game with all of the other fans, I will intensify my admiration. In this way, not only am I not exposed to other positions, I also do not perceive that any other positions are legitimate. That’s why we see so much verbal violence on Facebook.”
“Many talk about Facebook in the context of multiculturalism, ostensibly the most democratic open arena there is. But in reality, Facebook encourages polarization – with the option of hiding, deleting, and blocking other members, and of course its filtering algorithm. In fact, there is no exchange of different ideas, which is Facebook’s potential, and we react to one another with hostility and a sense of threat when we are exposed to another person’s opinion. Many times, when someone seeks multiculturalism, this happens instead – different communities that dislike one another. A culture of hatred and threat develops.”
What Is a Friend?
Herzog claims that while it is impossible to prove a causal link between the two, the development of closed-minded thinking on Facebook coincides with various signs in our culture of violent behavior toward opinions different from our own. We encounter this, for example, in demonstrations that have become more and more violent, in political discussions, and even in the Knesset. “The discourse has become more heated and does not really exist between people from different areas,” he says. “When I meet face-to-face with someone different from me, I can’t act violently towards him. I can’t ‘unfriend’ him and make him disappear – and in the army or at school we meet people different from us. Facebook has changed the rules of the game. We’ve been in front of our computers and mobile phones for so long that Facebook has become a very central emotional prism in our day-to-day lives. Facebook changed the definition of the word ‘friend,’ and also the way we react to opinions different from ours.”
The radicalization, Herzog stresses, is not necessarily the result of opinions, but of the show of opinions: “I have to put on a show in order to get more likes, and the standard always gets higher and higher. So there’s much less patience and tolerance towards opinions different from ours. But regarding the question of how the online world affects us in the real world, there are still no answers. Although the subject has been investigated, there are different and contradictory findings.
“If, for example, I watch violent content, will I be violent in real life? Some will say that there is a very clear buffer between the two, because the online world is actually who we are: we can do whatever we want online and expose our true passions, but we wouldn’t actually go out into the real world and act on them. And there are others who would say that of course there is a direct relation between the two worlds. Does Facebook actually bring human nature to light, where in real life this human nature is censored and limited, or is Facebook creating its own dynamic, arousing in people feelings of jealousy, hatred, and fear because it promises no consequences? It’s difficult to know what came first.”