Filter Bubbles in the Information Society

How is life in your Facebook bubble? In mine, people travel a lot and have international friends, and the world is a big community; most people have a university education; and most share liberal values of multiculturalism and social equality. My bubble provides me with a variety of quality information because I ‘liked’ different pages (e.g. The Guardian, BBC, Al Jazeera, La Repubblica, The Moscow Times), and issues such as climate change are highly prominent (I follow several environmental NGOs). My Facebook bubble works for me as a sort of international daily magazine. Then, of course, there are targeted ads aligned with my preferences (in most cases, at least). I detest news about creepy crimes, and magically there is no trace of such news in my bubble: the world is a nicer place, free from murders, serial killers, parricide and so forth.
My innate calmness is tested when I am faced with anti-environmental, racist, homophobic, or populist discourse (which I sometimes notice while scrolling through the comments on an article), so I preserve my well-being by not dwelling on it too much – just long enough to get an idea of the public response to a given news item or issue. In this case, too, I am actively working to stay inside my bubble.

A fragmented vision of reality

Luciano Floridi is a professor of the philosophy of information, a discipline concerned with the nature and dynamics of information. Given that we refer to the present age as the information society, this discussion is particularly relevant. Floridi warns us that by interacting exclusively with the people we like, and by exposing ourselves only to content that is in line with our system of values and beliefs, the world becomes a giant “me” that reflects our vision back at us, as if we were looking in a mirror.
The greatest risk is for those who live in informational environments that are highly detached from reality. Online, people aggregate around common narratives, and some are completely unaware of living in a bubble, so polarisation and ideological filtering can be pronounced. This increases intolerance, reduces critical thinking, and creates states of separation among people. In this sense, while the web aggregates people through social media, it is simultaneously an agent of disaggregation. Furthermore, people are often unaware of the fragmentation of information, because automatic digital processes support the bubble architecture, and we become increasingly exposed to tailored advertisements, pre-established choices, and suggestions that influence our perception.

Misinformation spreads quickly

The diffusion of ICT gave birth to the post-industrial information society and to the digital world, laying the foundations of new forms of communication such as social platforms. Today, the boundary between physical reality and the online world is blurred and difficult to trace. Mobile phones and computers are embedded in our physical experience, and have changed from being mere tools to being essential components of our lives. Our everyday experience has a hybrid nature, as the physical and the digital constantly interact: Floridi calls this shift “onlife”.

An advantage of the digital shift is that information has become more concise, enabling conceptual simplification. However, this has produced both an improvement and an impoverishment. Consider the naive responses to complex issues such as vaccines, immigration, or red meat. The internet provides an unprecedented availability of information, along with conspiracy theories, rumours and false news, creating dangerous forms of ignorance. In fact, more information does not equate to more knowledge.
Confirmation bias refers to the tendency to look for information that confirms our point of view and to ignore opposing information. By selecting only certain types of messages, we interpret information selectively, reinforcing our beliefs and ignoring dissimilar narratives. Confirmation bias is especially evident in the way we use social media. A study of the consumption of scientific news and conspiracy theories among Facebook users highlighted the presence of polarised clusters characterised by high levels of homogeneity. More importantly, the researchers found that homogeneity is the main driver behind content sharing. Again, this is also supported by automatic processes.
Research has also observed the phenomenon of group polarisation on social media. Social psychologists know well that this occurs when people’s views are supported by other members of the same group, leading to the intensification of a specific belief. This is related to the confirmation effect: when we discover that others agree with us, we grow more confident, eventually forming an opinion that may be a more radical version of the original belief. Group polarisation can be a serious issue, and the World Economic Forum has recognised digital misinformation as a major social threat.
The world is not linear, and articulated world views are necessary for the maintenance of democratic, dialogue-based societies. Information bubbles create illusory and biased views, fostering intolerance among groups rather than tolerance, or at least more open attitudes and opinion sharing. We should all be careful with our ideas, and make an effort towards the expression and dissemination of better-informed content.