Filter Bubbles in the Information Society

How is life in your bubble? In mine, people travel a lot and have international friends; for them, the world is one big community. Most of them have a university education; many of my friends work in design or the creative industries, and many are scholars, researchers or psychologists. This is my Facebook bubble. I admit that I do not spend much time on Facebook, but I do check it almost every day. I have access to high-quality information because I have liked various pages (e.g. The Guardian, The Economist, BBC, Al Jazeera, La Repubblica, The Moscow Times), and issues such as climate change are highly prominent (I have liked a lot of environmental NGOs). I hate news about creepy crimes, so I filter out or avoid that information entirely. I also cannot stand intolerant, racist, populist discourse, so I try to avoid that too. It took me a month to adapt to the fact that Donald Trump had been elected, so I skipped mainstream media for the first two weeks. I guess that I am a bit intolerant myself.

A fragmented vision of reality

Luciano Floridi is a professor of the philosophy of information, a discipline concerned with the nature and dynamics of information. Given that we refer to the present age as the information society, this discussion is particularly relevant. Floridi warns us that by interacting exclusively with the people we like, and by exposing ourselves only to content that is in line with our system of values and beliefs, the world becomes a giant “me” that reflects our own vision back at us, as if we were looking in a mirror.

The greatest risk is for people who live in informational environments that are highly detached from reality. Online, people aggregate around common narratives, some are completely unaware of living in a bubble, and polarisation and ideological filtering can be pronounced. This increases intolerance, reduces critical thinking, and creates states of separation among people. In this sense, while the web aggregates people through social media, it is at the same time an agent of disaggregation. Furthermore, people are often unaware of this fragmentation of information, because automatic digital processes support the bubble architecture: we are increasingly exposed to tailored advertisements, pre-established choices, and suggestions that shape our perception.

Misinformation spreads quickly

The diffusion of ICT gave birth to the post-industrial information society, and the digital world, along with its new socialisation platforms, laid the foundations for new forms of information and communication. Today, the boundary between physical reality and the online world is blurred and difficult to trace. Mobile phones and computers are embedded in our experience, having moved from being mere tools to being essential components of our lives. Our everyday experience has a hybrid nature, as the physical and the digital constantly interact: Floridi calls this shift “onlife”.

An advantage of the digital shift is that information is more concise, allowing operations of conceptual simplification. However, this has produced an impoverishment as well as an improvement. Consider the naive responses given to complex issues such as vaccines, immigration, or red meat. The Internet offers an unprecedented availability of information, but also conspiracy theories, rumours and false news, creating dangerous forms of ignorance. More information does not equate to more knowledge.

Confirmation bias refers to the tendency to look for information that confirms our point of view and to ignore opposing information. By selecting only certain types of messages, we interpret information selectively, reinforcing our beliefs and ignoring dissimilar narratives. Confirmation bias is especially evident in the way we use social media. A study of the consumption of scientific news and conspiracy theories among Facebook users highlighted the presence of polarised clusters characterised by high levels of homogeneity. More importantly, the researchers found that homogeneity is the main driver behind content sharing. Again, this is also supported by automatic processes. Research has also observed the phenomenon of group polarisation on social media. Social psychology knows this phenomenon well: it occurs when people’s views are supported by other members of the same group, leading to an intensification of the belief in question. This is connected to the confirmation effect, since when we discover that others agree with us, we become more confident, eventually forming an opinion that may be a more radical version of the original belief.

The World Economic Forum has acknowledged digital misinformation as a major societal threat. The world is not linear, and articulated, nuanced worldviews are necessary to maintain democratic, dialogue-based societies. Information bubbles create illusory and biased worldviews, fostering intolerance among groups rather than tolerant attitudes and the sharing of opinions. We should all remain critical of our own ideas, and make an effort to express and disseminate more tolerant and informed content.