Stories are central to our society: they have been our way of exchanging and remembering knowledge for thousands of years. Stories and their narratives create knowledge, and whoever controls narratives can control knowledge, with important consequences for society. The extent of that control stems from the authority one represents. For example, anti-Islamic propaganda during the crusades of the Middle Ages was issued by the church, the most powerful institution in society at the time. Stories can also be used to eradicate certain forms of knowledge: popular tales of witches were used to facilitate the hunting and execution of female doctors and botanists, and much of the knowledge they held was lost with them (Chollet, 2018). The question “who controls the story?” is therefore crucial to understanding the power dynamics at play behind the knowledge we consume.
With the development of Information and Communication Technologies (ICTs), our approach to knowledge changed. In the course “Machines of Knowledge”, we discussed the evolution of the internet, including its role as a space to share stories and create knowledge. In the assignment “Who controls the narrative?”, we discussed how players such as Facebook and Google dominate the online information market and exert disproportionate control over the information we receive. They control and curate vast amounts of information using algorithms. These algorithms work in different ways, but all aim to channel information deemed relevant to the user, drawing on stated preferences, search history, or even location and friends’ preferences (Bozdag, 2013).
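As an illustration of the principle (not a reconstruction of any real platform’s ranking system), a personalization algorithm of this kind can be sketched as a scorer that builds a profile from those signals and orders a feed by how well each item matches it. All names, weights, and topics below are hypothetical:

```python
# Illustrative sketch of preference-based feed ranking, assuming simplified
# inputs: explicit preferences, search history, and friends' interests.

def build_profile(preferences, history, friends_interests):
    """Combine signals into per-topic weights (weights chosen arbitrarily)."""
    weights = {}
    for topic in preferences:
        weights[topic] = weights.get(topic, 0.0) + 1.0   # explicit preferences weigh most
    for topic in history:
        weights[topic] = weights.get(topic, 0.0) + 0.5   # past searches weigh less
    for topic in friends_interests:
        weights[topic] = weights.get(topic, 0.0) + 0.25  # friends' interests weigh least
    return weights

def rank_feed(items, profile):
    """Order items by the summed weight of the topics they cover."""
    def score(item):
        return sum(profile.get(topic, 0.0) for topic in item["topics"])
    return sorted(items, key=score, reverse=True)

items = [
    {"title": "Local election recap", "topics": ["politics"]},
    {"title": "New sourdough technique", "topics": ["baking"]},
    {"title": "Transfer window rumours", "topics": ["football"]},
]
profile = build_profile(
    preferences=["baking"],
    history=["baking", "politics"],
    friends_interests=["football"],
)
feed = rank_feed(items, profile)
```

Note how items whose topics never appear in the profile sink to the bottom of the feed and are eventually never seen, which is precisely the filtering mechanism behind the filter bubbles discussed next.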
This algorithmic tailoring of the stories we have access to can have harmful consequences. For example, algorithms have been found to create filter bubbles and echo chambers. Eli Pariser, who first introduced the concept, describes filter bubbles as
“information intermediaries [that] silently filter out what they assume the user does not want to see, hiding information posted by [the] opposite end of [the] political spectrum”
— Eli Pariser (quoted in Bozdag, 2013, p. 218)
Echo chambers refer to a similar phenomenon, in which users interact only with other users who share their opinions, reinforcing their common perspectives without considering views they perceive as challenging or critical. Both phenomena have been linked to the rise of political polarisation (Spohr, 2017). What started as a tool to sort through information efficiently has thus become one that polarises our society.
One might think that the time of the crusades, when a population would blindly believe a polarised narrative in its most extreme form, “the good against the evil, the light against the darkness”, is centuries behind us. However, the impact stories can have on our societies is magnified when they are shared through the internet, where they are available globally and instantly. Issues like fake news and the rise of political extremism around the world suggest that we are still at the mercy of certain narratives.
Spaces for debate and conversation between opposing opinions do exist on the internet (if you want to know more, read this post). And although some users do manage to break out of their filter bubbles, deliberate efforts must be made to circumvent filter bubbles and echo chambers and to offer equal access to all kinds of stories online.
References:
- Chollet, M. (2018). Sorcières – La puissance invaincue des femmes. Zones.
- Bozdag, E. (2013). Bias in algorithmic filtering and personalization. Ethics and Information Technology, 15(3), pp. 209-227.
- Spohr, D. (2017). Fake news and ideological polarization: Filter bubbles and selective exposure on social media. Business Information Review, 34(3), pp. 150-160.