The rapid spread of new means of communication, and of the technologies that keep them running, immerses all of us in unfamiliar phenomena and terms whose essence and meaning are not immediately apparent. The consequences and side effects of many of these phenomena seem to be fully understood by no one. Left with a feeling of confusion, we plunge into the depths of the Internet to try to understand what we are dealing with. Yet the deeper we go, the more complicated this virtual reality becomes.
If you have noticed how quickly news sites offer you exactly the information you find helpful, then you have, without knowing it, fallen into a “filter bubble”, or “information bubble”. Indeed, every user sits inside his or her own information bubble. Eli Pariser, who first identified this phenomenon and described it in his book “The Filter Bubble: What the Internet is Hiding from You” [Pariser, 2011], argues that the Internet controls us: while providing information, it simultaneously restricts the information we receive. The Cambridge Dictionary defines the filter bubble as “a situation in which someone only hears or sees news and information that supports what they already believe and like, especially a situation created on the internet as a result of algorithms (= sets of rules) that choose the results of someone’s searches” [Cambridge Dictionary, n.d.].
The algorithms and filters used by the largest Internet companies show each visitor what is most relevant to him or her, creating a kind of “comfort zone”. Whether it is web browsing, news or shopping, companies such as Google and Facebook offer users the information that seems most appropriate and relevant. Pariser gives a simple example to illustrate his theory: when two people type the same single word into Google search – the name of a country – the first receives information about that country’s political system, while the second receives its tourist routes. Isolated in a limited information space, a person often does not even suspect that other information, choices and points of view exist. As soon as you go online, the algorithms of the platforms you use track what grabs your attention. They then serve content based on what they infer you like, refining their guesses until they reliably show each user the content he or she is most likely to consume. This process creates the filter bubble.
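To make this feedback loop concrete, here is a minimal sketch in Python. The topics, the “every shown item is clicked” assumption and the weight-boost rule are all invented for illustration, not any platform’s actual code; the point is only that, even from a neutral start, a few dozen interactions are enough for one topic to dominate the feed.

```python
import random

# A minimal sketch of the engagement feedback loop behind a filter bubble.
# Topics, the "always click" assumption and the boost rule are illustrative.

TOPICS = ["politics", "travel", "sport", "science", "culture"]

def pick_item(weights):
    """Sample the next recommended topic in proportion to estimated interest."""
    topics, probs = zip(*weights.items())
    return random.choices(topics, weights=probs, k=1)[0]

def simulate_feed(clicks=50, boost=1.5):
    """Each click on a topic boosts its weight, so the feed gradually narrows."""
    weights = {t: 1.0 for t in TOPICS}   # start with no known preferences
    for _ in range(clicks):
        shown = pick_item(weights)
        weights[shown] *= boost          # engagement reinforces the guess
    total = sum(weights.values())
    return {t: round(w / total, 3) for t, w in weights.items()}

print(simulate_feed())   # one topic typically ends up dominating the mix
```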
Many experts have described this phenomenon as online manipulation, that is, an attempt to influence individual decision-making and behavior, and to control and manage people, using information technology [Susser et al., 2019]. Such manipulation deliberately and covertly shapes people’s decisions in ways they are not aware of. In his book, Pariser explains how the modern Internet works and warns of the dangers of being isolated in the filter bubble. Search algorithms use large amounts of user data to find and present relevant information to the individual visitor each time this person goes online. Search and browsing history are a key piece of data used to tailor the results you get when you browse the web; social media activity, personal preferences, geography, email, mobile notifications and several other factors also affect the information a user finds.
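A toy illustration of how such signals could produce Pariser’s two-users example: the same set of search results is re-ranked per user. The result tags and the interest weights (supposedly inferred from history, location and so on) are invented for the sketch.

```python
# Same query, different users: results are re-ranked by per-user signals.
# Result tags and interest weights are invented for illustration.

RESULTS = [
    {"title": "Country X: government and elections", "tags": {"politics"}},
    {"title": "Country X: top ten tourist routes",   "tags": {"travel"}},
    {"title": "Country X: economy at a glance",      "tags": {"economy"}},
]

def rank(results, profile):
    """Order results by their overlap with the user's inferred interests."""
    return sorted(results,
                  key=lambda r: sum(profile.get(t, 0.0) for t in r["tags"]),
                  reverse=True)

news_reader = {"politics": 0.9, "economy": 0.4}   # built from browsing history
holidaymaker = {"travel": 0.8}                    # built from flight searches

print(rank(RESULTS, news_reader)[0]["title"])     # the political system first
print(rank(RESULTS, holidaymaker)[0]["title"])    # the tourist routes first
```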
On the one hand, these algorithms act as a navigator in the ocean of information, helping us quickly find data that would otherwise take much longer to locate, and we all value our time highly. On the other hand, by deepening our awareness in some areas, algorithmic systems cut off others, depriving us not only of information but, possibly, of an objective picture of a situation with its differences of opinion, alternative positions and important opportunities.
Being inside an information bubble can have very different consequences for different groups of people. Repetitive, intrusive advertising of products and services, for example, can either prompt a purchase or cause frustration and annoyance, affecting only the mood or budget of an individual or a family. E-commerce sites have long used recommendation systems to filter, sort and suggest their products, as do the media platforms we use on the Internet. Things get more complicated when the influence of algorithms spreads into public spheres: political discussion and political polarization, racial or religious differences in assessing social status, gender bias in the targeting of job advertisements, the degree of tolerance for AI control in people’s private lives, or the highly topical issue of vaccination. The consequences of distorted information, such as serving exclusively conservative news to conservative users and liberal news to liberal users, can lead to errors and conflicts at the level of society, to confrontation between social groups, and to failed governance decisions by the authorities. In April 2019, such concerns and the perceived need for oversight led a group of US Senators to propose the Algorithmic Accountability Act (AAA), expressing concern that “automated decision-making systems” could exhibit bias and discrimination, especially in matters of privacy and security [Booker, 2019].
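A recommendation system of the neighbourhood style that e-commerce sites rely on can be sketched in a few lines. The purchase data and the Jaccard similarity measure below are illustrative choices, not any particular retailer’s method: the idea is simply that you are shown what users with similar baskets have bought.

```python
# A minimal neighbourhood-style collaborative filter, with invented data.

purchases = {
    "alice": {"kettle", "teapot", "mug"},
    "bob":   {"kettle", "mug", "toaster"},
    "carol": {"teapot", "mug"},
}

def jaccard(a, b):
    """Similarity of two baskets = shared items over all items."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(user, data, top_n=2):
    """Suggest items bought by users whose baskets resemble this user's."""
    owned = data[user]
    scores = {}
    for other, basket in data.items():
        if other == user:
            continue
        sim = jaccard(owned, basket)
        for item in basket - owned:       # only items the user lacks
            scores[item] = scores.get(item, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("carol", purchases))      # ['kettle', 'toaster']
```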
However, experts argue that blaming these problems on algorithms and on the way algorithms are built ignores the second key component of the system: the people themselves, who feed a huge amount of data into the whole chain of actions, shaping its dynamics and direction [Hosanagar and Miller, 2020]. Moreover, empirical studies by Seth Flaxman (Oxford University), Sharad Goel (Stanford University) and Justin Rao (Microsoft Research) show that we live in echo chambers of our own making; algorithms only marginally reinforce these bubbles [Schwab, 2017]. By clicking “share”, “like” and “subscribe”, like-minded people gather in homogeneous groups that hold the same points of view and draw information from the same sources. Depending on the ideas and information circulating within them, such echo chambers can carry a significant positive charge or accumulate negative impulses, as various studies of the impact of information technology on society attest [Fondation Descartes, 2020].
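The self-made character of echo chambers can be illustrated with a small bounded-confidence simulation, a standard toy model of opinion dynamics (the tolerance and pull parameters below are illustrative assumptions): users adjust their opinion only toward peers they already broadly agree with, and the population soon splits into tight clusters without any algorithm pushing it there.

```python
import random

# A toy bounded-confidence model of an echo chamber: users move toward a
# random peer only if they already roughly agree. Parameters are illustrative.

def step(opinions, tolerance=0.3, pull=0.2):
    """One round of interaction across the whole population."""
    updated = opinions[:]
    for i, mine in enumerate(opinions):
        peer = random.choice(opinions)
        if abs(peer - mine) < tolerance:        # the "like"/"share" filter
            updated[i] = mine + pull * (peer - mine)
    return updated

opinions = [random.uniform(-1, 1) for _ in range(100)]   # -1 .. 1 spectrum
for _ in range(200):
    opinions = step(opinions)
print(min(opinions), max(opinions))   # clusters form; extremes rarely meet
```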
It is obvious that the phenomenon of information bubbles and echo chambers requires empirical study, given how deeply the Internet and communication technologies have entered our personal and social lives. In this situation, only objective knowledge can save us both from exaggerated fears and from careless dismissal of the problem. Praemonitus – praemunitus. Forewarned is forearmed.
Booker, Cory (2019). Booker, Wyden, Clarke Introduce Bill Requiring Companies to Target Bias in Corporate Algorithms [Press release]. Senate.gov. Retrieved from https://www.booker.senate.gov/?p=press_release&id=903. Accessed on 26.01.2021.
Cambridge Dictionary (n.d.). Filter bubble. Retrieved from https://dictionary.cambridge.org/ru/%D1%81%D0%BB%D0%BE%D0%B2%D0%B0%D1%80%D1%8C/%D0%B0%D0%BD%D0%B3%D0%BB%D0%B8%D0%B9%D1%81%D0%BA%D0%B8%D0%B9/filter-bubble. Accessed on 22.01.2021.
Fondation Descartes (2020). Filter Bubbles and Echo Chambers. Retrieved from https://www.fondationdescartes.org/en/2020/07/filter-bubbles-and-echo-chambers/. Accessed on 26.01.2021.
Hosanagar, Kartik, Miller, Alex (2020). Who Do We Blame for the Filter Bubble? On the Roles of Math, Data, and People in Algorithmic Social Systems. In K. Werbach (Ed.), After the Digital Tornado: Networks, Algorithms, Humanity (pp. 103-121). Cambridge: Cambridge University Press. Retrieved from https://www.cambridge.org/core/search?filters%5BauthorTerms%5D=Alex%20P.%20Miller&eventCode=SE-AU. Accessed on 26.01.2021.
Pariser, Eli (2011). The Filter Bubble: What the Internet is Hiding from You. Penguin UK, 304p.
Schwab, Pierre-Nicolas (2017). Academic research debunks the myth of filter bubbles. Retrieved from https://www.intotheminds.com/blog/en/academic-research-debunks-the-myth-of-filter-bubbles/. Accessed on 21.01.2021.
Susser, Daniel, Roessler, Beate, Nissenbaum, Helen (2019). Technology, autonomy, and manipulation. Internet Policy Review, 8(2). Retrieved from https://policyreview.info/articles/analysis/technology-autonomy-and-manipulation. Accessed on 21.01.2021.
Note: The views expressed in this blog are the author’s own and do not necessarily reflect the Institute’s editorial policy.
Gulnar Ermuratovna Nadirova, Professor, graduated from the Oriental Faculty of Leningrad State University; in 1990 she defended a thesis on Algerian literature at the Moscow Institute of Oriental Studies, and in 2006 a doctoral thesis on modern Tunisian literature at the Tashkent Institute of Oriental Studies.