
Sociodynamics
Reference:

Freedom of choice and network "echo effects" of information consumption

Leontyev Gleb Dmitrievich

PhD in Philosophy

Associate Professor, Department of General Philosophy, Kazan (Volga region) Federal University

35 Kremlevskaya str., Kazan, 420008, Russia, Republic of Tatarstan

leontyeval@icloud.com
 
Leontieva Ludmila Stanislavovna

PhD in Philosophy

Associate Professor, Department of State and Municipal Administration, Kazan (Volga Region) Federal University

420008, Russia, Republic of Tatarstan, Kazan, Kremlevskaya str., 18

lsl3@yandex.ru

DOI: 10.25136/2409-7144.2024.1.68951

EDN: KWSJOI

Received: 12-11-2023

Published: 02-02-2024


Abstract: The mechanisms of media personalization and the separation of network communication along lines of like-mindedness stimulate the formation of "echo chambers" and "filter bubbles". The social network phenomenon denoted by these metaphors is the subject of this study, and the two concepts are treated as related but not identical. The purpose of the study is to identify the factors that affect the effectiveness of filtering algorithms and the user's ability to make an informed choice and to self-organize in the process of information consumption. The causes and consequences of selective strategies of online information consumption are analyzed on the basis of the communicative-activity approach, J. Baudrillard's theory of virtual reality, and the digital media concepts of C. Sunstein, E. Pariser and R. Fletcher. The results of foreign and Russian studies of communication practices on various social platforms serve as the empirical basis. The research approaches presented in the scientific literature allow us to focus on the technological and the logical-semantic perspectives of analyzing stable forms of network communication. According to the authors, the interdependence of filtering algorithms and the value dominants of information consumption leaves the user a chance to choose his own "consumer basket" independently. The duality of receiving personalized content is emphasized: on the one hand, convenience and saved effort; on the other, the one-dimensionality of the picture of the world inside the information bubble. On this basis, freedom of choice is characterized as both the right to active choice and the right not to choose, consciously delegating it to neural network filters. In conclusion, the authors identify internal and external network factors that reduce the effectiveness of filtering algorithms: the interpretation of user behavior by artificial intelligence; the functioning of rational confrontational communication; the opportunistic conceptualization of echo effects; and the availability of means of conscious counteraction. Incentives for reasonable information consumption and technological and cognitive ways of protecting and rationalizing user behavior are highlighted.


Keywords: information, choice, information consumption, filtering algorithms, confirmation bias, filter bubble, echo chamber, communication, online platforms, virtuality

This article is automatically translated.

 

The dynamics of digitalization and the mediatization of society generate both technological and socio-communicative effects. Online social media platforms are technologies for virtualizing social reality. "Humanity," according to J. Baudrillard, "decided to clone its physicality and its possessions in another universe, different from the previous one," a universe in which "simple informativeness, calculability and computability reign" [1]. Virtuality simulatively replaces reality, intensively absorbing the life-world of everyday existence. The speed, scale and depth of the penetration of information through social network interactions are comparable to the virulence of an infectious agent. After the recent pandemic, and in line with epidemiological analogies, infection with viral information is associated both with the immunological resistance of the organism and with the pathogen's ability to overcome protective barriers. That is, the "virulence" of virtual objects depends both on the properties and "packaging" of information products and on the susceptibility of the individual and the community to mass information.

 The ability of virtuality to create the illusion of its own reality can be traced in the example of the client-oriented formation of a "unique information universe for each of us" [2]. This figurative formulation belongs to the American political Internet activist Eli Pariser and refers to the term he introduced, the "filter bubble": a bubble of algorithms and filters, an information bubble. Externally, the information bubble looks like "an ecosystem of familiar applications, selected platforms and news aggregators, as well as a set of 'subscriptions' and personal 'representations' (pages) on social networks" [3, p. 13]. Through selective information delivery, these information intermediaries save the human psyche from information overload, but the same selection algorithms narrow the field of view, supplying information that reinforces the individual's preferences and confirms his entrenched ideas. As a result, an effect arises akin to an acoustic wave reflected from the walls of a chamber, which the recipient accepts with satisfaction on the principle of being "quietly with oneself" and with one's own kind. Professor Cass Sunstein warned about this effect more than two decades ago in his book "Echo Chambers" [4]. The intensive development of social network interactions and information redundancy, coupled with information consumerism, make the side effects of online communication a pressing problem. The purpose of this article, therefore, is to identify the factors that affect the effectiveness of content filtering algorithms on social platforms and the user's ability to make an informed choice in the process of information consumption.
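
The narrowing effect described above can be made tangible with a toy simulation, a minimal sketch in Python; the content categories, the weights and the click model are illustrative assumptions, not the algorithm of any real platform:

import random
from collections import Counter

CATEGORIES = ["politics", "science", "sport", "culture", "technology"]

def recommend(preferences, k=3):
    # Rank categories by accumulated interest and return the top-k "feed".
    ranked = sorted(CATEGORIES, key=lambda c: preferences[c], reverse=True)
    return ranked[:k]

def simulate(sessions=50, feed_size=3, seed=1):
    random.seed(seed)
    preferences = Counter({c: 1.0 for c in CATEGORIES})  # start with no bias
    for _ in range(sessions):
        feed = recommend(preferences, k=feed_size)
        clicked = random.choice(feed)   # the user engages with one suggested item
        preferences[clicked] += 1.0     # the click feeds back into the ranking
    return preferences

if __name__ == "__main__":
    print(simulate())  # after a few dozen sessions one or two categories dominate

After a few dozen iterations of this loop the feed is dominated by the categories that were clicked early on, while the rest are no longer surfaced at all; this feedback is the "wall" of the bubble that the metaphor points to.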

Echo techniques in the psychology of communication appeared long before the era of social networks; they are based on the effect of reinforcing influence through verbal and non-verbal repetition. In online communication, the echo effect is created by selective strategies of network behavior, which rest on the human need for self-identification, for acceptance and external confirmation of one's own feelings and reflections, and for increasing one's strength and resources through belonging to a community. The echo chamber is therefore a "complex phenomenon of the spontaneous formation of stable forms of network communication" [5], i.e. a situationally conditioned space of like-mindedness. According to the Global Digital 2023 report, 64.4% of the world's population has Internet access, 60% use social networks, and the average time spent on social networks amounts to about 4 out of every 10 minutes spent online. 30% of respondents named social networks as their source of news on the Internet, 25% named search engines, and 8% named aggregators. The influence of intermediary platforms is growing, while media sites and applications are used less and less: 32% in 2018 versus 22% in 2023 [6]. Compared with direct access to familiar news sites, the platforms' algorithms increase the diversity of the sources of information received. At the same time, search engines are tuned to the preferences of users, who in turn make their choices on the basis of the algorithms' suggestions. The user "finds himself trapped in a loop of the self," discovered by Eli Pariser more than a decade ago: "Your identity shapes your media consumption, but then the media shapes your beliefs and areas of interest" [2, p. 139]. The mechanisms of media personalization are combined with basic cognitive distortions such as "selective receptivity" and "confirmation bias", which migrated from the offline environment to the online space together with the media subjects, since they allow users to avoid the differences that cause psychological discomfort.

A quantitative assessment of the importance of the "belief confirmation" motive in shaping the demand for news was given by American researchers (F. Chopra, I. Haaland, C. Roth) [7] on the basis of several large-scale experiments with voters. People's willingness to subscribe to a newsletter was studied depending on how the news content was presented: from a politically impartial approach with reliable facts to a biased, politically slanted one (right- or left-leaning). The key preference parameters showed that respondents reduce their demand for "biased news" only if that type of bias is incompatible with their own political convictions. Noting the capacity of biased news to increase political polarization and the growth of populism, the researchers pointed to a consumer trade-off: the equivalence of the motive of information reliability and the motive of "confirmation bias" [7, p. 29]. Consequently, the objectivity, evidential value and accuracy of the social facts offered to the audience do not guarantee media success. The basis of modern informational and economic relations, the so-called "surveillance" (Shoshana Zuboff) [8], "platform" (Nick Srnicek) [9] or "communicative" (Jodi Dean) [10] capitalism, is marketing adjustment to a specific consumer and the optimization of the technological chain that turns a journalistic product into a sought-after commodity. The delivery of individually relevant information, that is, personalization or targeting, is feasible thanks to the user's digital footprint. For example, analyzing likes and dislikes and taking into account previous actions with similar content, posts and search history make it possible to create a digital portrait that includes not only a list of preferences but also predictive scenarios of the media consumer's behavior.
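
The logic of such a "digital portrait" can be illustrated with a hedged sketch: behavioural signals are aggregated into a simple preference vector and new items are scored against it. The topic labels, weights and example data below are invented for illustration and do not reproduce any platform's real model:

from collections import defaultdict

def build_profile(likes, dislikes, searches):
    # Combine behavioural signals into topic weights (likes > searches > dislikes).
    profile = defaultdict(float)
    for topic in likes:
        profile[topic] += 2.0
    for topic in searches:
        profile[topic] += 1.0
    for topic in dislikes:
        profile[topic] -= 2.0
    return profile

def score_item(profile, item_topics):
    # Predicted relevance of a new post: the sum of its topics' weights.
    return sum(profile[t] for t in item_topics)

profile = build_profile(
    likes=["football", "travel"],
    dislikes=["opera"],
    searches=["elections", "football"],
)
candidates = {
    "match report": ["football"],
    "opera premiere review": ["opera", "culture"],
    "election explainer": ["elections", "politics"],
}
ranked = sorted(candidates, key=lambda name: score_item(profile, candidates[name]), reverse=True)
print(ranked)  # items confirming existing interests rise to the top

Even in this crude form, the score both reflects and reinforces the recorded preferences, which is why the same data that enable convenience also enable the bubble.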

The peculiarities of consumer behavior caused by the psychological mechanisms of "bias" and "selectivity" considered above consist not only in the consumption of thematically and value-semantically defined content, but also in the orientation of social interactions on an online platform. Taking this specificity into account, researchers M. Cinelli, G. de F. Morales, A. Galeazzi and M. Starnini studied the communications of more than one million active users on four social media platforms, in total more than one hundred million unique content items on controversial, socially significant issues. The presence or absence of echo chambers was assessed according to two criteria: homophily in interactions on a specific topic and bias in the spread of information from like-minded sources. The results of the study, published in 2021, showed that the degree of segregation in news consumption varies from platform to platform, but in general "the aggregation of users in homophilic clusters dominates online interactions" [11]; that is, social media platforms and news feed algorithms contribute to the emergence of echo chambers.
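
A simplified version of the first criterion, homophily in interactions, can be sketched as follows; the interaction data, the "leaning" scale and the similarity threshold are illustrative assumptions, and the cited study uses far richer measures:

def homophily(interactions, leanings, tolerance=0.5):
    # interactions: list of (user, peer) pairs; leanings: user -> stance in [-1, 1].
    # Returns the share of interactions that stay between users with a similar stance.
    same, total = 0, 0
    for user, peer in interactions:
        if user in leanings and peer in leanings:
            total += 1
            if abs(leanings[user] - leanings[peer]) <= tolerance:
                same += 1
    return same / total if total else 0.0

leanings = {"a": -0.9, "b": -0.8, "c": 0.9, "d": 0.7}
interactions = [("a", "b"), ("b", "a"), ("c", "d"), ("a", "c")]
print(homophily(interactions, leanings))  # 0.75: most exchanges stay within like-minded pairs

Values close to 1 indicate echo-chamber-like clustering; values closer to the share expected under random mixing indicate that users still interact across the divide.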

At first glance, the filter bubble and the echo chamber are identical concepts. However, they are formed in different ways, as Dr. Richard Fletcher points out: "echo chambers can be the result of filtering or other processes, whereas a filter bubble is the result of algorithmic filtering" [12]. Whereas in the first case users consciously filter information themselves, in the second this task is performed by the platforms' algorithms; that is, the criterion distinguishing the two concepts is the method of selection. The approach proposed by the Oxford researcher of digital news consumption models can be defined as socio-technological. In the Russian scientific literature, alongside the socio-technological perspective, a cognitive-communicative, logical-semantic perspective of analysis has been proposed. Professor V. A. Bazhanov notes that "the difference between the two communicative spaces lies in a certain tolerance for dissent in the case of echo bubbles and an intolerant attitude towards it in the case of echo chambers" [13, p. 155]. If "epistemic echo bubbles" are formed "due to the similarity of people's views (in the broad sense, 'on life', including political views) and the emotions accompanying these views" [13, p. 152], then the "echo chamber" is a closed communicative formation "strengthening and expanding a kind of epistemic control over the state of minds and forming special structures for countering and exposing the authoritative opinions of the opposite side" [13, p. 156]. When these approaches to distinguishing "echo chambers and bubbles" are schematically superimposed, it turns out that the machine algorithm that forms "echo bubbles" is more tolerant than the consciously constructed filters of "echo chambers", with their one-sided distortion of the perception of reality, their dogmatism and their intra-group cognitive attitudes. Despite the conditional character of this mental operation, its result emphasizes the importance of the user's own thoughts and actions, contrary to the idea of the omnipotence of filtering algorithms. If the user's attitudes and selective communications are able not only to shape data filtering algorithms but also to reinforce their work, then we may assume that the user is also able to neutralize or bypass the filters.

Any machine filtering algorithms or psychological traps are manipulations that suppress, but do not cancel, freedom of choice. A meaningful choice involves the activity of the mind on the basis of private interest, expediency and logical ordering, taking into account real conditions and the relationship between goals, means and the external environment. The ability of modern users to rationalize their behavior in the networked media environment and thereby reduce the effectiveness of algorithmic filtering is due to the following factors:

Firstly, the predictive abilities of artificial intelligence can fail because of difficulties in interpreting user behavior: inconsistency of views and preferences, argumentative tricks, logical errors, Aesopian language, the "social automatism" [14] of network interactions. To recognize the "pragmatics of human information manipulation" [15], a machine algorithm would benefit from competencies that include the Popperian principle of falsification, the refutation of one's own assumptions. Differences in users' status and role self-presentation arise because users are not confined to a single social platform: they create accounts on different social networks, websites and online services, or access materials and discussions on open forums without registering. "The content promoted or hidden by platform algorithms is mostly created by people and is therefore not free of contradictions" [16]. For example, the clickbait phenomenon worsens the user experience, yet the author of such content gains a higher position in search results because of the system's inadequate reaction; that is, the algorithms fail to detect the absence of a semantic connection between the headline and the content of the material. In addition to interpreting users' "subjectivisms", algorithms need "additional training" to understand the social context and, as Professor Y. Harari notes, to correct the mistakes made by programmers at the level of "subconscious biases" [17].

Secondly, adopting the social and value attitudes of a network community does not entail uniformity of thought and action among all its members in every sphere of network existence. For example, a study of one hundred and fifteen virtual communities of the Russian social network VKontakte allowed the experimenters to conclude that not only echo chambers but also rational confrontational communication function there. "The blocking of critical accounts did not exceed 30% of the number of communities included in the sample for each of the ideological groups" [18, p. 82].

Thirdly, the conceptualization of negative network effects, according to media theorist Axel Bruns, may have an opportunistic motivation on the part of some media politicians and techno-pessimists [16, pp. 16-18]: either in order to tighten the regulation of social media, or in order to dampen the independent activity of users who come to believe in their strict dependence on the algorithmic services of online platforms.

Fourthly, social technologies for influencing mass consciousness have existed since ancient times, and so have the means of consciously counteracting them. Preserving "human agency" to the extent the user needs is achievable through the user's own efforts. This goal is served not only by information technology but also by cognitive and psychological methods of protection.

On the information technology side, the self-protection measures available to the average user in the algorithmic media space include disabling recommendations in video hosting settings; disabling applications' access to the microphone and giving up the voice assistant (audio privacy); turning off the automatic sorting of publications on social networks and the personalization function in the search service; and systematically clearing the history of search queries and viewed video content.

By focusing on the "degree of agency" demanded by the user himself, we emphasize not only the negative but also the positive sides of personalized content filtering. The duality is explained by the convenience of receiving recommendations selected by algorithms. A machine hint, or, in R. Avenarius's terminology, work on the principle of the least expenditure of effort, is optimal in a standard situation, for example, when searching for reference or descriptive information. In network practice, the independent choice of consumed content (for example, by subscription) and following the recommendations of algorithms do not, in our opinion, exclude each other, but complement each other and are applied according to the situation. Default settings are a convenient option in the "architecture of choice" [19], says Professor Cass Sunstein, giving the example of a navigator whose suggestions can be rejected if desired. In his book "The Illusion of Choice" (the Russian edition of "Choosing Not to Choose"), the professor, while defending the right to active choice, at the same time regards the "choice not to choose" as one of the ways free will can manifest itself. In the case of conformist behavior, decision-making is delegated to experts, politicians and algorithms. Such "default settings" are akin to the psychological mechanism of stereotyping, when the mode of "economical thinking" is switched on and one can act according to a template, relying on indirect experience and ready-made, simplified ideas.

The user's need for an in-depth analysis of a specific problem requires rational behavior in the information sphere, as in any other area of life. Information technology methods of protection, combined with cognitive and psychological methods, help a person preserve his own subjectivity. A formed echo chamber can be significantly reduced or abolished only by "the disappearance of the very event that led to its appearance" [13, p. 159]. The risk of ending up in an "echo chamber" is minimized when a critical attitude to information is activated from the outset, since the level of confirmation bias is reduced. This is evidenced by the results of an experiment by American researchers [20] who studied the impact of fake news on social networks. The participants, undergraduate students of Indiana University's Kelley School of Business, had to express their attitude to news materials. Articles that coincided with their own point of view inspired trust ("confirmation bias"), but when asked to create a so-called "user rating" of the articles, the participants took a critical position. The task was to publicly assess the reliability of information about which the participants had no personal experience, which led to doubt and caution in the information exchange. Consequently, the researchers concluded, news does not go viral when the critical filter is activated, since that filter reduces the demand for biased content.

 A critical attitude to information is associated, on the one hand, with the degree of accessibility and diversity of sources and, on the other hand, with the user's own openness to messages that do not fit into the idealized picture of individual perception. Breadth of views and a willingness to understand the opponent's position contribute to the conscious acceptance of alternative arguments through doubt, comparative analysis, the search for similarities and the comprehension of the essence of differences. At the same time, the argumentative soundness of a conclusion does not rule out a state of psychological discomfort, ranging from a feeling of one's own cognitive insufficiency to a crisis of self-identification. Consistently implementing the intention to go beyond the echo-resonance bubble activates the user's internal locus of control, changing the perspective of self-construction and the content of information needs.

Incentives for reasonable information consumption arise on the basis of: 1) a lack of available information for carrying out certain actions; 2) contradictions between the information at hand and information coming from outside; 3) contradictions between media news, theoretical or ideological innovations and everyday practice. As a result of becoming aware of these inconsistencies, a "search image" of the information need is formed, i.e. an idea of one's own information gaps relative to the amount of information required to solve the problem. An independently identified direction of information search leads to the desired goal, bypassing externally imposed, algorithmized "landmarks". Given a hypothetical transparency of the filters of online platforms, the user would be able to interact with the algorithms under his own control, setting a conscious trajectory of his own development and search. In both cases, the absence of informational determination by algorithms takes the user beyond the boundaries of an informationally closed community.

Thus, the collection, storage, dissemination and monetization of information about possible consumer behavior stimulate the formation of filter bubbles. The trend towards the separation of online communities according to the content they consume is confirmed by research experiments on various social platforms. The capacity of artificial intelligence for self-learning increases the attractiveness and accuracy of personalized recommendations. However, it is premature to speak of the total dominance of self-contained echo chambers or of the impenetrability of the boundaries of network communities. Communication and reasonable information consumption, together with awareness of one's assessments and conclusions, increase the adequacy of the perception of reality in all its contradictory diversity and reduce the likelihood of becoming entrenched in a comfortable information bubble.

Delegating decision-making to artificial intelligence, i.e. "choosing not to choose", or preserving and strengthening one's own subjectivity in spite of smart algorithms and neural network filters, is the user's own conscious, situational choice, his self-organization in the process of responsible information consumption.

References
1. Baudrillard, J. (2006). Passwords. From fragment to fragment. Translated from French N. Suslov. Ekaterinburg: Publishing House «U-Faktoriya».
2. Pariser, E. (2012). The Filter Bubble: What the Internet Is Hiding from You. Translated from English A. Shirikova. Moscow: Publishing House «Alpina Business Books».
3. Gurov, F. N. (2019). The Experience of social and philosophical explanation of the problem of “fakes” and “filter bubbles” on the web. Problems of modern education, 3, 9-20. Retrieved from http://www.pmedu.ru/index.php/ru/2019-god/nomer-3
4. Sunstein, C. R. (2001). Echo chambers. Princeton: Princeton University Press.
5. Zamkov, A.V. (2019). The Echo Chamber Effect as a Manifestation of the Principle of Self-Similarity on Social Networks. Mediascope, 2. doi:10.30547/mediascope.2.2019.7
6. Digital 2023: Global Overview Report. Retrieved from https://datareportal.com/reports/digital-2023-global-overview-report
7. Chopra, F., Haaland, I. & Roth, Ch. (2022). The Demand for News: Accuracy Concerns Versus Belief Confirmation Motives. CESifo Working Paper, 9673. doi:10.2139/ssrn.4082578
8. Zuboff, S. (2019). The Age of Surveillance Capitalism. New York: PublicAffairs.
9. Srnicek, N. (2019). Platform capitalism. Translated from English M. Dobryakova. Moscow: Publishing House «HSE».
10. Dean, J. (2005). Communicative Capitalism: Circulation and the Foreclosure of Politics. Cultural Politics, 1(1), 51-74.
11. Cinelli, M., Morales, G. F., Galeazzi, A., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences, 118(9). doi:10.1073/pnas.2023301118
12. Fletcher, R. (24th January 2020). The truth behind filter bubbles: Bursting some myths. Reuters Institute for the Study of Journalism University of Oxford. Retrieved from https://reutersinstitute.politics.ox.ac.uk/news/truth-behind-filter-bubbles-bursting-some-myths
13. Bazhanov, V. A. (2022). Cognitive mechanisms in the era of information: «echo-bubbles» and «echo-chambers». Philosophy Journal, 15(4), 152-164. doi:10.21146/2072-0726-2022-15-4-152-164
14. Safina, A. M., Leontyev, G. D., Gaynullina, L. F., Leontieva, L. S., & Khalilova, T. V. (2018). Dialectics of freedom and alienation in the space of the internet. Revista ESPACIOS, 39(27), 8. Retrieved from http://www.revistaespacios.com/a18v39n27/18392708.html
15. Leontyev, G. D., & Leontieva, L. S. (2023). Цифровая техно-демократия как постнеклассическая практопия [Digital techno-democracy as a post-non-classical praktopia]. Sociodynamics, 4, 1-10. doi:10.25136/2409-7144.2023.4.40407
16. Bruns, A. (2023). Are Filter Bubbles Real? Translated from English A. Arkhipova. Moscow: Publishing House «HSE».
17. Harari, Yu. N. (2020). 21 Lessons for the 21st Century. Moscow: Publishing House «Sinbad».
18. Martyanov, D. S., & Martyanova, N. A. (2019). Селективная модерация в условиях виртуальной публичной сферы [Selective moderation in conditions of virtual public sphere]. Sociodynamics, 12, 74-85. doi:10.25136/2409-7144.2019.12.31759
19. Sunstein, C. R. (2016). Choosing Not to Choose: Understanding the Value of Choice. Moscow: Publishing House «Alpina Publisher».
20. Moravec, P., Kim, A., Dennis, A. R., & Minas, R. (October 22, 2018). Do You Really Know If It’s True? How Asking Users to Rate Stories Affects Belief in Fake News on Social Media. Kelley School of Business Research. Paper 18-89. doi:10.2139/ssrn.3271057

First Peer Review

Peer reviewers' evaluations remain confidential and are not disclosed to the public. Only external reviews, authorized for publication by the article's author(s), are made public. Typically, these final reviews are conducted after the manuscript's revision. Adhering to our double-blind review policy, the reviewer's identity is kept confidential.

In the peer-reviewed article "Freedom of choice and network 'echo effects' of information consumption", the subject of research is the network "echo effects" of information consumption. The purpose of the study is not explicitly stated, which makes it difficult to grasp the author's idea. The research methodology is based on a comparison of the socio-technological approach (which explores the algorithmization of the delivery of news content) and the cognitive-communicative one (which offers a logical-semantic perspective on echo effects). This allows the work to show that the concepts of "echo chamber" and "filter bubble" are not identical. Today, understanding the processes associated with the development of information and communication technologies provokes heated discussions in the scientific community and requires not only analytical description but also deep philosophical reflection. It is philosophy, realizing its integrative function, that evaluates modern digital and information and communication technologies and formulates the main methodological principles for understanding the processes of informatization, digitalization, virtualization and mediatization. The publication offers the author's version of understanding the echo effects that arise from the simplification and standardization of communication practices and from political and marketing technologies for managing the subject's attention. The paper shows the difference between two phenomena arising in the virtual space: the "echo chamber" and the "filter bubble". If in the first case users consciously filter information, then in the second this task is performed by the platforms' algorithms, i.e. the criterion distinguishing the concepts under consideration is the method of selection. Noteworthy is the conclusion that "Any machine filtering algorithms or psychological traps are manipulations that suppress, but do not cancel, freedom of choice." According to the author, delegating decision-making to artificial intelligence is also a choice ("the choice not to choose"). But it is more important, despite smart algorithms and neural network filters, to preserve and strengthen one's own subjectivity, which contributes to more responsible information consumption. The study is characterized by overall consistency and literacy of presentation. The content meets the requirements of a scientific text. The article shows a good level of philosophical reflection on the "echo effects" of information consumption. However, it is questionable whether it is necessary to use some purely medical terms without quotation marks, for example, "virulence". Further, it is worth drawing the author's attention to the style of presentation, since in some cases the two parts of a complex sentence do not agree with each other. For example: "In the virtual, 'simple informativeness, calculability, and computability reign,' this is a special kind of information that absorbs the life world of everyday life, simulatively replacing reality." It is not clear from this phrase what kind of special type of information is meant. A little earlier, the author writes: "Online social media platforms are technologies for virtualizing the social, and society, according to J. Baudrillard, has an 'undisguised attraction to the virtual and related technologies'." It seems to follow from this phrase that the author equates the social with society. The bibliography of the publication is generally sufficient.
It includes 20 publications in both Russian and foreign languages. Thus, the appeal to the main opponents in the area under consideration is fully present. However, the bibliography is not fully formatted in accordance with the requirements of the journal. In addition, the need to quote in the original language is questionable. Conclusion: the article "Freedom of choice and network 'echo effects' of information consumption" has scientific and theoretical significance. It will be of interest to specialists in the field of the philosophy of culture. The work can be published after the bibliography is put in order and some editorial revisions are made to the text.

Second Peer Review


The reviewed article is a compact (0.5 author's sheets, not counting the bibliography) but rather informative study of certain features of the interaction between a person and the virtual world, in the course of which there is a danger of influencing the user's ability to make an informed choice of goods, services and information. It cannot be said that addressing this topic looks innovative today; on the other hand, the author considers precisely those aspects of the problem that have so far been insufficiently analyzed in publications devoted to the social effects of the spread of information technologies. In particular, the author draws attention to the difference between the concepts of "filter bubble" and "echo chamber", pointing out that if "in the first case, users consciously filter information, then in the second case, this task is performed by platform algorithms, i.e. the criterion for the difference of the concepts under consideration is a way of selecting" information or commercial offers. It should be noted that the author handles the literature used very expertly. Unfortunately, references to various sources are often given in today's publications to demonstrate erudition, without adding new information or expanding the range of readers' ideas about the issue under consideration. In this case, all references and citations are relevant and informative, which indicates the professionalism of the author and his ability to select only those fragments of the analyzed publications that are important for considering the problem. Although the reviewed article will undoubtedly find its interested reader, it could be improved on several points. First of all, it is advisable to structure the text: the article has neither an introduction nor a conclusion, which does not aid the perception of its content. An even more important shortcoming of the presented article is that the author, in essence, does not address the question of the impact that the effects of human presence in the virtual world he describes have on social relations. He talks about the psychological effects, about the dangers that can arise for a person immersed in the world of the Internet, but what happens to social relations, how does the system of social connections change? In short, it would be desirable to present more clearly the socio-philosophical components of the effects of human interaction with the virtual environment described in the article. It is also necessary to correct some technical errors; for example, there are extra commas in the text ("denotes, the term he introduced, 'bubble filter'", "selection algorithms narrow the field", etc.). Despite the comments made, it seems correct not to send the article for revision but to recommend it for publication, since the author's obvious competence gives hope that the necessary amendments can be made in a working manner.