Philosophy and Culture

Technosocial autonomy: the synthesis of Gilbert Simondon's processuality and Niklas Luhmann's systems theory

Sayapin Vladislav Olegovich

ORCID: 0000-0002-6588-9192

PhD in Philosophy

Associate Professor; Department of History and Philosophy; Tambov State University named after G.R. Derzhavin

392000, Russia, Tambov region, Tambov, Internatsionalnaya str., 33

vlad2015@yandex.ru

 
Kiryushin Alexey Nikolaevich

ORCID: 0000-0001-8614-8353

Doctor of Philosophy

Associate Professor; Department of Aviation Tactics; Military Training and Research Center of the Air Force 'Air Force Academy named after Professor N.E. Zhukovsky and Yu.A. Gagarin' (Voronezh)

54A Starye Bolshevikov str., Voronezh, 394064, Russia

elrisha_@rambler.ru

DOI: 10.7256/2454-0757.2025.5.74324

EDN: TEWXRO

Received: 05-05-2025

Published: 31-05-2025


Abstract: The modern world is undergoing a radical transformation driven by digital technologies, which increasingly exhibit traits of autonomy: algorithms governing social networks, neural networks generating content, or robotic systems making decisions – all of them function according to an internal logic that cannot be reduced to human intentions. This growing independence of technology poses fundamental questions for society: who or what controls the techno-social reality? How can we preserve human agency in a world where technology gains its own "intellect"? In this regard, two approaches – the procedural philosophy of technology by Gilbert Simondon and the systems theory of Niklas Luhmann – remain insufficiently integrated and researched, despite their complementary potential. Simondon emphasizes individuation and transduction, revealing technologies as dynamic processes interwoven with the development of humanity and society. Luhmann, describing technology through the lens of autopoietic, self-referential systems, demonstrates their capacity for self-organization and operational closure. The methodological foundation of the article is comparative analysis and theoretical synthesis. We compare the key concepts of Simondon and Luhmann, identifying points of intersection and contradiction. Digital platforms are examined as examples where the autonomy of algorithms (Luhmann) and their role in shaping user practices (Simondon) are most vividly manifested. The goal of this article is to propose a synthesis of these approaches, bridging the gap between procedural and systemic understanding of techno-social autonomy. We argue that the integration of Simondon's and Luhmann's ideas allows for: 1) explaining how technologies simultaneously evolve through interaction with society (Simondon) and function as closed systems (Luhmann); 2) revealing the dialectic of human and technological agency in the context of digitalization; 3) creating a basis for ethical reflection on autonomous technologies, avoiding the extremes of techno-optimism and determinism. The scientific novelty of the work lies in overcoming disciplinary boundaries: Simondon's philosophical depth enriches Luhmann's structural analysis, while systems theory lends sociological specificity to processuality. This synthesis paves the way for a more holistic understanding of techno-social reality not as a confrontation between humans and machines, but as a complex symbiosis, where the autonomy of technology becomes both a condition and a challenge for a new stage of social evolution.


Keywords: technosocial, technology, systems theory, autonomy, autopoiesis, self-organization, individuation, concretization, transindividual, transduction


Introduction

The first quarter of the 21st century has witnessed an unprecedented growth in the autonomy of digital technology: algorithms, artificial intelligence, and digital platforms increasingly operate according to an internal logic independent of direct human control. In other words, digital technologies equipped with artificial intelligence have ceased to be passive tools. Today they not only shape our daily consumer habits and make decisions on the battlefield; they have become active agents capable of learning, adapting, and influencing social structures such as the economy, politics, culture, and law. The phenomenon of "technosocial autonomy" therefore cannot be understood within outdated paradigms that reduce technology to a passive instrument. Drawing on the systems theory of Niklas Luhmann (1927-1998) [1-8] and the philosophy of technology of Gilbert Simondon (1924-1989) [9-14], we offer a new perspective on how the autonomy of technology rebuilds social structures, creating hybrid forms of reality in which human and machine coexist in dynamic interaction.

Niklas Luhmann, in his systems theory, describes society as a set of autopoietic systems (self-referential and operationally closed entities) that reproduce themselves through internal, recursive communications. In this logic, technologies can be considered a subsystem that functions according to its own binary code (for example, functionality/failure) and is structurally coupled with the economy, law, or politics. The autonomy of technologies here is not a metaphor but a consequence of their capacity for self-organization: even the algorithms of social networks act as "black boxes" whose decisions are opaque to external observers. Gilbert Simondon, by contrast, emphasizes the processual nature of technology. For him, technologies are not static objects but participants in individuation, a process of formation in which the human being and the technical object evolve together. According to Simondon, autonomy does not arise in isolation but through transduction, the overcoming of contradictions between technical, social, and individual dimensions. The development of artificial intelligence, for example, is not merely the creation of tools but the formation of new relations that redefine human identity itself. At the same time, the growth of technological autonomy radically transforms traditional institutions: the economy, politics, culture, religion, law, and so on. High-frequency trading algorithms (akin to Luhmann's autopoietic systems) reshape markets, while platforms like Uber, embodying Simondonian transduction, blur the boundaries between labor, capital, and digital infrastructure. Likewise, social networks governed by artificial intelligence become autonomous actors whose decisions influence elections, yet their internal logic remains beyond the control of democratic institutions. At the same time, as Simondon shows, users and algorithms jointly form new modes of political mobilization through the process of individuation.

Thus, the autonomy of technologies is a transductive process: they simultaneously grow out of social needs and begin to dictate their own conditions, creating recursive feedback loops. The autonomy of technology does not destroy social structures but transforms them into technosocial ecosystems through "progressive ascent without negation," in which people, algorithms, and institutions coexist in complex interaction. Understanding this transformation is not an academic abstraction but a necessary condition for the survival of democracy, justice, and human dignity in the digital age.

Luhmann's Systems Theory: Technologies as autonomous autopoietic systems

Luhmann himself, admittedly, does not consider technology one of the main functional systems of our time. In his view, it does not play as important a role as law, the economy, politics, culture, or other better-known functional systems. Nevertheless, given the general character of Luhmann's systemic model of society and the specifics of technology, we can explore the applicability of his theory to the philosophy of technology without having to reconstruct a (non-existent) Luhmannian theory of technological autonomy. Our goal is to deepen the understanding of autonomy by going beyond the author's original ideas. This is supported by certain statements in which Luhmann, speaking about the Internet and other modern communication technologies, notes that "... a far-reaching consequence of the evolution of information dissemination technologies and related media is operations subject to spatial integration. By integration, I mean limiting the degree of freedom of systems" [2, p. 188]. Technology works in such a way that it excludes from its operations "... a really living individual, a subject generating meaning" [7, p. 225]. In this regard, it can be argued that some technologies have the capacity to select, playing an active role in constructing technosocial reality. Thus, according to Luhmann, the function of the mass media system can be formulated not only as interpreting reality but as making sense of reality both for other functional systems and for psychic systems (humans) [5]. The concept of "autonomy" can accordingly be defined as the capacity for self-determination of an operationally closed and self-referential system.

It should be noted that although Luhmann characterizes the mass media as a genuinely innovative system, he does not consider technology as a whole to be autonomous. Luhmann notes society's growing dependence on technologies such as information and communication technologies, yet he regards notions like "technosocial society" as exaggerated. In his view, a modern functionally differentiated society could not exist without technology, but the two concepts should not be equated [2, p. 321]. Some technologies, according to Luhmann, do nonetheless point toward systemic autonomy. Writing and printing, in particular, had an enormous impact on the acceleration of social evolution, and Luhmann believes that computers play a similar role. As the German social researcher Dirk Baecker notes: "Computers are an 'alternative' to the structural coupling between communication and consciousness, although one that increases rather than reduces the internal complexity associated with communication" [15, p. 30]. Modernity is therefore increasingly built around the management, distribution, and minimization of risks arising from unforeseen consequences. That is why the most systematic presentation of Luhmann's views on technological autonomy is organized around the "risk society" produced by accelerated technological development. Here technology is essentially tied to the problem of controllability: it is instrumental and presupposes three distinct characteristics: 1) the manageability of processes; 2) the suitability of resources for planning; 3) the possibility of localizing errors [6, p. 88].

Thus, according to Luhmann, technology should be understood as the set of all procedures that lead to the "causal closure" of an operational domain [6, p. 87]. In other words, everything that allows social actors to reduce complexity can be considered technology. Scientific concepts in the form of technoscience, for example, can be regarded as self-producing technologies: technology confirms theories and generates ideas that, when implemented, become the material for producing new technology. When we speak of the concept of "communication," we use a technology that helps us classify phenomena according to the principle of "meaning/lack of meaning." At the same time, communication itself is not simply a tool but a recursively closed autopoietic system that can be determined only by its own structures, not by states of consciousness [4, p. 264]. Within Luhmann's framework, therefore, a simple tool cannot be considered an autopoietic entity. Tools are trivial machines, objects incapable of moving or creating anything on their own. Autopoietic systems, in Luhmann's understanding, are "non-trivial machines" capable of creating and maintaining their own boundaries. In other words, autopoiesis is the emergent ability of complex, operationally closed systems to reproduce themselves [16]. In this regard, an autopoietic system "... not only creates its own structures, like some computers capable of developing programs on their own, but is also autonomous at the operational level. It cannot import any operations from its environment. <...> Such operational closure is just another way of formulating the statement that an autopoietic system generates the operations it needs to generate operations through a network of its own operations" [1, p. 77].
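The contrast between trivial and non-trivial machines invoked here can be made tangible with a short illustrative sketch in Python (ours, not drawn from Luhmann's or von Foerster's texts; the class names and the state-update rule are arbitrary): a trivial machine is a fixed input-output function, whereas a non-trivial machine routes every input through an internal state that the operation itself rewrites, so its behaviour cannot be predicted from its inputs alone.

# Illustrative sketch of the trivial/non-trivial distinction (not from the cited texts).
class TrivialMachine:
    """Fixed input-output rule: the same input always yields the same output."""
    def operate(self, x: int) -> int:
        return 2 * x  # the rule never changes

class NonTrivialMachine:
    """Output depends on an internal state that every operation rewrites."""
    def __init__(self) -> None:
        self.state = 1  # internal state, invisible to an external observer

    def operate(self, x: int) -> int:
        y = x * self.state
        # the operation modifies the machine's own future behaviour:
        # operations produced through a network of prior operations
        self.state = (self.state + y) % 7
        return y

if __name__ == "__main__":
    t, n = TrivialMachine(), NonTrivialMachine()
    print([t.operate(3) for _ in range(4)])  # [6, 6, 6, 6] - fully predictable
    print([n.operate(3) for _ in range(4)])  # [3, 12, 6, 3] - history-dependent

The point of the sketch is only the asymmetry of observation: the trivial machine is exhausted by its input-output table, while the non-trivial one must be described together with its opaque inner states.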

Along these lines, Heinz von Foerster, one of the founders of second-order cybernetics [17, p. 10], held that a non-trivial machine creates and maintains its own internal states, "obeying its inner voice." Developing N. Wiener's idea of a machine that surpasses its creator in intelligence [18], von Foerster raised the systems-cybernetic approach to a qualitatively new level, treating circularity as a manifestation of self-reference. Such systems are not subordinated to an external goal set by a managing entity; they are "self-directed," focused on their own activity and operationally closed. In this context, Luhmann argues that a particular form of technology has emerged, "high technology," which differs from the standard format of instrumentality and triviality. Because of their large-scale consequences, high technologies go beyond the scope of "technical regulation of technologies" [6, p. 89]. This category therefore includes all technologies that, on the one hand, lead to opaque causal relationships and, on the other, produce unpredictable effects that exceed purely instrumental use. In the case of such procedures or complexes, the risks become apparent only during use. As Luhmann argued, high technologies violate the defining form, the boundary between included and excluded causal relations [6, p. 90]. The emergence of this form of technology generates not only epistemological or existential uncertainty but also a new kind of structural dependence. A simple example of such path dependence is the QWERTY keyboard layout. The QWERTY layout was originally developed in 1867-1871 by the American inventor C.L. Sholes for mechanical typewriters in order to reduce the likelihood of keys jamming by placing frequently used letter pairs far apart from each other. As a result, the layout was never optimized for typing speed or efficiency. That is why the management of unforeseen risks arising from the use of high technologies is possible only with the help of additional technologies. Luhmann puts forward what can be called the thesis of "moderate autonomy": technological accidents and disruptions can be dealt with only by using newer technologies. This agrees well with H. von Foerster's observations on non-trivial machines: "A non-trivial machine can change its internal processes in such a way that its drift leads to an expansion of diversity" [19, p. 86].

There are many examples of this escalation logic: the cascading risks associated with nuclear energy, or the numerous serious environmental risks associated with oil and gas exploration. In each case, the solution is always technical in nature; fault-tolerant or corrective technologies are deployed to eliminate or at least mitigate the consequences. Because their capabilities become unpredictable beyond a certain threshold, high technologies acquire a kind of autonomy precisely through this unpredictability. H. von Foerster notes that when non-trivial tendencies appear (the car won't start, and so on), we call in a trivialization specialist who corrects the situation [17, p. 12]. Trivialization here refers to ways of reducing the complexity introduced by an unexpectedly non-trivial technology. Additional technologies, control methods, or a trivial machine inside the machine are required to stabilize overall functionality, and such trivialization succeeds only to a limited extent [6, p. 93]. Instead of reducing overall social complexity, high technology leads to a further increase in complexity. As Luhmann warns, "... the machine can be rebuilt in unexpected ways" [6, p. 93].

It should be emphasized that although Luhmann himself treats technology as part of the "environment" of society rather than as an autonomous social subsystem, many recent studies argue for the opposite view. Technology as a whole, or particular manifestations of it, can be considered an autopoietic system in the sense in which Luhmann used the term. According to the hypothesis of the social researcher A. Reichel, technology has today become a functional system capable of producing differences that have real significance for society as a whole. "Technology," Reichel writes, "establishes a cyclical and recursive relationship between itself and the environment, the physical world, society and people" [20, p. 106]. It is more than a tool subordinated to human intentions. For Reichel, Luhmann's contribution lies in placing people in the environment of society, thereby productively decentering the anthropocentrism that still prevails in many social sciences. Instead of a tool passively serving people, technology is a complex system consisting of technological artifacts, the physical and social environment, and human users [20, p. 105]. Luhmann's other important contribution is the recognition that autopoiesis can exist in inorganic and bio-organic systems as well as in living systems. Neither life nor consciousness is a necessary condition for postulating self-reference and the capacity for self-reproduction. In Luhmann's account, society is built from communications generated by inorganic functional systems, not by individuals. Each functional system has its own medium and binary code, with which it transforms the infinite complexity of reality into information. According to Reichel's thesis, the specific binary code of technology is the "work/failure" distinction, and its medium is "efficiency" [20, pp. 109-111].
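To make Reichel's claim concrete, here is a minimal sketch (our illustration, not taken from Luhmann or Reichel; the Event structure and the sample descriptions are invented for the example): a functional system modelled as an observer that collapses heterogeneous environmental events into its single binary code, in this case work/failure, and discards everything the code cannot register.

# Illustrative sketch: a functional system as an observer that reduces
# heterogeneous environmental events to its own binary code, here the
# "work/failure" code that Reichel assigns to technology.
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Event:
    description: str   # arbitrary environmental complexity
    target_met: bool   # the only property the code can register

def technological_observation(events: Iterable[Event]) -> List[str]:
    """Collapse each event into WORK or FAILURE; everything the code cannot
    register is, for this system, noise rather than information."""
    return ["WORK" if e.target_met else "FAILURE" for e in events]

if __name__ == "__main__":
    environment = [
        Event("server answered within the agreed latency budget", True),
        Event("update broke the payment gateway on Friday night", False),
        Event("sensor drifted but stayed inside tolerance", True),
    ]
    # Moral nuance, economic cost, political context all disappear;
    # only the system's own distinction remains as information.
    print(technological_observation(environment))  # ['WORK', 'FAILURE', 'WORK']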

Indeed, this understanding seems to apply to many, if not all, existing technologies. Of course, autonomy does not mean the complete isolation of one functional system from another. Operational closure still allows for "interpenetration" (structural coupling) and co-development between different functional systems; Reichel, for example, mentions tools that mimic the shape of a human hand [20, p. 112]. One can also use the term "resonance" to describe how different functional systems interact and complicate each other's communications, although Luhmann himself uses this term only in the lecture series "Introduction to Systems Theory" to describe the ability of a social system to sense changes in its environment and adjust its operations and structures accordingly [21, p. 2499]. Social systems, of course, also function as environments for one another. In addition, Bruno Latour's actor-network theory offers one sociologically plausible example of how the concept of agency can be extended to non-human or inanimate entities [22].

In our view, however, technology as a whole cannot be completely reduced to a tool for achieving purely human, social, or other non-technological goals. Even if some technical artifacts are obviously trivial, the technological system as a whole can be considered a non-trivial machine. Following this path, even if one denies the autopoietic character of technology in general, more "moderate versions" of the autonomy thesis remain possible. For example, two main theses have been formulated in the scholarly literature concerning technology and the Internet: homogenization and diversification. Whereas the former interprets cultural globalization as a trend toward uniformity, the latter highlights how the Internet disrupts the public sphere. Whatever we think of these ideas (nothing obliges us to accept either as an axiom), both positions are inherently anthropocentric, since they focus primarily on people's habits of consuming "media content." The same applies to the so-called "techno-optimistic" and "techno-pessimistic" scenarios: they, too, are usually framed from the perspective of people rather than of the technology itself, whether in the optimistic socio-communicative utopia of M. Castells [23] or the pessimistic "infocracy" of J. van Dijk [24]. The limitation of such models is that they are grounded in human meanings and in relative advantages or disadvantages for humanity. User-oriented models of cyberspace therefore cannot fully account for digital technologies as autonomous systems. As a result, we can identify four "epistemological obstacles" that need to be removed before we can think about society in a technosocial way: 1) society consists of concrete (human) individuals and their actions; 2) social integration rests on consensus; 3) societies form regional units; 4) societies can be observed from an external point of view. Luhmann's theory of social systems is useful precisely because it contradicts all four of these anthropocentric assumptions of the social sciences. For Luhmann, society consists of communications and the stimuli transmitted between systems; social differentiation takes precedence over consensus; society is global (functional systems are organized as a world society); and there is no critical vantage point external to society.

Of course, any autopoietic system has the capacity for self-organization and self-stimulation, as well as for selectively coupling with its own environment. The matter is complicated by the fact that cyberspace, unlike the functional systems of society described by Luhmann, does not reduce complexity but, like "high technology," increases it. Cyberspace is a system as complex as its environment. It is only partially an autopoietically closed system: it is autonomous, yet at the same time it exists as an environment for society as a whole, as a genuine "meta-system." Unlike functionally oriented social systems, cyberspace is not limited by any structural or spatiotemporal factors in stimulating its own complexity, which allows it to keep unfolding its internal complexity. Whereas the communications of other social systems are functionally specialized (the economy transmits only economic data, law deals only with legal matters, and so on), cyberspace is able to encompass all forms of communication. There is currently a politically and morally motivated demand for comprehensive legal and political regulation of the Internet, but such efforts ignore the fact that cyberspace is structurally decoupled from its social environment. There is no necessary structural connection between the Internet and other sectors of society, with the exception of its users and the algorithms of social networks. All these factors can therefore be taken as confirmation that the Internet, as an extreme form of technology, is indeed a distinct social system, one that in the first quarter of the 21st century is turning into an independently organized system.

One could object, however, that cyberspace depends on material infrastructure and on support from other social systems. Yet this in no way means that its operations are controlled by those systems. Cyberspace is both dependent on environmental support and functionally independent, a position consistent with Luhmann's theory (the paradox of simultaneous total dependence and total autonomy). Luhmann does not consider this a contradiction [1, p. 201]: for him, autonomy and dependence are two sides of the same coin; systems are autonomous in their operational logic but depend on structural couplings with other systems and with the environment. It follows that, because of its radical autonomy, cyberspace is presumably also more complex than society as a whole. Technosocial reality does not need to select among components; it can absorb everything, turning any content into binary-encoded information. In the case of cyberspace, the highest degree of circularity is found in the networks of computing and data storage that form its hardware basis. Such equipment must be relatively isolated from its environment in order to remain functional: servers, for example, cannot operate in dusty or dirty conditions. Unlike cyberspace as a whole, its infrastructure approaches the ideal of a functionally closed autopoietic system. In this reading, cyberspace is a new ecology, an artificial environment that functions as a shell of society through which all communication passes. Moreover, according to the so-called "gap thesis," technological development has already exceeded the level at which deliberate planning or human control is possible. The technological system has acquired such a degree of complexity that it cannot be subordinated to any political, economic, or subjective goals. Whereas traditional functional systems reduce complexity, the technological system does not exhibit this capability. In the second part of the article, therefore, we supplement the emerging Luhmannian paradigm of technology as an autonomous autopoietic system with Simondon's ideas, not only about the autonomy of technology but also about technology as a process of active becoming.

Simondon's Philosophy of Technology: Autonomy as a process of becoming

Simondon was one of the first twentieth-century thinkers to ascribe an autonomous mode of existence to technology, contrasting it with the human individual. His main work, On the Mode of Existence of Technical Objects (1958), is an attempt to bridge the gap between culture and technology characteristic of the Western tradition and to establish technical objects as full-fledged entities with their own logic of development and their own ontological status. Simondon criticizes approaches that either reduce technology to an instrumental function (as in utilitarianism) or view it as a threat to human autonomy (as in certain strands of existentialism). Instead, he proposes a phenomenological analysis of technology in which a technical object is understood as the result of a process of individuation, a dynamic formation in which matter, form, and environment interact, generating new levels of organization. This triadic process manifests itself for technical objects as "concretization" [9, p. 23] and underlies their evolution from elementary tools to self-organizing technosocial systems. In other words, the ontogenesis of technical existence appears as a movement, a course, a process obeying a "triadic rhythm": "... a technical individual is the middle part of an ascending series in which the element is the starting point and the aggregate is the completion" [25, p. 456].

An important element of Simondon's understanding of the evolution of technical objects is the distinction between the abstract (from the Latin abstraho, "to pull away, to tear off") and the concrete (from the Latin concresco, "to grow together, to condense, to thicken") technical object. An abstract object, in his view, exists only as a scheme or idea subordinated to an external goal (for example, a steam engine of the early industrial revolution, dependent on human control). A concrete object is a system that has achieved internal coherence, in which each part functions in synergy with the others and technical individuality acquires autonomy, materiality, and visibility. An example is a modern internal combustion engine, which integrates cooling, lubrication, and energy transfer into a single self-regulating mechanism. From the engine of 1910, which had no cooling fins, to the engine of 1956, which did, there runs a process of evolution that Simondon calls concretization (an increase in internal functional overdetermination). Concrete technical objects, then, are complex self-regulating systems that are integrated into an associated techno-geographical environment and possess internal coherence [9, pp. 45-47]. The concretization of a technical object, which drives its progress, occurs through the internal adaptation of its elements, which begin to function as a single whole rather than as a set of independent parts. This "transductive process" manifests itself in the growing coherence between elements and the reduction of their contradictions, that is, in the strengthening of relations [26].

As a result, the technical object acquires ever more autonomy, becomes integral, and can be regarded as an independent entity (increasingly an "individual") rather than a simple collection of separate parts. This autonomous process, according to Simondon, "... is not dialectical, because the individual does not negate the element, and the totality does not negate the individual; or rather, we can say that it is a dialectical rhythm without subsequent negations; negation exists only at the moment of transition from one phase to another; it does not exist within each phase, when they move from an element to an individual and from an individual to an aggregate" [25, pp. 456-457]. In essence, Simondon postulates two ways in which a technical object can develop: "transductive" and "dialectical." In dialectical development, a new technical object is created, as it were, at the expense of the old one; in transductive development, the former object remains and its positive transformation (progressive ascent without negation) continues. In this regard, Simondon holds that the autonomy of a technical object does not arise in isolation but through "transduction" [9, p. 62]: technology forms new relations (a trajectory of becoming) that combine the social, the individual, and the material in the dynamics of technical formation. "Since the relations," writes Simondon, "existing at the level of technicality between two technical objects are both horizontal and vertical, cognition operating with genera and species is inadequate: we will try to show in what sense the relation between technical objects is transductive" [9, p. 20].

The concept of transduction (from the Latin transduco, "to lead across, to transfer") is for Simondon not merely a metaphor but a fundamental principle that explains how technical systems overcome crises and reach new levels of organization. A striking example of a transductive process is the crystallization of a physical solution, where an initial "seed" of the crystal triggers a chain reaction of structuring in the medium, transforming an abstract, disordered system into an ordered and stable form: a crystal grows from a tiny germ and expands in all directions within its supersaturated mother liquid. Crystallization is thus the simplest image of a transductive operation: each already formed molecular layer serves as the organizing principle for the layer currently being formed, producing a reinforcing reticular (network-like) structure [13, p. 45]. In other words, a solution in a state of metastability overcomes internal tensions by forming a crystal lattice. This process reflects the transition from the "abstract" (a solution with a chaotic structure) to the "concrete" (a crystal with a clear organization), which corresponds to the Simondonian concept of the concretization of technical objects. Just as a crystal arises from the interaction of the molecules of a solution, technical systems evolve through the integration of components, eliminating redundancy and strengthening functional interdependence. Consider the development of the steam engine from elementary forms to self-regulating systems: each phase (for example, the introduction of the Watt governor) is a transductive leap that resolves the contradictions between thermal energy and mechanical control [9, pp. 45-50]. A key aspect here is the role of the associated environment, the dynamic context in which the object takes shape. In crystallization, the medium (the solution) does not merely surround the crystal but actively participates in its formation, setting the conditions of temperature, concentration, and pressure. Similarly, technical objects, according to Simondon, co-evolve with their environment, whether social, economic, or material. This example not only demonstrates the universality of the Simondonian theory but also highlights its interdisciplinary value, linking the philosophy of technology to natural processes. Crystallization becomes a metaphor for how technologies, passing through phases of instability, acquire "technical individuality," which is especially relevant in the era of networked systems and artificial intelligence, where the autonomy of objects reaches new scales.
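The crystallization image can also be rendered as a toy simulation (our sketch; the one-dimensional medium, the symbols, and the parameters are invented purely for illustration): each already structured cell serves as the organizing template for its unstructured neighbours, so order propagates outward from the seed step by step, much as each formed molecular layer organizes the next.

# Toy sketch of the transductive image described above: a one-dimensional
# "supersaturated" medium in which each structured cell becomes the organizing
# principle for its unstructured neighbours, so order propagates outward from
# a seed, layer by layer.
def crystallize(size: int = 15, seed_index: int = 7, steps: int = 7) -> None:
    medium = ["." for _ in range(size)]   # "." = metastable, unstructured
    medium[seed_index] = "#"              # "#" = structured (the seed)

    for step in range(steps):
        next_medium = medium[:]
        for i, cell in enumerate(medium):
            if cell == ".":
                # a cell takes on structure only if an already formed
                # neighbour supplies the template (the transductive step)
                left = medium[i - 1] if i > 0 else "."
                right = medium[i + 1] if i < size - 1 else "."
                if "#" in (left, right):
                    next_medium[i] = "#"
        medium = next_medium
        print(f"step {step + 1}: {''.join(medium)}")

if __name__ == "__main__":
    crystallize()

Each printed line corresponds to one "layer": the already ordered region, not any external plan, determines where the next increment of order appears, which is the sense of "progressive ascent without negation" used above.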

Simondon identifies three levels of components in technical development: the element, the individual, and the ensemble (sets of technical objects) [9, p. 17]. He emphasizes that the highest stage of technical development is the formation of techno-geographical environments in which objects are combined into networks (ensembles), creating a new ecology of interactions with humans. The human-machine relationship is a two-way process in which machines modulate human activity while humans act like the conductor of an orchestra of machines. Man is the permanent coordinator and inventor of the machines around him [9, pp. 11-12]; he is among the machines that work with him and acts as an interpreter of the human-machine dynamic, since he lives in a society of technical beings of which he is a part. An electrical network, for example, is not merely a collection of wires and generators but an environment that structures the space and time of human activity. Such networks possess both autonomy, that is, the capacity for self-organization (an electrical grid automatically redistributes loads), and transindividuality: they are intermediaries that connect people and objects into collective systems and sustain processes of individuation. According to the social researcher B. Stiegler, a radical shift has taken place since the moment designated as the industrial revolution: we no longer see the gradual improvement of individual technologies and devices, as before; instead, the entire technological system develops rapidly as a whole. This progress is reflected in the convergence and coherence of different technologies and industries, where changes in one area automatically lead to changes in others. Such a dynamic system has a profound impact on a society that struggles to adapt to a rapidly changing technological landscape. This process can be seen both as a threat that destroys traditional cultures and ways of life and as a chance for their radical renewal [27, p. 44].

Simondon holds that such an evolution of technology occurs not through the accumulation of inventions but through the resolution of contradictions between the elements of a system [9, p. 58]. It follows that a technical object is not a static thing but an event that occurs in a "pre-individual field" of potentials. This pre-individual field is metastable and contains the potential for multiple paths of individuation [13, p. 327]. The technical object is never "completed"; it always retains a remainder of the pre-individual, which allows for further transformations. The concept of "pre-individual being" does not designate any primary essence or substance as such; rather, it is a condition of being, namely "... an entity that is more than unity and more than identity, and which has not yet turned into multiplicity" [13, p. 32]. The pre-individual is not the "what" but the "how" of becoming, a field of tension preceding the division into subject and object. The inspiration for Simondon's hypothesis of "pre-individual being" comes mainly from physics, in particular from the thermodynamic concept of "metastability," which describes a state that is neither completely stable nor unstable but lies somewhere in between, containing enough potential to produce a sudden change that leads to a new, equally metastable structure. Although Simondon drew the idea of the pre-individual primarily from the sciences, he often described this concept in explicitly metaphysical terms, sometimes comparing it with the "apeiron" of Anaximander.

The individuation of technology proceeds through phases: from initial instability (for example, a prototype) to a stable form that retains the potential for further concretization [13, pp. 31-35]. This process resembles biological development, in which the organism adapts to its environment; but in the case of technology the environment also includes human practices and cultural codes. In other words, Simondon believes that technical objects, in the process of concretization, approach the mode of existence of natural beings (here, living beings). A concrete object therefore maintains recurrent causal relations with what Simondon calls its associated environment, by analogy with the milieu of a living being. He goes so far as to speak of a "natural technical evolution." Consequently, for Simondon, as for the anthropologist A. Leroi-Gourhan [28], the object of technological knowledge is not an isolated object given in direct experience. It is, rather, the distribution of functions among the various structures of an object, its functional systematics, and the process that gave rise to this systematics by transforming the way different technical operations are coordinated. The object of technology is thus also a process of evolution, not a historical process but one governed by laws of transformation of an operational, functional nature. The concept of evolution does not reduce the purely technical domain of objects to the domain of living beings; it does not aim to naturalize technics or pull them out of the realm of human meanings. On the contrary, it makes it possible to transform this domain into a domain of objectivity distinct from others. Technical objects are autonomous, irreducible, and made accessible by a special kind of knowledge, technological knowledge. Therefore, according to Simondon, by adopting an evolutionary view of technical objects we gain access to their rich content of human meanings.

According to Simondon, a technical object whose structures are multifunctional increasingly resembles an organism rather than a simple unit. This analogy is purely operational in nature, and it is this analogy that gives meaning to the idea that a technical element can be compared with what an organ is in a living body [9, p. 65], a comparison immediately limited to its functional value. Simondon does clarify that a technical element, unlike a biological organ, is detachable and transferable from one technical ensemble to another. He adheres to this idea on the basis of a long tradition of thought. We find it, in particular, in the French engineer and philosopher J. Lafitte [29], who likewise draws an analogy between the technical element and the organ and who, like Simondon, is interested in the integration of machines into technosocial complexes. In chapter III of his work, for example, he analyzes how the industrial systems of the 19th century created "technosocial organisms" in which human labor is integrated into mechanized processes [29, pp. 112-115].

According to Lafitte, machines are organized bodies created by people, whose internal organization demonstrates enough plasticity to exhibit various communicative properties. "Machine" is a general term encompassing an extensive set of mechanisms and devices [29, p. 28]: implements, devices, tools, toys, arched architectural structures are all machines insofar as they are organized bodies. Lafitte further argues that individual machine components (gears, engines, and so on) function like the organs of living beings, performing specialized roles within an integrated system [29, p. 30]. Each element, like a biological organ, depends on the other parts of the system and on the environment into which it is integrated. He emphasizes that technical objects evolve through adaptation to their environment, and that their elements develop toward greater autonomy and interdependence, just as the organs of living organisms are refined in the course of biological evolution. The interest of the analogy, however, lies not in a direct and simple equation of the "technical" with the "biological"; this is not a naturalization of the technical domain. For Lafitte, a living body and a machine alike consist of elementary cells, organically indecomposable elements possessing unique functional properties. In the history of technology, as in the history of living forms, the evolutionary process sometimes consists in the differentiation of functions, which separate into special organs and lead to a more complex organization, and sometimes in the degradation, disappearance, and displacement of certain organs, leading to corresponding changes in functions.

Thus, for Simondon, the question of technology is a profound question of ontological proportions. The ontology he proposes, however, is relational and dynamic along the entire transductive path, giving priority to processes of becoming over static being. Simondon's ontology is an ontology of emergence or, in his own terms, a philosophy of individuation. Accordingly, he treats technology as a mode of existence subject to genesis, and further as a mediator in terms of its effectiveness or operational functioning. Through this mediation, a technical object acquires autonomy in the sense that it acquires functions going beyond the expectations and goals inherent in its invention. Simondon warns against an anthropocentrism that not only absolutizes man or opposes him to technology but also denies the autonomy of technical objects. Instead, he proposes a non-humanistic ethics based on the recognition of the interdependence of all forms of individuation: physical, biological, mental, collective, and technical. In other words, Simondon sees anthropocentrism as a metaphysical error that distorts our perception of technology. He suggests instead a "technology-centered" approach, in which man and technology are equal participants in a single process of becoming. This does not diminish the human; rather, it allows us to avoid blindness in an era when technology defines our technosocial reality. That is why technology, in his view, should be evaluated not by utilitarian or moral criteria but by its capacity to enhance vital potentials [13, p. 245]. Automation, for example, properly understood, frees a person from routine work, allowing them to focus on creative tasks [9, p. 132].

Conclusion

We must acknowledge that today there is hardly a corner of the Earth untouched by digital technologies. From the division of labor to the stratification of access to resources, digital technologies permeate every aspect of our lives. Technosocial autonomy in the digital age is accordingly a concept that reflects the ability of a person, community, or institution to maintain independence, freedom of choice, and control over its activities amid the deep penetration of digital technologies into all spheres of life. It implies a balance between using technology for empowerment and protecting individual and collective rights, privacy, and self-determination from its negative effects. The synthesis of the ideas of G. Simondon and N. Luhmann opens new horizons for understanding technosocial autonomy in the digital age. The Simondonian conception of processuality, in which the autonomy of technical objects arises through continuous formation and the resolution of contradictions in their associated environment, is complemented by Luhmann's understanding of systemic autonomy as the operational closure and self-reference of social systems. Together they form a framework in which technosocial systems appear both dynamic and structured, capable of evolution and of preserving their identity. Simondon's focus on individuation underscores that autonomy is not a static state but a process in which technical objects (from algorithms to robots) acquire "technical individuality" through interaction with material and social conditions. Luhmann's autopoietic systems, by contrast, demonstrate how social structures (law, the economy, culture, and so on) maintain autonomy through internal communicative binary codes, filtering external influences without ignoring them.

The synthesis of these two approaches allows us to see how technosocial systems (for example, digital platforms) simultaneously evolve according to the logic of processuality (adapting to data and users) and preserve systemic integrity (through algorithmic self-regulation). For Simondon, the autonomy of a technical object is impossible without an associated environment that both constrains and fuels its development; Luhmann's systems, while remaining operationally closed, "translate" external impulses into their internal language (the economy, for example, turns environmental crises into financial risks). In a technosocial context, this means that the autonomy of artificial intelligence or blockchain networks depends not only on their algorithmic logic but also on their ability to integrate into social communications without losing their specificity. If the autonomy of technosocial systems rests on their self-organization, how can we ensure the accountability of algorithms or prevent their exploitation by state structures? According to Simondon, the process of technical evolution requires resources, and Luhmannian systems are prone to endless expansion, which raises the question of the limits of the growth of technosocial autonomy in the context of the climate crisis. Technosocial autonomy, viewed through the prism of Simondon's and Luhmann's ideas, turns out to be not a paradox but a dialectical unity: it manifests itself in the capacity of systems for self-renewal and self-preservation (operational closure). This synthesis not only explains the resilience of digital infrastructures but also offers tools for their critical analysis. In a world where technology increasingly becomes an actor of social change, this approach avoids both technodeterminism and social reductionism, paving the way for a more balanced dialogue between human, machine, and society. Technosocial autonomy thus requires not the abandonment of technology but the development of critical thinking, legal mechanisms (for example, the regulation of artificial intelligence), and ethical standards. It is a dynamic process in which technology should serve people rather than subjugate them. The key question of the digital age remains: how can we preserve humanity in a world where algorithms increasingly make decisions for us?

References
1. Luhmann, N. (2013). Introduction to systems theory. Polity.
2. Luhmann, N. (2012). Theory of society. Volume I. Stanford University Press.
3. Luhmann, N. (2013). Theory of society. Volume II. Stanford University Press.
4. Luhmann, N. (1996). On the scientific context of the concept of communication. Social Science Information, 35(2), 257-267.
5. Luhmann, N. (2000). The reality of the mass media. Stanford University Press.
6. Luhmann, N. (1993). Risk: A sociological theory. De Gruyter.
7. Luhmann, N. (1990). Technology, environment and social risk: A systems perspective. Industrial Crisis Quarterly, 4, 223-231. https://doi.org/10.1177/108602669000400305
8. Luhmann, N. (1995). Social systems. Stanford University Press.
9. Simondon, G. (1958). Du mode d'existence des objets techniques. Aubier.
10. Simondon, G. (1964). L'individu et sa genèse physico-biologique. Presses universitaires de France.
11. Simondon, G. (1989). L'individuation psychique et collective. Aubier.
12. Simondon, G. (1994). Gilbert Simondon: une pensée de l'individuation et de la technique. Albin Michel.
13. Simondon, G. (2005). L'individuation à la lumière des notions de forme et d'information. Millon.
14. Simondon, G. (2015). Communication et information. PUF.
15. Baecker, D. (2006). Niklas Luhmann and the society of the computer. Cybernetics and Human Knowing, 13, 25-40.
16. Maturana, H. R., & Varela, F. J. (2001). The tree of knowledge. Progress-Traditsiya.
17. Foerster, H. (1984). Principles of self-organization in a socio-managerial context. In Self-organization and management of social systems: Insights, promises, doubts, and questions (pp. 2-24). Springer.
18. Wiener, N. (1958). The machine smarter than its creator. In Cybernetics: Or control and communication in the animal and the machine. Soviet Radio.
19. Foerster, H. (2013). The beginning of heaven and earth has no name: Seven days with second-order cybernetics. Fordham University Press.
20. Reichel, A. (2011). Technology as system. Towards an autopoietic theory of technology. International Journal of Innovation and Sustainable Development, 5(2-3), 105-118.
21. Clark, C. (2020). Resonanzfähigkeit: Resonance capability in Luhmannian systems theory. Kybernetes, 49(10), 2493-2507. https://doi.org/10.1108/K-07-2019-0490
22. Latour, B. (2005). Reassembling the social: An introduction to actor-network-theory. Oxford University Press.
23. Castells, M. (2002). The internet galaxy: Reflections on the internet, business, and society. Oxford University Press.
24. Van Dijk, J. (2020). The network society. SAGE Publications.
25. Simondon, G. (2016). Sur la philosophie (1950-1980). PUF.
26. Grigorova, Y. V., & Timashov, K. N. (2024). Dialectics and transduction in the philosophy of Gilbert Simondon. Philosophical Journal, 17(3), 76-90. https://doi.org/10.21146/2072-0726-2024-17-3-76-90
27. Stiegler, B. (2018). La technique et le temps. Fayard.
28. Leroi-Gourhan, A. (2000). L'Homme et la matière. Albin Michel.
29. Lafitte, J. (1972). Reflexions sur la science des machines. J. Vrin.

Peer Review


The subject of this article is the phenomenon of technosocial autonomy in the digital age. The author analyzes this phenomenon through a synthesis of the philosophical concepts of Gilbert Simondon (the processuality of technology, individuation, transduction) and the systems theory of Niklas Luhmann (autopoiesis, operational closure, systemic differentiation). The research aims to form a holistic understanding of the interaction of technology and social structures in the context of the growing autonomy of digital technologies.

Research methodology. Methodologically, the work is based on a comparative analysis of two philosophical approaches and their potential synthesis. The author uses the hermeneutic method to interpret the texts of Simondon and Luhmann, extrapolating their ideas to modern technological realities. It is noteworthy that the author recognizes the limitations of the direct application of Luhmann's theory to technology (since Luhmann himself did not consider technology a separate functional system) and suggests a productive development of his concepts in the context of the digital environment. The methodological approach also includes elements of systems analysis in treating technologies as autopoietic systems, and a philosophical-historical method in analyzing the evolution of technical objects in Simondon's concept.

Relevance. The relevance of the research is beyond doubt and is justified by the author at the beginning of the article. In the first quarter of the 21st century, there has been an unprecedented increase in the autonomy of digital technologies, which have ceased to be passive tools and have turned into active agents capable of influencing social structures. Traditional paradigms that reduce technology to passive tools are no longer able to explain modern technological processes. Of particular relevance is the consideration of such contemporary phenomena as high-frequency trading algorithms, the platform economy (using the example of Uber), and algorithmically controlled social networks. The author convincingly demonstrates that understanding technosocial autonomy is "not an academic abstraction, but a necessary condition for the survival of democracy, justice and human dignity in the digital age."

Scientific novelty. The scientific novelty of the work lies in the original synthesis of two philosophical traditions for understanding modern technological processes. The author does not merely compare the ideas of Simondon and Luhmann but forms on their basis a new interpretative framework for the analysis of technosocial autonomy. An innovative move is the application of Luhmann's concept of autopoiesis to technology, especially cyberspace, which is considered a meta-system with autonomous properties. The author develops the thesis of the "moderate autonomy" of technology and demonstrates how high technologies violate the boundary between included and excluded causal relations. An interesting contribution to scientific discourse is the interpretation of cyberspace as a system "as complex as its environment" and capable of "encompassing all forms of communication."

Style, structure, content. The article has a clear logical structure, divided into an introduction, two main parts (devoted to Luhmann and Simondon, respectively), and a conclusion. The presentation style is academic, with a high degree of terminological accuracy. The language of the article is characterized by philosophical depth and conceptual richness. The article covers a wide range of issues, from abstract philosophical concepts to concrete examples of technological processes. The author demonstrates a deep understanding of the works of both philosophers and the ability to interpret them creatively. Particularly valuable is the analysis of Simondon's concept of "transduction," using the example of crystallization as a metaphor for technological development. At the same time, there is a certain imbalance between the parts devoted to Luhmann and Simondon: the first part is considerably more extensive and detailed, which creates some asymmetry in the structure of the article. In addition, in some places the author could have articulated his own position toward the analyzed concepts more clearly.

Bibliography. The bibliographic apparatus of the article is extensive and representative. The author uses primary sources in the original languages (works by Simondon in French, Luhmann in English), which indicates a high level of research culture. It is noteworthy that the bibliography includes not only classical texts by both philosophers but also modern studies of their heritage (works by Reichel, Baecker, Stiegler, and others). The bibliography reflects the interdisciplinary nature of the research, including sources on the philosophy of technology, sociology, cybernetics, and systems theory.

Appeal to opponents. The author demonstrates a balanced approach to possible objections. The article contains elements of a polemic with anthropocentric approaches to technology, which the author, following Simondon, considers a "metaphysical error." The author also anticipates possible objections to the thesis of cyberspace autonomy, noting that "one could object that cyberspace depends on material infrastructure and on support from other social systems," and answers this objection through Luhmann's "paradox of simultaneous total dependence and total autonomy." At the same time, the author could engage more closely with contemporary critiques of technological determinism and position his concept more clearly within current debates about technological development.

Conclusions, interest of the readership. The conclusion of the article follows consistently and logically from the analysis. The author convincingly demonstrates how the synthesis of Simondon's and Luhmann's ideas allows us to form a new perspective on technosocial autonomy in the digital age. Especially valuable is the conclusion that "technosocial autonomy, viewed through the prism of the ideas of Simondon and Luhmann, turns out to be not a paradox, but a dialectical unity." The article is of undoubted interest to a wide range of specialists: philosophers of technology, sociologists, researchers of digital culture, and specialists in systems analysis. The practical significance of the research lies in the formation of a conceptual framework for analyzing the interaction of technology and society, which can be useful in developing regulatory mechanisms for digital technologies and strategies for technological development. The author's final question, "How can we preserve humanity in a world where algorithms are increasingly making decisions for us?", highlights the ethical dimension of the problem and opens prospects for further research in this area. Overall, the article represents a significant contribution to the philosophical understanding of technology and deserves publication in a scientific journal of the relevant profile.