We are in the case where the network has no central intelligence, and where each vehicle carries its own intelligence and is responsible for adjusting to the environment to which it is permanently connected.
The intelligence is distributed, a “diffused web of links” within the system. In this case, it is impossible to extract or address a part of the fabric in order to model or mathematize it. The understanding of the phenomena is then endogenous, intimately linked to the interrelations that the “Intelligence” itself maintains within the fabric. The “Intelligence” is an integrated stakeholder in the understanding of the system. It can no longer reduce and massively transform the dynamics of the whole; it can only act, as a stakeholder, on the dynamic reality of the fabric of links at a given moment. Its only capacity of intervention is a capacity of influence, of alliance with the fabric, in order to inflect its systemic dynamics from within. To return to the example of the network of driverless vehicles: no intelligence will be able to stop all the vehicles instantly and simultaneously.
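A toy simulation can make this limit concrete. The following Python sketch (my illustration, not the author's model; the ring topology and the neighbour-only braking rule are assumptions chosen for simplicity) shows that in a purely local network, a “stop” intervention introduced at one node can only propagate link by link, never take effect everywhere at once:

    # Minimal sketch (illustrative assumptions): a ring of driverless
    # vehicles with no central controller. Each vehicle reads only its
    # immediate neighbours, so a "stop" signal spreads link by link.
    N = 20                      # number of vehicles in the ring
    stopped = [False] * N       # local state of each vehicle
    stopped[0] = True           # an "intelligence" influences a single node

    steps = 0
    while not all(stopped):
        steps += 1
        nxt = stopped[:]
        for i in range(N):
            # A vehicle brakes only once a neighbour has braked.
            if stopped[(i - 1) % N] or stopped[(i + 1) % N]:
                nxt[i] = True
        stopped = nxt

    print(f"{steps} local steps needed; no step halts all vehicles at once.")

The point is structural rather than numerical: the intervening intelligence is itself just one node exerting influence through the fabric of links, never a controller addressing the whole system simultaneously.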
It will intervene on an infinite system of interactions, one that includes itself and the infinite interactions it has with the fabric. It will then perceive the system as unmanageable or uncontrollable in its entirety, for two reasons. The first stems from infinite complexity: each mesh is made up of smaller meshes, down to the infinitely small. The second comes from the fact that the “Intelligence” is itself included in the locality; being part of the whole, and of the modeling and transformation actions, it cannot conceive the whole in its entirety. There is recursivity in the interaction between intelligence and system. This is not only a problem of synchronic complexity; it is also due to the capacity to change direction at the local level.
Unlike the first conception, this “Intelligence” can no longer think it can massively control the system from the outside. Indeed, total control would amount to controlling itself; above all, the infinite complexity of the system hinders accurate and predictive modeling. This imposes greater caution on it, because in the event of an error or a runaway it cannot imagine massively correcting the damage caused, including to itself. We are then faced with an endo-contributive conception, i.e. one that is endogenous and contributory. The “Intelligence” will have to be content to contribute from the inside to the dynamics of the whole. It will have to accept that any reduction of the system into simple elements is incomplete, and that any formal model that permits this reduction cannot demonstrate its own universal relevance, because it always rests on a certain number of predicates that cannot be formally demonstrated within the theory they generate (Gödel's second incompleteness theorem, 1931).
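For reference, the theorem invoked here admits a standard formulation (a textbook statement, not the book's own wording). If $T$ is a consistent, recursively axiomatizable theory that interprets elementary arithmetic, and $\mathrm{Con}(T)$ is the arithmetical sentence expressing “$T$ is consistent”, then

$$T \nvdash \mathrm{Con}(T),$$

that is, $T$ cannot prove its own consistency: the guarantee of the formal system must always come from outside the system itself.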
1.3. Our relationship to language is in question
These two conceptions (ENC and EXD) can also be characterized, in each case, by analyzing the “Intelligent” activity of substantiation, in the sense of acquiring the capacity to name, to create a signifying noun, which, extending the notion of place, amounts to designating a formal locality. This is the fundamental capacity of human intelligence in relation to language: designating an object, a set of objects or a phenomenon of the system (as a set of coherent properties modeled and identified by the Intelligence) by a common noun, verb or adjective that denotes it. In the case of the exo-distributive model, this activity will first seek to present itself as accurate, because it describes what is unequivocally observed and interpreted from the outside. The causal relationship (cause/effect and effect/cause) will be considered objective and not subject to interpretation. Deterministic evolutionary equations will be considered accurate within a specific range of validity. In the endo-contributive case, on the other hand, the activity of substantiation cannot be isolated from the system itself. The modeling of the system creates the system. There is a recursivity intrinsic to the so-called “Intelligent” activity: the subject creates the phenomenon or the object as interpretation, and therefore as substantiation, which in turn creates the subject as part of the system characterized by the phenomena.
It is important to note that, in real life, most of the systems encountered will be hybrids of these two conceptions, notably depending on the scale at which we conceive or observe them.
1.4. The epistemological mutation of the sciences
The epistemology of science is an exciting field of study, and one of renewed interest today, as the question of the boundary between physics and metaphysics is at the heart of contemporary debate. This stems, in particular, from the increasingly shared sense, when looking closely at the current technoscientific revolution, that we are entering not the unknown but the unknowable. It is hard to believe what is happening. The constantly updated work on the origin of knowledge and the understanding of its deep mechanisms of constitution is increasingly indispensable for understanding and analyzing the contemporary world, not least owing to the emergence of the most bizarre scenarios about the exo-somatization of the capacities of the human brain. “Even if we could hope to understand everything, we would still have to understand what it means to understand” (Max Planck's 1944 speech).
I would like to put forward here the idea, which will be substantiated in the rest of the book, that the constitution of knowledge at the heart of the mathematical–physical relationship is fundamentally exo-distributive, whereas what is at work in the life sciences is by nature endo-contributive. Indeed, there is an epistemological divide between the sciences of inert matter, based essentially on physics as a relationship to reality, and those of the living world, sustained by biology. To defend this point of view, which will be developed further in the following chapters, it is appropriate to return briefly to the foundations and history of knowledge.
The reading of two books, together with a series of interviews, has recently advanced and challenged my thinking on the epistemology of science. These works will be widely cited in the sections that follow: Quantum Physics, a Philosophy (Bitbol 2008) and Mathematics and Natural Sciences (Bailly and Longo 2011). Indeed, these readings have led me to push some of my earlier convictions or intuitions further, and have overturned others.
The first is the role played by the mobility achieved by humans in particular who, at the top of the pyramid of living things, have been able to abstract themselves from their physical relationship to time and space and derive the first spatio-temporal formalisms. This genesis of formalisms is at first very much linked to sensory perceptions: toying with infinite continuity; the perception of the successor in time or space as the initiator of the succession of whole numbers; then the emergence of proportions, opening onto the fraction, whose combination with the concept of the successor leads to the rational numbers. We can also cite the concept of the straight line as the shortest path from one point to another, the circle and so on. Each time, as Bailly and Longo (2011) tell us, a formal construction principle is put into action and, through its confrontation with a principle of proof, acquires its status of “truth”. Thus, principles of construction and principles of proof are intimately linked: within mathematics itself, between semantics and syntax (or logicism), and also between mathematics and experimental physics. In addition, other ancient notions, mathematized much later, appeared, some as a condition of humankind's survival. One example is the notion of causality, of which causes induce which effects: there is no smoke without fire; the sudden fleeing of a flock can indicate the presence of a dangerous predator, and so on. Another notion arose from sensory experience well before its mathematization: the probability that something will occur. Agrarian societies soon grasped the idea of the likelihood of rain; today we would speak of a probability based on a certain number of factors, but still without certainty.
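For the reader who wants this chain made explicit, the construction alluded to here can be written in standard notation (my formalization, not Bailly and Longo's). Starting from zero and the successor operation $S$,

$$\mathbb{N} = \{0,\; S(0),\; S(S(0)),\; \dots\}, \qquad S(n) = n + 1,$$

and proportions, combined with the successor, yield the rationals as equivalence classes of pairs of whole numbers:

$$\mathbb{Q} = \Bigl\{\, \tfrac{p}{q} \;:\; p \in \mathbb{Z},\; q \in \mathbb{N},\; q \neq 0 \,\Bigr\}, \qquad \tfrac{p}{q} = \tfrac{p'}{q'} \iff p\,q' = p'\,q.$$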
But this process is not only deductive; it is also inductive, when experience and measurement require an evolution of the formalism. This leads to a fruitful interaction through co-fertilization, through the “physicalization” of mathematics. Direct access to tangible reality, to experimentation and direct measurement, where the observable and the measurable merge, will exhaust this phase of the history of the physical sciences, with Newtonian physics at its peak. We could say that this whole process conforms to the principle of intuition. What is born in the principle