The Existential Limits of Reason. Vladislav Pedder
process occurs through cyclic feedback:
Prediction: The higher level generates a prediction and sends it down the hierarchy.
Comparison: At the lower level, this prediction is compared with the actual sensory signal.
Error: If there is a discrepancy, a prediction error is generated.
Model Update: The error is sent back upward, where the model is adjusted to improve future predictions.
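The four-step cycle above can be sketched in a few lines of code. This is a deliberately minimal illustration, not a neuroscientific model: the function name, the single scalar "model," and the learning rate are all illustrative assumptions.

```python
# A minimal sketch of the predict -> compare -> error -> update cycle.
# One scalar stands in for the brain's "model"; a learning rate stands in
# for how strongly an error adjusts future predictions.

def predictive_coding_step(model_estimate: float, sensory_input: float,
                           learning_rate: float = 0.1) -> tuple[float, float]:
    """Run one pass of the cycle and return (updated model, error)."""
    prediction = model_estimate               # 1. higher level sends a prediction down
    error = sensory_input - prediction        # 2-3. compare with the signal; the gap is the prediction error
    model_estimate += learning_rate * error   # 4. the error travels back up and adjusts the model
    return model_estimate, error

estimate = 0.0
for signal in [1.0, 1.0, 1.0, 1.0, 1.0]:      # a repeated, predictable sensory input
    estimate, err = predictive_coding_step(estimate, signal)
```

With each repetition of the same input the error shrinks, which mirrors the point made below: when reality matches the prediction, there is little error left to process, and resources are conserved.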
When the real sensory information matches the predictions, the brain minimizes the prediction error, which helps conserve resources. However, if the information does not align with expectations, a prediction error occurs, signaling the need to update the world model.
In the brain’s neural layers, there is a division between “prediction neurons,” which form expectations, and “error neurons,” which signal when predictions are not met. For example, the supragranular layers (the upper layers of the cortex) contain error neurons that activate when something unexpected occurs, while neurons in the deeper, infragranular layers carry the prediction signals.
However, the effectiveness of predictive coding is influenced by various factors, including hormones, neurotransmitters, the microbiota, and injuries. Hormones such as cortisol, produced in response to stress, can alter neuronal sensitivity, affecting the brain’s ability to adapt and learn. Neurotransmitters such as dopamine play a key role in motivation and reward, and can strengthen or weaken particular predictions and responses. The gut microbiota, interacting with the central nervous system, can influence mood and cognitive function, which is reflected in the prediction process. Injuries, especially brain injuries, can disrupt the normal functioning of the neural networks responsible for predictive coding, leading to cognitive and emotional disorders.
Errors in the process of predictive coding can occur for various reasons. They may be related to insufficient accuracy of sensory data, incorrect interpretation of information, or failure to update world models. Such errors can lead to distorted perception and impaired adaptive behavior. For example, during chronic stress, elevated cortisol levels can reduce the brain’s ability to adjust predictions, resulting in persistent perceptual errors and increased anxiety.
Thus, predictive coding is the foundation of adaptive behavior and human cognitive functions. Understanding the mechanisms of this process and the factors that influence its efficiency opens new horizons for the development of treatments for various mental and neurological disorders related to disruptions in predictive coding.
Conclusion
The emergence of the mind is the result of a complex evolutionary process that has led to the development of various forms of intelligence in different species. Predictive coding and Bayesian approaches demonstrate how the brain creates models of the world and adapts to new conditions, minimizing prediction errors. These mechanisms form the basis of our perception, learning, and thinking, making the mind a powerful tool for understanding and transforming reality.
4. Existential Limits of Forecasting
Mental models are internal cognitive structures through which we conceptualize and predict the world. These models help us navigate life by creating more or less accurate representations of reality. However, like any other tool, they are limited. Mental models act as filters through which we perceive the world: simplifications built from experience and expectation that let us interact with the environment more efficiently. Precisely because they are simplifications, they cannot always reflect reality accurately, for the world does not always fit into the frameworks we create for it.
This idea has deep philosophical roots. In the famous “Allegory of the Cave,” Plato depicts individuals who, sitting in a dark cave, can see only the shadows cast by objects positioned in front of a fire. These shadows are a distorted perception of reality, taken as true because the cave dwellers have never seen the light. Only the one who escapes the cave can see the true reality hidden behind the shadows. Plato’s image symbolizes the limitations of our perception, which reflects only a fragment of the full picture of the world.
Much later, Immanuel Kant gave these limitations a systematic form, arguing that we perceive the world not as it is “in itself” (Ding an sich), but through the a priori forms of the mind. Kant held that our knowledge of reality will always be constrained by the categories of the mind, such as space, time, and causality, which are imposed upon our experience and do not exist in the world “in itself.” This means that human perception will always be bounded by these a priori forms, and we can understand and predict only those aspects of the world that fit within them.
The idea that our perception of the world is always limited was later given mathematical form in the tradition that grew out of the work of Thomas Bayes, whom we discussed earlier (Bayes’s essay itself was published posthumously; the famous sunrise example was developed by his followers, notably Richard Price and Pierre-Simon Laplace). The example shows how our models of the world can be updated through observation. A person stepping out of a cave for the first time observes the sunrise and wonders: does this happen every day? With each new observation, they update their belief using Bayesian reasoning, and with every sunrise they strengthen the hypothesis that the sun indeed rises every day. However, if one day this prediction proves false, and the sun does not rise or set in its usual place, they will need to adjust their model of the world in light of the new data.
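The sunrise updating described above can be sketched as a Beta-Bernoulli model, the standard textbook formalization of this kind of belief update. The uniform Beta(1, 1) prior and the count of one hundred sunrises are illustrative choices, not anything fixed by the original example.

```python
# Bayesian updating of the belief "the sun rises every day."
# The belief is a Beta(successes, failures) distribution over the
# probability of a sunrise; each observation updates one of the counts.

def update_belief(successes: int, failures: int, sun_rose: bool) -> tuple[int, int]:
    """Fold a single observation into the Beta posterior's counts."""
    return (successes + 1, failures) if sun_rose else (successes, failures + 1)

successes, failures = 1, 1        # Beta(1, 1): a uniform prior, no observations yet
for _ in range(100):              # one hundred observed sunrises
    successes, failures = update_belief(successes, failures, sun_rose=True)

# The posterior mean is the predicted probability that the sun rises tomorrow
# (with a uniform prior this is Laplace's classic "rule of succession").
p_sunrise = successes / (successes + failures)
```

After a hundred confirming observations the predicted probability is 101/102, close to but never exactly 1: in this framework a single contrary observation (a morning without a sunrise) would still shift the model, just as the text describes.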
Thus, in the Bayesian approach we see a process of continuous updating of our mental models in light of new observations, which echoes Plato’s idea of searching for true reality beyond distorted perceptions. The Bayesian view emphasizes that perceiving and predicting the world are dynamic processes, always subject to adjustment, and that the reality we strive to understand may always be deeper than our current model allows.
These ideas were further developed and expanded by Nate Silver², who explored the principles of forecasting under uncertainty. Silver argues that successful forecasting depends on the ability to distinguish “signal” (important information) from “noise” (random or insignificant data), which relates directly to Bayesian model updating. However, Silver goes further, emphasizing that not all models can be corrected simply by feeding them new data. In a world full of uncertainty and randomness, many predictions turn out to be wrong even when they follow the right methodology.
Silver emphasizes how people often overestimate their ability to interpret data, relying on predictions that seem plausible but may actually be the result of perceptual errors and biases. He explains that it is important not only to consider new data but also to understand the context in which it arises. In this sense, as in Bayesian models, the adjustment of mental models is a process that requires not only observations but also an awareness of the limitations we face when interpreting the world. Silver also underscores that the significance of “noise” in data is often overlooked, and without the ability to separate it from the “signal,” we will not be able to create accurate predictive models, even when using the most advanced data analysis methods.
Thus, like Bayesian theory, Silver emphasizes the importance of continually revising our assumptions and correcting our models of the world. However, unlike classical Bayesian theory, Silver points out the complexity of predictions in the real world, where the signal is often hard to distinguish from the noise, and our ability to make accurate predictions remains limited.
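Silver’s distinction between signal and noise can be illustrated with a toy simulation. Here the “signal” is a constant underlying value and the “noise” is random variation around it; every number in the sketch is an illustrative assumption, and the example is ours, not Silver’s.

```python
# A toy illustration of "signal" vs "noise." The true signal is a constant;
# each observation adds random noise. A forecaster who trusts only the
# latest data point reproduces the noise, while one who pools many
# observations recovers something close to the underlying signal.
import random

random.seed(42)                   # fixed seed so the sketch is reproducible
SIGNAL = 5.0                      # the true underlying value
observations = [SIGNAL + random.gauss(0.0, 1.0) for _ in range(500)]

latest_only = observations[-1]                   # "model" that chases the newest point
averaged = sum(observations) / len(observations) # model that pools the whole history
```

The pooled estimate lands within a small fraction of the noise level from the true signal, while any single observation can miss by the full noise amplitude. This is the quantitative core of Silver’s warning: a plausible-looking prediction built on too little data may simply be a reading of the noise.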
However, despite the fact that our mental models can be updated based on observations, even with all the complexity of predictions, the process of adapting to new data is not infinite. When the world becomes too complex, or when our expectations collide with fundamentally new and unpredictable phenomena, our models encounter limitations that cannot be overcome through conventional methods of adjustment. This opens up an insurmountable gap for the mind – a moment when we find ourselves unable to adapt our predictions to reality.
In such situations, when even the most flexible models prove powerless, the mind experiences a crisis caused by the inability to predict or comprehend what is happening. This
² Nate Silver, The Signal and the Noise: Why So Many Predictions Fail – but Some Don’t (2012).