Newton’s Theory of Gravity was simple, beautiful and overwhelmingly successful. So why was it eventually replaced by a better one? Not because of a few anomalies. The reason was an ontological inconsistency: a force exerted by one mass on another through empty space, without any physical transfer, is impossible. This inconsistency is what motivated Einstein to invent his revolutionary theory.
At the beginning, this inconsistency was clearly seen. In the course of time, however, it faded into the background: the ontological objections were pushed aside by the success of the theory, even though they were completely cogent and therefore – as it eventually turned out – correct, and an appropriate starting point for the better theory.
All contemporary physical theories contain ontological inconsistencies. Just as in the case of Newton’s Theory of Gravity, these inconsistencies should serve as guidelines in the formation of future theories. So far, however, this potential could not be used, because in the early twentieth century the dogmatic decision was made that we have arrived at the limit of our cognitive ability and that it is impossible to develop concepts which would enable us to describe reality beyond this limit. The discussion of these questions is generally considered settled. It seems plausible to attribute the failure to develop the foundations of physics beyond the Standard Model to the prohibitive effect of this dogma: philosophical reflection cannot be replaced by a purely formal approach.
An example of a long-standing ontological inconsistency: the nonlocal effects of Quantum Theory – effects occurring outside the light cone of the triggering event – which Einstein, Podolsky and Rosen pointed out in 1935. In their paper, nonlocality was used as an argument against the completeness of QT. Some decades later, however, Bell’s Inequality made it possible to show experimentally that QT was correct and EPR were wrong.
It can be shown, however, that Bell’s disproof holds only under the specific conditions of the model presented by EPR. Most physicists believe that the disproof holds in general, and they conclude from it that reality cannot be described locally under any circumstances.
The basic assumption of EPR is that the measurements are made on objects which possess attributes that exist independently of the measurements. At the same time, this assumption is a necessary precondition for the disproof of the EPR model: only under this condition can Bell’s Inequality be derived.
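For reference, this precondition can be stated in standard formal terms (the notation is mine, not taken from the text): if each pair carries a hidden state \(\lambda\) that fixes the outcomes \(A(a,\lambda), B(b,\lambda) \in \{-1,+1\}\) for all analyzer settings \(a, b\) in advance, then the correlations \(E(a,b) = \int \rho(\lambda)\, A(a,\lambda)\, B(b,\lambda)\, d\lambda\) must satisfy the CHSH form of Bell’s Inequality,

\[ |E(a,b) - E(a,b') + E(a',b) + E(a',b')| \le 2, \]

whereas Quantum Theory predicts \(E(a,b) = \cos 2(a-b)\) for polarization-entangled photons, which reaches \(2\sqrt{2}\) for suitable settings.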
Are there actually other possibilities? Contrary to the general conviction, the answer is yes. Suppose, for example, that in an experiment with entangled photons the discontinuous transitions between electron states that are interpreted as photons are caused not by the discontinuous but by the continuous aspect of light – in other words, by a continuous accumulation of the intensity of the light waves. Then the precondition necessary for deriving Bell’s Inequality is not satisfied: prior to measurement, neither the measurement objects nor their attributes exist. The attributes that actually exist are the polarization directions of the light waves whose accumulation causes the transitions – but these are not attributes of the measurement objects (“photons”). The measurement objects – and therefore also their attributes – are produced in the course of the experiment, and thus cannot be regarded as existing independently of the measurement.
Consequently, under these model assumptions the attempt to derive the QT predictions for measurements on entangled photons in a strictly local way succeeds – and it does so in a simple, completely comprehensible manner, as is shown here.
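The author’s local derivation itself is behind the link above and is not reproduced here. Purely as an illustration of the role of the precondition, the following minimal sketch – with an outcome rule of my own choosing, not the author’s model – shows numerically that a deterministic hidden-variable model of the kind EPR assumed, in which a hidden polarization angle fixes both outcomes in advance, stays within Bell’s bound of 2, whereas the quantum prediction reaches \(2\sqrt{2}\):

```python
import numpy as np

# Minimal sketch of an EPR-type local model: each photon pair carries a
# hidden polarization angle lam, fixed at the source; both measurement
# outcomes are predetermined functions of lam and the analyzer setting.
rng = np.random.default_rng(0)
lams = rng.uniform(0.0, np.pi, 500_000)  # hidden states, one per pair

def outcome(setting, lams):
    # +1 if the hidden polarization lies within 45 degrees of the analyzer
    # axis, -1 otherwise (an illustrative deterministic rule, not the
    # author's model)
    return np.where(np.cos(2.0 * (setting - lams)) >= 0.0, 1, -1)

def E(a, b):
    # correlation of the predetermined outcomes at settings a and b
    return float(np.mean(outcome(a, lams) * outcome(b, lams)))

# standard CHSH settings (radians)
a, a2, b, b2 = 0.0, np.pi / 4, np.pi / 8, 3 * np.pi / 8
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(f"local EPR-type model: S = {S:.3f} (Bell bound: |S| <= 2)")
print(f"quantum prediction:   S = {2 * np.sqrt(2):.3f}")
```

This particular rule in fact saturates the bound; no rule of this predetermined kind can exceed it, which is exactly the point of the precondition discussed above.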
If locality can be regained in a model based on continuous waves alone, the obvious next step is to regard the wave-particle dualism itself as ontologically inconsistent. (What else could it be?)
At first, this inconsistency was not visible. When Einstein presented his mechanical impact model of the Photoelectric Effect, he thought of particles embedded in waves, carrying the whole energy and momentum – exactly the objective dualism on which the EPR argument was based. Only much later did it become apparent that it is this very assumption that makes Bell’s proof of nonlocality possible. Thus it became evident that the QT phenomena cannot be understood as particles within waves but must be seen as something that possesses the characteristics of both. As such, however, it is unthinkable and therefore certainly falls within the definition of ontological inconsistency.
Thus it is by no means surprising that the Photoelectric Effect, which stands at the beginning of the wave-particle dualism, can be described not only as a mechanical impact of particles but also – without any additional assumptions – as a pure wave phenomenon. (See here.)
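For reference: whichever picture one adopts, any description of the Photoelectric Effect must reproduce the observed energy balance – the standard textbook relation, quoted here without implying anything about the author’s wave derivation:

\[ E_{\mathrm{kin}}^{\max} = h\nu - W, \]

where \(E_{\mathrm{kin}}^{\max}\) is the maximum kinetic energy of the emitted electrons, \(\nu\) the frequency of the light, and \(W\) the work function of the metal.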
This clear and simple result – the wave description of the Photoelectric Effect – in turn suggests that a correction of our understanding of the foundations of physics, which have been regarded as indispensable since Newton, might be necessary:
Particles and Interactions are seen as an essential basis of every description of nature. However, if fundamental ontological inconsistencies can be eliminated – as in the case of the EPR paradox and the Photoelectric Effect – by the assumption that the QT phenomena are caused by continuous processes (waves), even though the phenomena themselves can only be observed as a discrete sequence, then the conclusion seems cogent that particles are not fundamental elements of physical reality but must be understood as stationary states of continuous processes, or as transitions between such states (see here).
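One standard illustration of what “stationary state of a continuous process” can mean formally – my example, not necessarily the construction intended by the author – is the stationary solution of ordinary wave mechanics:

\[ \psi(x,t) = \phi(x)\, e^{-iEt/\hbar}, \qquad \hat{H}\phi = E\phi, \]

a standing wave whose observable properties are constant in time; transitions between such states then appear as discrete events even though the underlying dynamics is continuous.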
This means: what is described by the models used in physics from Galileo up to now does not represent the ultimate, causative fundament of reality. Of course these essentially mechanical models must not be considered wrong because of this assumption; eventually, however, they lead to counterproductive concepts and, later, to theoretical deadlock.
Further ontological inconsistencies can be found in the Theory of Special Relativity. The question of what actually coordinates the time relations between different systems at any given distance – it must be the very same thing that vibrates in the case of light waves – cannot be answered within the conceptual frame used so far. Even the fact of relativity itself – evident as it may seem, based on the idea that space does not permit any determination of velocity – cannot be derived consistently within this frame. I do not want to go into too much detail here; a more extensive description is given here. In any case it should be mentioned that in the case of SR the elimination of ontological inconsistency and incompleteness leads to the same conclusion as in the case of QT: to the assumption of waves as the basis of reality.
An attempt to deduce the necessary new basis of physics from philosophical arguments is made here. The results are promising: physics is reduced to the metrics and dynamics of space-time, Gravitation and Electromagnetism appear as consequences of one single fundamental law, and physical concepts and relations can be substantiated through metric-dynamic facts.
The shortest version of the point of view described above reads as follows:
Ontological inconsistencies are important guidelines for the development of physical theories. Their elimination points to a continuous basis of reality. The attempt to build the physical description of the world on ontology again leads to the same basis, which seems appropriate for a metric-dynamic unification of physics.
In this way, physics becomes a metaphor for the general development of our civilization. On the one hand, it turns out that the project of natural philosophy can be continued in exactly the way intended by the Enlightenment – as cognition of nature, in contrast to merely formal description – and, moreover, with a degree of understanding that would have been impossible within a mechanistic framework; the necessary knowledge is already partly contained in the existing physical formalisms. On the other hand, this development cannot take place, because the knowledge remains hidden behind a rigid, obsolete interpretation.
This is to be compared with politics and the arts: trapped in a corset of ostensibly indissoluble social structures, inherent necessities and hardly questioned standards, they seem condemned to produce nonsense, even though the existing intellectual and material resources would allow them to fulfill their tasks in a reasonable manner – to an extent that would have been unthinkable in former times.
How can the historical process that has led to this condition be characterized, and what is its cause?
Answering such a question demands choosing a single one of many highly interconnected historical processes and declaring it the most important one – the main reason for all those that followed. This decision is to a certain extent arbitrary and subjective. In the case of the downfall of our civilization, however, the development of all cultural areas fits a certain model so clearly that this arbitrariness seems small enough.
What distinguishes Christian western civilization from other civilizations? The degree of dissociation between subject and object. What are the historical roots of this dissociation?
It originates from the split that is so deeply embedded in Christianity: the split between spirit and body. In order to stay on the right path, a Christian has to control his human nature – his flesh – incessantly; thus the body turns into an object. This figure of thought – a central element of existential Christian orientation – is expressed in the principal Christian symbol: the crucified Christ, who sacrifices himself for our salvation from original sin.
Due to the dominant position that this aggressive, controlling attitude towards the body occupies in Christian thinking and, especially, in Christian education, it becomes a general model for behavior in relationships. This in turn is the necessary condition for the insistent interrogation of nature – in other words, for natural science. Thus, given suitable political and economic conditions, a technology can unfold which permits an ever-increasing command of nature – as has indeed been the case since the late Renaissance.
The order of events follows a stringent inner logic:
Everything that is in any kind of relationship with us is taken out of the unity of this relationship, treated as an object, analyzed and controlled.
This is the first level: Objectification.
Analyzing an object means determining its inner structure and the structure of its relation to us. Thus
the second level is: Functionalization.
Objects are useful or useless, pleasant or unpleasant. Therefore the determination of their function can never be neutral: it is always a function for us. Consequently,
the third level is: Optimization.
Defining an object in terms of profit is an incomplete definition. Every object has meaning within many contexts independently of us – and, if it is alive, it has a meaning of its own, which optimization ignores. Therefore optimization will necessarily turn into
the fourth level: Exploitation.
Finally, if exploitation continues to ignore what the object intrinsically is, there follows
the fifth and last level: Extinction.
Here extinction does not mean physical annihilation – although this can be the consequence – but the complete reduction of the object to what has been determined as its profit for us.
This process comprises everything. It holds for singular objects like livestock as well as for generalized objects like painting and music; it is executed on nature and on humans – literally everything is made an object and subjected to this five-level process.
The destruction of objects, however, is only one side of the process. Since the subject of culture can evolve only through relations with cultural objects, the destruction of the objects means, at the same time, the disappearance of the cultural subject.
Sense is what can be experienced by fulfilling the inner law that arises from object relationships, which form the different layers of identity. Meaning is what can be experienced by relating something to the historically developed systems of expression and interpretation that have evolved from such relationships. Thus, together with the objects, the categories of sense and meaning are destroyed as well. (To illustrate this deprivation, compare a late self-portrait by Rembrandt with the Black Square by Malevich.)
What remains after the decay of these categories? Only those intentions that are closely connected to the biological layer of our identity. In other words, the whole catalogue of values and purposes is reduced to just three elements: money, power and fun.
Of the many further concomitants of the destruction process just described, I will single out only two that I consider particularly important.
1. The incessantly accumulating knowledge replaces the mythological and religious explanations. In the process, the expectation of salvation that was previously addressed to religion switches over to science. This expectation, however, must be frustrated: except in mathematics and, to a certain extent, in physics, there is nothing on which ultimate certainty can be built. There is only limited certainty based on reasoning. There is no salvation.
2. The replacement of the relationship with an object by its pure exploitation leads to a confusion between the description of the object on which the exploitation is based and the object itself. This is fatal, because a description can never approximate the exceeding complexity of reality. If the virtual world is ever mistaken for the real world, it will not be because the two are in fact equivalent but because the experience of the world has degenerated to such an extent that the distinction between blocks and trees, or between emotional agents and humans, disappears.
A consequence of this confusion of object and description is the strange hubris that accompanies every scientific trend. Although it is clear even at a low level of insight that knowledge and technical possibilities are limited, the protagonists invariably consider themselves close to total feasibility. This hubris is a characteristic attribute of artificial intelligence, and it is also a feature of genetics as well as of brain research.
However, the picture of this historical process – which I call the First Enlightenment – would not be complete without emphasizing the truly enormous growth of knowledge and its technical realization.
Can this knowledge work against the cultural and intellectual devastation just outlined and save us from the ultimate downfall?
This brings us to the concept of the Second Enlightenment. In contrast to most prophets of the failure of enlightenment, I do not believe that self-destruction is the destiny of reason. What has just been outlined cannot be called reason. It may be true that the first intellectual flight of science, which required the connection between reason and experiment, could emerge only from the split being and was consequently doomed to failure from the beginning; but it does not follow that the same must hold for every rational way of understanding the world and dealing with it.
Rather, it follows from the analysis carried out so far that science and enlightenment develop their own corrective. We actually know that our way of exploiting all objects is, ultimately, not to our advantage. We have experienced that we are unable to model complex natural processes – as we learn all the time from climate researchers, who must permanently correct their estimates. Or, to give another example: it is actually completely clear that the idea of the genetic improvement of humans is nothing but a ridiculous chimera. If we tried to construct intelligence genetically, we would neither know what to construct – we are not able to define intelligence in terms of neurons – nor, even if the goal could be defined biologically, would we know how to accomplish it. As with most human attributes, intelligence is certainly the result of a highly complex cooperation of many genes, organized in multiple feedback loops, which can be neither formalized nor analyzed statistically.
Generally speaking: it is evident that the project of natural science – in spite of its prodigiousness – is limited, and that an understanding of nature based solely on science is only apparently rational. (To prevent misunderstanding: I am not thinking of going back to completely irrational systems like religion or esotericism.) However, the habitual structures of thinking and acting are so rigid that this knowledge cannot take effect.
The Second Enlightenment means nothing other than integrating this knowledge and changing our behavior accordingly.
The technical limit of scientific thinking has just been exemplified. More important, however, is a limit in principle, which is outlined in the following.
A central element of the scientific paradigm is the completeness assumption. It says: everything that happens can be derived from initial conditions and the laws of nature.
This assumption seems self-evident – just as, at the beginning of the twentieth century, mathematicians were convinced that the completeness of formal systems would soon be proven in general – until Kurt Gödel proved the contrary: consistent formal systems that contain arithmetic can never be complete.
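Stated precisely – a standard formulation, added here for reference: for every consistent, effectively axiomatized formal theory \(T\) that contains elementary arithmetic, there is a sentence \(G_T\) in the language of \(T\) such that

\[ T \nvdash G_T \quad \text{and} \quad T \nvdash \neg G_T, \]

i.e. \(G_T\) is undecidable in \(T\).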
According to the completeness assumption of natural science, all states of nature correspond to propositions of a system of axioms and rules (the axioms and rules being the initial conditions and the laws of nature).
Nature incessantly produces new states, which again correspond to such propositions. The question is: can all these propositions be derived within the system, or are there undecidable propositions, that is, propositions of the Gödel type?
I argue here and here that nature is by no means tame enough to do us the favor of restricting itself to states that correspond to decidable propositions. Instead, as a consequence of the generation of structures of increasing complexity, it produces states that correspond to propositions which are undecidable with respect to any possible set of initial conditions and physical laws. (To keep the train of thought manageable, I have restricted it to mental states; it can, however, be transferred to other areas, e.g. to evolution.)
This implies a limit of science that exists not merely for technical reasons but for reasons of principle. It reads as follows:
Any description of nature by a given system of axioms and rules is incomplete.
Thereby not only mental phenomena themselves, but everything in which humans are involved, is withdrawn from the grip of natural science. Of course some aspects can be described by science, but the true nature of such phenomena is not a subject of science.
In this way the Second Enlightenment finds its metaphysical basis: human beings are always more than their description; understanding them as objects is not only insufficient but essentially wrong. The same holds, depending on the level of complexity, for other objects as well (which thereby in fact lose their object status: they are no longer pure objects but have a meaning in themselves, which is now ontologically justified).
Science loses its function as the guarantor of ultimate justifications. There is no ultimate certainty. There is only self-assurance based on reasoning. This, however, like sense, is part of human nature and not an element that can be derived within a formal system.
The easiest way to understand the difference between the First and the Second Enlightenment is to see enlightenment as emancipation.
The First Enlightenment then means:
Emancipation based on insufficient resources: the models of description are inappropriate for achieving the overambitious goals.
Blindness to the limits of feasibility.
Exploitation and destruction of all objects, whether they be material or intellectual resources or living entities.
No real emancipation – the expectation of salvation is merely transferred from religion to science.
In contrast, the Second Enlightenment means:
Understanding the metaphysical limits of any formal description of the world: science and technology are wonderful achievements in the service of mankind, but they cannot encompass humans themselves completely. Human nature is not part of any scientific description.
The category object is pushed back by the category relation. Nothing is understood as a pure object; everything is part of complex interdependencies, which in their totality cannot be seen as an object either.
Real emancipation – in the sense that all transcendence becomes immanent: there is no absolute point of reference – such as God or the ultimate physical theory – which gives the measure of truth, goodness and beauty; these are always to be defined by ourselves. Only now, thanks to the knowledge gained in the course of the First Enlightenment, is such a definition possible to an extent that makes the kind of emancipation meant from the beginning of enlightenment at long last seem within our reach.
Pandemonium
Recently, the celebrated abstract painter F. met the painting gorilla Hedwig, who – as is generally known – is considered by some of the most distinguished art critics to be the great dark hope for the renewal of postmodernism. Reportedly, the encounter of the two artists was very fruitful. On the very same day, after deep meditation and a retreat into ancient mysteries, the peerless actionist N. succeeded in crapping a figure of such majestic sublimity that many spectators – among them several politicians – spontaneously fell to their knees and burst into tears.
At the very same time, the famous physicist D. consulted the shaman Pregnant Cloud in order to compare the Reduction of the Wave Function with the shaman’s dream of the All-Creating Gaze of the Great White Bird, and to clarify the obvious connections between the physical concept of time travel through wormholes and the closely related shamanic concept of the abolition of time by alcohol and other drugs. Likewise simultaneously, the physicist H., formerly one of the most prominent string theoreticians, at last attained Satori: after a three-year period of heavy suffering, sitting naked and lonely on a tower in Cambridge and chafing his ass down to the tailbone, he realized that the ultimate Theory of Everything is unthinkable, because the Tao that can be thought cannot be the true Tao.
Further amazing events followed immediately: the genetically optimized sheep Kitty – a creature designed by the geneticist A. – declared itself to be much more intelligent than its creator and claimed his job; the brain researcher R. averred desperately that he was merely the unconscious mouthpiece of his neurons and could therefore not stop talking nonsense; the Born-Again Preacher Q. announced the beginning of Armageddon with an accuracy of one millionth of a second (Jerusalem local time, of course); and the philosopher S. emphasized that all these incidents were very serious issues that should be kept under tight surveillance.
I deeply regret not having witnessed these exceptional events directly, all the more since their temporal coincidence cannot be considered accidental but must be attributed to the influence of the morphogenetic field that caused the enormous cosmic tension felt so strongly in recent times. Without any doubt, the long-expected intellectual leap of mankind is impending.
Where will it lead?
Dear Reader! I would not dare to invite you down from these climaxes on the intellectual peaks of the present – where I imagine you breathlessly devoted – into the lost lowlands of simple reason, which seemed to have vanished long ago but has been recovered here, if the yield were not so overwhelmingly rich. It would be irresponsible, however, not to warn you: with reason it is as with other strong drugs – if you are not prepared for it, it can lead to pain, to shock, or even to death by brain arrest, especially after such a long period of abstinence.