02 September 2010

Theology and the new physics: the rediscovery of the observer

5.10 The observational basis of quantum theory

Einstein’s explanation of the null result of the Michelson–Morley experiment led to a radical revision of our understanding of space and time. If anything, the explanation for Lord Kelvin’s other ‘cloud’ – the spectrum of black-body radiation – has led to even more radical changes in our understanding of the world.

5.10.1 The ultraviolet catastrophe

In line with Kelvin’s warning, the first crack in the edifice of classical physics came with attempts to explain the colour of hot objects using classical thermodynamics and electromagnetism. The light from such objects is a mixture of different frequencies (colours), and observations reveal a distinctive spectrum (a characteristic pattern of energy distribution across those frequencies). Attempts to explain this in classical terms failed abjectly: they predicted that the amount of energy radiated would grow without limit towards the high-frequency (violet) end of the spectrum – the so-called ultraviolet catastrophe.

Enter Max Planck. In 1900 he suggested that physics should abandon the assumption that electromagnetic energy is continuous and wavelike. If, instead, energy can only be absorbed and emitted in discrete packets (or quanta), theory can be made to fit observations exactly. However, while his suggestion certainly gave the right answer, its abandonment of a cherished assumption of classical physics gave it an air of contrivance that led to its relative neglect for several years.
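
In symbols (using the standard textbook notation rather than Planck’s own), the contrast can be put like this. Classical theory gives an energy density per unit frequency of

$$u_{\text{classical}}(\nu, T) = \frac{8\pi\nu^{2}}{c^{3}}\,kT,$$

which grows without limit as the frequency ν increases – the catastrophe. Planck’s assumption that energy is exchanged only in whole quanta of size E = hν leads instead to

$$u_{\text{Planck}}(\nu, T) = \frac{8\pi h\nu^{3}}{c^{3}}\,\frac{1}{e^{h\nu/kT} - 1},$$

which falls away sharply at high frequencies and fits the observed spectrum.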

5.10.2 The photoelectric effect

Another anomaly that concerned physicists at the beginning of the century was the ability of light to eject electrons from metal. The principle is simple – the light imparts energy to electrons which then effectively ‘evaporate’ from the surface of the metal. The classical analogy with the evaporation of water suggests that some degree of evaporation should occur regardless of the frequency of the light, provided it is sufficiently intense. In reality, there is a clear threshold frequency, which varies from metal to metal, below which the effect will not occur.

It was Einstein who, in 1905, rehabilitated Planck’s quantum theory and explained this anomaly by assuming that the energy imparted by the light is packaged (quantised) in a manner related to the frequency of the light rather than spread evenly over the wavefront. Furthermore, he assumed that the way in which electrons absorb that energy is also quantised – so that they can only acquire the energy necessary to escape if the light is of sufficiently high frequency. Light of a frequency lower than this threshold has no effect, regardless of the intensity of the source.
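
Einstein’s account can be compressed into a single relation (in modern notation):

$$E_{\text{max}} = h\nu - \phi,$$

where hν is the energy carried by one quantum of light of frequency ν, φ (the ‘work function’) is the minimum energy needed to free an electron from that particular metal, and E_max is the greatest kinetic energy an ejected electron can have. Electrons escape only if hν exceeds φ, that is, only above the threshold frequency ν₀ = φ/h; below it, increasing the intensity simply supplies more quanta, each individually too feeble to do the job.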

5.10.3 Collapsing atoms and spectral lines

In 1896 Antoine-Henri Becquerel had discovered an entirely new physical phenomenon – radioactivity. This growth area in physics rapidly led to the realisation that atoms are not simply inert billiard balls. On the contrary, they have an internal structure. By the end of the first decade of the twentieth century sufficient research had accumulated for physicists to begin making models of this structure. It became clear that atoms have a very small, dense, positively charged nucleus surrounded by negatively charged electrons.

Ernest Rutherford proposed a planetary model for the atom – electrons in orbit around a nucleus like planets around a star. The fly in the ointment was electromagnetism. A charge moving in a circle is continually accelerating, and an accelerating charge radiates energy. If electrons were classical particles radiating in this way, they would very rapidly lose all their energy and spiral into the nucleus.

A solution was offered by a young Danish physicist, Niels Bohr, whose model of the atom was mentioned in 1.9. Again the key was the abandonment of continuity in favour of quantisation. Bohr simply ruled out the possibility of electrons occupying every possible orbit. Instead they are confined to certain discrete energy levels. Although outlandish, his suggestion had the added attraction that it explained another anomaly – the fact that the light emitted by hot gases is emitted only at certain frequencies (spectral lines).
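
For hydrogen, Bohr’s quantisation rule yields the familiar ladder of allowed energies,

$$E_{n} = -\frac{13.6\ \text{eV}}{n^{2}}, \qquad n = 1, 2, 3, \dots,$$

and light is emitted only when an electron drops from one rung to a lower one, with a frequency fixed by hν = E(initial) − E(final). Since only certain jumps are available, only certain frequencies appear – precisely the pattern of discrete spectral lines observed in hot gases.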

5.10.4 When is a particle a wave?

The above phenomena indicated that under certain circumstances light can behave in a particle-like manner rather than its usual wave-like manner. The next step in the development of a quantum view of the world was due to an aristocratic French physicist, Prince Louis de Broglie. In his 1924 doctoral thesis, de Broglie proposed that, under certain circumstances, particles might be observed behaving in a wave-like manner. His prediction was confirmed in 1927 by Clinton Davisson and Lester Germer. They found that low-energy electrons fired at a nickel surface were deflected into a series of low and high intensity beams. In other words, diffraction – a characteristic property of waves – occurred.
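
De Broglie’s relation assigns a wavelength to any particle of momentum p:

$$\lambda = \frac{h}{p}.$$

For the slow electrons used by Davisson and Germer this wavelength is of the same order as the spacing between atoms in the nickel crystal, which is why the regularly spaced atoms act on the electron beam as a diffraction grating.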

Similarly, if you take a beam of electrons and pass it through a pair of slits (a classical wave experiment), you get diffraction and interference – properties characteristic of a wave rather than a particle. If you reduce the intensity of the electron beam to a single electron at a time, the detector on the other side of the slits will still gradually accumulate a trace that looks like an interference pattern (see Figure 5.1). Explaining this in classical terms is impossible – if the electrons go through one slit or the other the pattern would look quite different.[1] 
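
Quantum mechanics handles this by assigning each electron an amplitude for each route and adding the amplitudes rather than the probabilities:

$$P(x) = \left|\psi_{1}(x) + \psi_{2}(x)\right|^{2} = |\psi_{1}|^{2} + |\psi_{2}|^{2} + 2\,\mathrm{Re}\left(\psi_{1}^{*}\psi_{2}\right).$$

The cross-term at the end is the interference pattern. A classical particle picture, in which each electron definitely passes through one slit or the other, would give only the first two terms and no fringes.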

5.11 The quantum revolution
By the early 1920s, these anomalies had grown into a gaping hole in the fabric of physics. At the same time, the explanations proffered by physicists such as Einstein and Bohr held out the promise of a radical reconstruction. The task of integrating these insights into a coherent theory of sub-atomic physics fell to Werner Heisenberg and Erwin Schrödinger. Although they were working independently and their formalisms looked very different, the two approaches were soon shown to be equivalent and were merged into quantum mechanics. This new theory constituted a radical shift in the conceptual foundations of physics. We mention here three key aspects of quantum mechanics.

(i) Wave–particle duality

The electron diffraction predicted by de Broglie and confirmed by Davisson and Germer highlights one of the fundamental features of the new theory – wave–particle duality. This duality is enshrined in the fundamental equation of quantum mechanics, the Schrödinger wave equation – so called because it takes a mathematical form characteristic of classical wave equations. However, the equation does not describe physical waves but probabilities, e.g. the probability of finding an electron in one location rather than another. The final outcome may be determinate (an electron in a particular location), but the probability distribution of the possible outcomes has the mathematical form of a wave. This peculiar feature of a very successful equation has led to the intractable problem of how quantum mechanics should be interpreted (see 5.13).
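
In its time-dependent form, for a single particle moving in a potential V, the equation reads

$$i\hbar\,\frac{\partial \psi}{\partial t} = -\frac{\hbar^{2}}{2m}\nabla^{2}\psi + V\psi.$$

The wave function ψ is not itself observable; on the standard (Born) interpretation, |ψ(x)|² gives the probability of finding the particle at the point x if a measurement is made there.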

(ii) Uncertainty

Uncertainty is one of the best-known implications of quantum mechanics. In 1927 Heisenberg argued that key physical quantities (e.g. position and momentum) are paired up in quantum theory. As a result, they cannot be measured simultaneously to any desired degree of accuracy. Attempts to increase the precision of one measurement result in less precise measures of the other member of the pair.
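
For position and momentum the limit takes the well-known form

$$\Delta x\,\Delta p \ \geq\ \frac{\hbar}{2},$$

where ħ is Planck’s constant divided by 2π: the product of the two uncertainties can never be reduced below this value, however refined the apparatus.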

Take an electron, for example. We might try to determine its position by using electromagnetic radiation. Because electrons are so small, radiation of very short wavelength would be necessary to locate it accurately. However, shorter wavelengths correspond to higher energies. The higher the energy of radiation used, the more the momentum of the electron is altered. Thus any attempt to determine the location accurately will change the velocity of the electron. Conversely, techniques for accurately measuring the velocity of the electron will leave us in ignorance about its precise location.

Looked at conservatively, this is an epistemological issue: quantum uncertainty is a principle of ignorance inherent in any measuring technique used on such a small scale. However, Heisenberg himself took a more radical view – he saw this limitation as a property of nature rather than an artefact of experimentation. This radical interpretation of uncertainty as an ontological principle of indeterminism implies that quantum mechanics is inherently statistical – it deals with probabilities rather than well-defined classical trajectories. Such a view is clearly inimical to classical determinism. Equally clearly, this is a metaphysical interpretation that goes beyond what is required by the mathematics of the Uncertainty Principle.

(iii) Radical interdependence

In spite of his crucial role in the early development of quantum mechanics, Einstein was very uneasy about its implications and, in later years, organised a rearguard action against it. His aphorism ‘God does not play dice’ highlights the depths of his distaste for quantum uncertainty. His strongest counter-argument was to call attention to a paradoxical implication of quantum mechanics now known as the Einstein–Podolsky–Rosen (EPR) Paradox.

Take, for example, a pair of protons whose quantum spins cancel out. Now separate them and measure the spin of one proton. Because they were paired, the two are described by a single combined wave function. Measuring the spin of one proton ‘collapses’ that wave function and thereby determines the spin of the other. It appears that a measurement in one place can have an instantaneous effect on something that may be light years away.
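
In the standard notation, such a spin-zero pair is described by the entangled (‘singlet’) state

$$|\psi\rangle = \frac{1}{\sqrt{2}}\bigl(\,|\uparrow\rangle_{1}|\downarrow\rangle_{2} - |\downarrow\rangle_{1}|\uparrow\rangle_{2}\,\bigr).$$

Neither particle has a definite spin of its own; the state says only that whenever the two are measured along the same axis, the results will be opposite.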

For Einstein this was proof that quantum mechanics must be incomplete. To him the result only made sense if the spins were already determinate (but unknown to us) before the protons were separated, in which case measurement would merely reveal what had been the case all along. But, according to the orthodox interpretation of quantum mechanics, it is not merely a matter of ignorance: the spin is not determined until it has been measured. In other words, the pair of protons cannot be regarded as separate entities until the measurement has been made.

Some years later the physicist John Bell turned this paradox into a testable prediction that now bears his name – Bell’s Inequality. This is an inequality that must be satisfied if two principles (assumed by Einstein and his colleagues in formulating the EPR Paradox) hold in the world:
The principle of reality: that if the value of a physical quantity can be predicted with certainty without disturbing the system, then that quantity corresponds to an element of physical reality, and
The locality principle: that a measurement in one of two isolated systems can produce no real change in the other.
Taken together, these principles imply an upper limit to the degree of correlation that is possible between measurements made on separated systems. In 1982 a team of physicists at the Institut d’Optique in Orsay, near Paris, led by Alain Aspect, demonstrated experimentally that this limit is exceeded in nature. In other words, our physical descriptions of the world in which we live cannot be both real and local in the above sense.
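
In the version most often tested (the CHSH form of Bell’s Inequality), that limit reads

$$|S| = \left| E(a,b) - E(a,b') + E(a',b) + E(a',b') \right| \ \leq\ 2,$$

where the E’s are correlations between spin (or polarisation) measurements made along pairs of directions a, a′ on one system and b, b′ on the other. Any real, local theory of the kind Einstein envisaged keeps S within this bound; quantum mechanics predicts values up to 2√2, and it is the quantum prediction that the Aspect experiments confirmed.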

What this means in practice is a greater emphasis on describing quantum-mechanical systems as wholes. This runs counter to the tendency of classical physics towards ‘bottom-up thinking’ – treating systems as collections of separate entities and trying to reduce their properties to the individual properties of the simplest possible components. The quantum world, which deals with the simplest entities we know, seems to resist this reduction – it is, in Karl Popper’s famous phrase, ‘a world of clouds’ as well as ‘clocks’ (quoted in Polkinghorne, 1991: 44). ‘Bottom-up’ thinking has served science extremely well; we simply indicate here that it has its limitations.[2]

5.12 Shaking the foundations

The quantum view of the world departs from classical assumptions in four main ways.

1.         Determinism has given way to an emphasis on probabilities. Precise deterministic predictions are no longer available, and this is widely held to be a feature of the world itself rather than a mere limitation of our observations.
2.         Reductionism has given way to a more holistic approach to physical systems.
3.         Closely allied to this, locality (the impossibility of information being propagated instantaneously) has given way to correlation-at-a-distance.
4.         Most basic of all, the classical assumptions of continuity and divisibility (that between any two points there is an infinite number of intermediate values) have given way to quantisation – for certain physical quantities, the range of permissible values is severely restricted.

5.13 Schrödinger’s cat and the meaning of quantum theory

The EPR Paradox described in 5.11 (iii) introduces us to one of the basic problems of quantum mechanics – the relationship between measurement and reality. This is highlighted by a famous thought-experiment involving a hapless cat. The cat is in a box together with a canister of poisonous gas connected to a radioactive device. If an atom in the device decays, the canister is opened and the cat dies. Suppose that there is a 50–50 chance of this happening. Clearly when we open the box we will observe a cat that is either alive or dead. But is the cat alive or dead prior to the opening of the box?

Interpretation (i) Quantum orthodoxy (Copenhagen interpretation)

The dominant view in quantum mechanics is that quantum probabilities become determinate on measurement – that the wave function is collapsed by the intervention of classical measuring apparatus. This means that the cat is neither alive nor dead until the box is opened. The cat is in an indeterminate state.
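
In the formalism, before the box is opened the combined system is assigned a superposition of the two possibilities, written schematically as

$$|\Psi\rangle = \frac{1}{\sqrt{2}}\bigl(\,|\text{no decay}\rangle\,|\text{cat alive}\rangle + |\text{decay}\rangle\,|\text{cat dead}\rangle\,\bigr),$$

and on the Copenhagen view it is only the act of observation that reduces this to one of the two terms.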

This interpretation is usually allied with a tendency to extreme instrumentalism (see 1.7). On such a view the probabilities generated by the Schrödinger wave equation do not correspond to any physical reality. There simply is no reality to be described until an act of measurement collapses the wave function. Quantum mechanics is merely a useful calculating device for predicting the possible outcomes of such acts of measurement.

In spite of its dominance in the textbooks, this interpretation is hardly satisfactory. To begin with, it may be regarded as proposing a dualism in physical reality: two worlds – an indeterminate quantum world and a determinate classical world. Then there is the problem of what constitutes classical measuring apparatus. At what level does the wave function actually collapse?

The act of measurement that collapses the wave function can hardly be limited to scientific instruments. Why should our scientific measurements alone be responsible for collapsing the wave function? If they were, we would be left with a most peculiar world – one that remained indeterminate until the evolution of hominids.

Some physicists, e.g. Wigner and Wheeler, have identified the classical measuring apparatus of the Copenhagen interpretation with consciousness. If so, they must be using a much broader definition of consciousness than is usual. What level of consciousness would be needed to make something determinate? Is the cat sufficiently conscious to determine the outcome of the experiment? Would earthworms do? What about viruses? The effect of pursuing this line of inquiry is to move towards a form of panpsychism – the doctrine that every part of the natural world, no matter how humble, is in some sense conscious!

An alternative might be to postulate a transcendent world observer – a divine mind whose observations collapse the wave functions on our behalf. In effect this would be the quantum-mechanical version of Bishop Berkeley’s idealism.[3] This is memorably summarised in a couple of limericks:

There was once a man who said ‘God
Must think it exceedingly odd
If he finds that this tree
Continues to be
When there’s no one about in the quad.’

And the reply:

Dear Sir, Your astonishment’s odd:
I am always about in the quad.
And that’s why the tree
Will continue to be,
Since observed by Yours faithfully, God.

The problem with this attractive solution to the measurement problem is that it proves too much. Invoking a divine observer leads to the question of why there should be any quantum measurement problem at all. Why should anything be left indeterminate for us to determine by our measurements? Is God only interested in those aspects of creation that are above a certain size?

Returning to the classical measuring apparatus, perhaps we should put the emphasis on ‘classical’ rather than ‘measuring’ – stressing not so much our intervention in the system as a transition from the world of the very small, in which quantum principles operate, to the everyday world of classical physics. This neo-Copenhagen interpretation has the merit that it avoids the absurdities of the consciousness-based approaches. However, we are still faced with the difficulty of identifying an acceptable transition point. How small is small? (There is now evidence that a molecule of fullerene containing around seventy carbon atoms can exhibit wave-particle duality.) One suggestion is that we choose the level at which physical phenomena become so complex that they are irreversible. Another, from Roger Penrose, is that gravity provides the key (Penrose, 1994: ch. 6; 1997: ch. 2).

Interpretation (ii) Hidden variables (neo-realism)

Einstein was not alone in finding this interpretation of quantum mechanics objectionable. A few physicists have persisted in arguing that the statistical nature of quantum mechanics implies that it is only really applicable to ensembles of particles (just as an opinion poll is only meaningful if a reasonable sample of the population has been polled). In other words, quantum mechanics is an incomplete description of reality. They maintain that underlying this level of indeterminacy there is an objective foundation.

The best-known hidden-variables theory is that of the physicist and philosopher David Bohm (see Bohm, 1980). What Bohm did was to distinguish between the quantum particle, e.g. an electron, and a hidden ‘guiding wave’ that governs its motion. Thus, in this theory electrons are quite clearly particles. When you perform a two-slit experiment, they go through one slit rather than the other. However, their choice of slit is not random but is governed by the guiding wave, resulting in the wave pattern that is observed.
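
In the simplest (one-particle) case the scheme can be written down compactly: the wave function ψ obeys the ordinary Schrödinger equation, while the particle follows a definite trajectory x(t) whose velocity is fixed by ψ through the theory’s ‘guidance equation’,

$$\frac{d\mathbf{x}}{dt} = \frac{\hbar}{m}\,\mathrm{Im}\left(\frac{\nabla\psi}{\psi}\right)\bigg|_{\mathbf{x}=\mathbf{x}(t)}.$$

The particle’s initial position is the hidden variable; provided the initial positions are statistically distributed according to |ψ|², the theory reproduces the usual quantum predictions.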

The main weakness of Bohm’s theory is that it looks contrived – which it is. It was deliberately designed to give predictions that are in all details identical to conventional quantum mechanics. His aim was not to make a serious counterproposal but simply to demonstrate that hidden-variables theories are indeed possible.

It is sometimes suggested that hidden-variables theories have been ruled out by the Aspect experiment (5.11 (iii)). This is a misunderstanding of the experiment. What it did was to show that attempts to explain quantum phenomena cannot be both deterministic and local. Hidden-variables theories, with their underlying determinism, must be non-local, maintaining the existence of instantaneous causal relations between physically separated entities. Such a view contradicts the simple location of events in both classical atomism and relativity theory. It points to a more holistic view of the quantum world. Indeed Bohm himself stressed the holistic aspect of quantum theory in his later years, after his conversion from Marxism to theosophy.

Interpretation (iii) The many worlds interpretation

The third main class of interpretations starts from the assumption that scientific theories ought to be self-interpreting. The Schrödinger wave equation in quantum mechanics is smooth, continuous and deterministic. There is nothing in it that corresponds to the collapse of the wave function.

In 1957 Hugh Everett surprised his more conventional colleagues by proposing that the wave function, evolving according to the Schrödinger equation, is as a whole an accurate description of reality. There is no collapse of the wave function. Whenever there is a choice of experimental outcomes, all the possibilities are realised. Somewhere Schrödinger’s cat will be really dead and somewhere it will be really alive. With each such choice at the quantum level the universe splits into a number of isolated domains, each corresponding to a different outcome. In one universe the cat dies, in another it lives.

Most physicists find this extremely unattractive. One of the most venerable assumptions of the scientific method is Ockham’s razor – non sunt multiplicanda entia praeter necessitatem; i.e. entities are not to be multiplied beyond necessity. In practice this leads to a very strong aesthetic bias in favour of the simplest possible explanation.

Only quantum cosmologists beg to differ. They attempt to apply quantum mechanics to the entire universe. Clearly this leaves no room for a separate classical measuring apparatus. In this context, a many-universes approach such as was described above may seem an attractive non-theistic alternative to the notion of a transcendent world observer. But one wonders which option requires the larger act of faith!

5.14 Quantum consciousness

In classical mechanics, with its close association with Cartesian dualism, the physical world was neatly divorced from the realm of consciousness. As far as classically minded materialists were concerned, the latter was a mere side effect of biochemical interactions. However, as noted above, the dominant Copenhagen interpretation of quantum mechanics envisages a greatly expanded role for the observer. Granted the traditional association of temporal perception with consciousness, the rediscovery of time by modern physics may point in the same direction.

Such considerations have given rise to the suggestion that consciousness itself may be interpreted as a quantum phenomenon. Perhaps the best-known proponent of a quantum explanation of consciousness is Roger Penrose (1989; 1994; 1997). He rejects the currently popular view that human consciousness is essentially computational (that minds are analogous to computer programs) because, in his opinion, this model fails to account for intuitive problem-solving. On his account the brain must be non-algorithmic (i.e. it does not operate by mechanically following a fixed set of procedures in the manner of a computer). Further, he argues that classical physics is inherently algorithmic in nature. Thus consciousness is not explicable in classical terms.

The obvious candidate for a non-algorithmic process in physics is the quantum-mechanical collapse of the wave function. Penrose suggests that the brain uses quantum collapse to solve problems non-algorithmically. But by what means? He pins his hopes on structures called microtubules that occur within cells, speculating that quantum effects within the microtubules may be co-ordinated across groups of neurones to provide the basis for such intuitive processes. However, as many physicists and neurophysiologists have pointed out, this is highly speculative – a weakness that Penrose himself acknowledges.


[1] For a discussion of the two-slit experiment in terms of quantum theory see Davies, 1990:108–11.
[2] For a brief discussion of ‘bottom-up’ and ‘top-down’ thinking see Peacocke (1993: 53–55). See also our ‘note on emergence’ in 6.11.1. We take up the question of ‘top-down causation’ in relation to divine action in 10.9(iii)[c].
[3] George Berkeley (1685-1753) – a philosopher famous for his apparent denial of the reality of any external world.
