In this engaging and mind-stretching account, Vlatko Vedral considers some of the deepest questions about the Universe and the implications of interpreting it in terms of information. He explains the nature of information, the idea of entropy, and the roots of this thinking in thermodynamics. He describes the bizarre effects of quantum behaviour - effects such as 'entanglement,' which Einstein called 'spooky action at a distance' - and explores cutting-edge work on harnessing quantum effects in hyperfast quantum computers, showing how recent evidence suggests that the weirdness of the quantum world, once thought limited to the tiniest scales, may reach into the macro world.
Vedral finishes by considering the answer to the ultimate question: Where did all of the information in the Universe come from? The answers he considers are exhilarating, drawing upon the work of distinguished physicist John Wheeler. The ideas challenge our concept of the nature of particles, of time, of determinism, and of reality itself.
Amazon-Exclusive Author One-on-One: Paul Davies and Vlatko Vedral
Paul Davies: Like most physicists, you base your world view on quantum mechanics. What would it take to convince you that quantum mechanics is a flawed theory that needs to be replaced? Can you devise a straightforward experiment that is feasible in the near future that would test quantum mechanics in a new and crucial way?
Vlatko Vedral: It is indeed depressing that quantum physics has been so consistently accurate over the past hundred years. There is really no obvious deviation from experiments (we physicists would get really excited if there were). The main issue, I think, is how general the quantum superposition principle is: Can any property really be superposed? Roger Penrose, for instance, believes that gravity will prevent superposing a massive object in two different places. Along with many other physicists, I think that this is a technological (not fundamental) problem. On top of this, we are far away from being able to experiment with time and space on scales relevant for quantum gravity. A more interesting issue for me (as well as being more readily accessible to experiments) is the existence of two different types of particles, fermions and bosons. It seems that every particle we observe is either a fermion (electrons, for example) or a boson (photons, for example). But could a particle exist in a superposition of fermion and boson? We are now in a position to be able to attempt to superpose these two properties in practice. If we show that this cannot be done, however, it is not clear what this means for quantum physics. Some of us like to think of everything in the universe as being quantum, and finding limitations even in one aspect would tell us that there might be more out there…
Davies: For many years Stephen Hawking claimed that information is irretrievably lost in black holes. Then he changed his mind. Where do you stand on the issue?
Vedral: If we really succeed in quantizing gravity, then the gravitational field should behave in a reversible manner like any other quantum field. In that sense, there is no information loss in a black hole. Reversibility means that information can always be recovered (I guess this is why Hawking changed his mind, though I have not seen anything in writing on this). However, if we show that gravity indeed wins over quantum physics (whatever this might mean – we don’t really know at present), then there might be some genuine loss out there. This would then (by default) signal the end of the universality of quantum physics. I think that the jury is still very much out on this one, though I would tend to think that gravity will one day be quantized (or will be understood not to be a fundamental force), in which case the loss of information is probably not fundamental.
Davies: When humans communicate, a certain quantity of information passes between them. But that information differs from the bits (or qubits) physicists normally consider, inasmuch as it possesses meaning. We may be able to quantify the information exchanged, but meaning is a qualitative property – a value – and therefore hard, maybe impossible, to capture mathematically. Nevertheless the concept of meaning obviously has, well… meaning. Will we ever have a credible physical theory of “meaningful information”, or is “meaning” simply outside the scope of physical science?
Vedral: This is a really difficult one. The success of Shannon’s formulation of “information” lies precisely in the fact that he stripped it of all “meaning” and reduced it only to the notion of probability. Once we are able to estimate the probability for something to occur, we can immediately talk about its information content. But this sole dependence on probability could also be thought of as the main limitation of Shannon’s information theory (as you imply in your question). One could, for instance, argue that DNA has the same information content inside as well as outside a biological cell. However, it is really only when it has access to the cell’s machinery that it starts to serve its main biological purpose (i.e. it starts to make sense). To put it in your own words, DNA has meaning only within the context of a biological cell. The meaning of meaning is therefore obviously important. Though there has been some work on the theory of meaning, I have not really seen anything convincing yet. Intuitively, we need some kind of “relative information” concept: information that depends not only on probability but also on its context. I am afraid that we still do not have this.
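Shannon’s reduction of information to probability can be made concrete. The sketch below (an illustration, not anything from the interview itself) computes Shannon entropy, H = -Σ p·log₂(p), the average information content in bits of a source with the given outcome probabilities; the function name is my own choice:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: one full bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))   # about 0.47 bits
```

Note that nothing in the calculation cares what the outcomes *are* - only how likely they are. That is exactly the stripping-away of meaning Vedral describes.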
Davies: Quantum entanglement enables nature to process information exponentially faster than a Newtonian universe would. But could a different mechanics – neither Newtonian nor quantum – process information even faster still? Is there a “Vedral mechanics” with “vbits” that could outperform qubits in a race to find the answer to a mathematical question? If so, tell us about it!
Vedral: Oh, how I’d love to have a Vedral mechanics and vbits. Unfortunately, quantum physics is very successful and resists being replaced. However, based on the scientific progress so far (and, after all, it can’t be that we are so smart to figure out the ultimate theory after just 350 years of using the scientific method) I bet that there will be a new mechanics one day (albeit discovered by someone else – I am willing to bet quite a lot on this one). At present, and as far as I am concerned, this probably lies in the realm of the “unknown unknowns,” to borrow Donald Rumsfeld’s phraseology. The need for a new theory will, I think, come from a completely unexpected direction: There are things that we simply don’t know we don’t know.
Davies: In a system with more than about 400 entangled qubits, the quantum description entails more parameters (e.g. branches of the wave function) than there are particles in the universe. In fact, it entails more parameters than the total number of (classical) informational bits in the universe. Thus even an omniscient demon that performed a measurement and knew every bit of information about the universe that it is even in principle possible to read out and know, could not predict the behavior of the system. Does this therefore represent a fundamental cosmological limit to the predictability of quantum systems? Indeed, a new fundamental limit to what is knowable? Are we being idealistic to believe that quantum mechanics applies accurately when it involves more mathematical objects than could ever in principle be written down in the real universe, even by using up all its available resources?
Vedral: This is a deep question and I often think about it (mainly at night, like with all deep questions). Let me restate it slightly. We believe that all observable properties in quantum physics can be captured with mathematical objects called operators. And, more importantly, we believe that anything that is an operator can be observed (this is one of the postulates of quantum physics). However, as you illustrated above, there are things that we might never be able to measure due to lack of memory space, even though they represent legitimate mathematical operators. In this sense, one may argue that quantum physics contains the seeds of its own destruction: It has in its foundations things that prove that they cannot be there! We have not really had to think about this question in the past, since technologically we could never handle more than 20 qubits in a fully coherent manner. But now, with the rapid progress in various quantum computational technologies, it would not be surprising if we arrived at 400 qubits within 10 years or so. What would this mean? One possibility is: not much. Maybe in order to understand the behavior of objects with 400 qubits or more, we don’t need more than a handful of observables that capture all the essence. After all, this is how we do solid state physics (here we are talking about at least a billion qubits). We don’t want to know all properties of a macroscopic solid, but only how well it conducts electricity and heat and how it responds to external stimuli such as a magnetic field. The other possibility, however, is that we need to radically change the way we understand the world. Your argument would then imply that there is a more fundamental limitation to our understanding of the universe than that implied by Heisenberg’s uncertainty principle. It is simply the fact that the universe has a finite number of bits in it!
Does this mean that there is a complementarity in what we can measure due to finite space over and above the quantum complementarity? Some people have in fact argued that quantum complementarity is nothing but a consequence of the finite space complementarity! However, like I said, I only think about this question during sleepless nights, and I’ve not had anywhere near enough of them to begin to do this question the justice it deserves.
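The counting argument behind Davies’s question can be checked with a few lines of arithmetic. A general state of n qubits requires 2^n complex amplitudes to write down; the sketch below compares 2^400 with a commonly cited order-of-magnitude estimate of around 10^80 particles in the observable universe (the particle count is a standard rough figure, not taken from the interview):

```python
import math

n = 400
amplitudes = 2 ** n              # complex amplitudes in a general n-qubit state

# Rough, commonly cited order-of-magnitude estimate (an assumption here).
particles_in_universe = 10 ** 80

# The state description dwarfs any conceivable classical storage medium.
print(amplitudes > particles_in_universe)   # True
print(round(math.log10(amplitudes), 1))     # roughly 120 orders of magnitude
```

Around 10^120 parameters against at most 10^80 particles to record them on: even an omniscient demon runs out of universe long before it runs out of wave function.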