BLACK HOLE COMPUTERS
• Merely by existing, all physical systems store information. By evolving dynamically in time, they process that information. The universe computes.
In keeping with the spirit of the age, researchers can think of the laws of physics as computer programs and the universe as a computer.
What is the difference between a computer and a black hole? This question sounds like the start of a Microsoft joke, but it is one of the most profound problems in physics today. Most people think of computers as specialized gizmos: streamlined boxes sitting on a desk or fingernail-size chips embedded in high-tech coffeepots. But to a physicist, all physical systems are computers. Rocks, atom bombs and galaxies may not run Linux, but they, too, register and process information. Every electron, photon and other elementary particle stores bits of data, and every time two such particles interact, those bits are transformed. Physical existence and information content are inextricably linked. As physicist John Wheeler of Princeton University says, "It from bit."
Black holes might seem like the exception to the rule that everything computes. Inputting information into them presents no difficulty, but according to Einstein's general theory of relativity, getting information out is impossible. Matter that enters a hole is assimilated, the details of its composition lost irretrievably. In the 1970s Stephen Hawking of the University of Cambridge showed that when quantum mechanics is taken into account, black holes do have an output: they glow like a hot coal. In Hawking's analysis, this radiation is random, however. It carries no information about what went in. If an elephant fell in, an elephant's worth of energy would come out, but the energy would be a hodgepodge that could not be used, even in principle, to re-create the animal.
That apparent loss of information poses a serious conundrum, because the laws of quantum mechanics preserve information. So other scientists, including Leonard Susskind of Stanford University, John Preskill of the California Institute of Technology and Gerard 't Hooft of the University of Utrecht in the Netherlands, have argued that the outgoing radiation is not, in fact, random: it is a processed form of the matter that falls in [see "Black Holes and the Information Paradox," by Leonard Susskind; SCIENTIFIC AMERICAN, April 1997]. This past summer Hawking came around to their point of view. Black holes, too, compute.
Black holes are merely the most exotic example of the general principle that the universe registers and processes information. The principle itself is not new. In the 19th century the founders of statistical mechanics developed what would later be called information theory to explain the laws of thermodynamics. At first glance, thermodynamics and information theory are worlds apart: one was developed to describe steam engines, the other to optimize communications. Yet the thermodynamic quantity called entropy, which limits the ability of an engine to do useful work, turns out to be proportional to the number of bits registered by the positions and velocities of the molecules in a substance. The invention of quantum mechanics in the 20th century put this discovery on a firm quantitative foundation and introduced scientists to the remarkable concept of quantum information. The bits that make up the universe are quantum bits, or "qubits," with far richer properties than ordinary bits.
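The proportionality between entropy and bits mentioned above can be made concrete: a thermodynamic entropy S corresponds to S / (k ln 2) bits, where k is Boltzmann's constant. A minimal sketch of the conversion, using the tabulated standard molar entropy of liquid water (roughly 70 joules per mole per kelvin) as an illustrative input:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def entropy_to_bits(entropy_joules_per_kelvin):
    """Convert thermodynamic entropy (J/K) to bits: n = S / (k_B ln 2)."""
    return entropy_joules_per_kelvin / (k_B * math.log(2))

# One kilogram of water is about 55.5 moles; its standard entropy is
# roughly 70 J/(mol K), so S is on the order of 4,000 J/K.
S_water = 70.0 * 55.5
print(f"{entropy_to_bits(S_water):.2e} bits")
```

The answer comes out on the order of 10^26 bits, registered by the positions and velocities of the water molecules.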
Analyzing the universe in terms of bits and bytes does not replace analyzing it in conventional terms such as force and energy, but it does uncover new and surprising facts. In the field of statistical mechanics, for example, it unknotted the paradox of Maxwell's demon, a contraption that seemed to allow for perpetual motion. In recent years, we and other physicists have been applying the same insights to cosmology and fundamental physics: the nature of black holes, the fine-scale structure of spacetime, the behavior of cosmic dark energy, the ultimate laws of nature. The universe is not just a giant computer; it is a giant quantum computer. As physicist Paola Zizzi of the University of Padova says, "It from qubit."
When Gigahertz Is Too Slow
THE CONFLUENCE of physics and information theory flows from the central maxim of quantum mechanics: at bottom, nature is discrete. A physical system can be described using a finite number of bits. Each particle in the system acts like the logic gate of a computer. Its spin "axis" can point in one of two directions, thereby encoding a bit, and can flip over, thereby performing a simple computational operation.
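The spin-as-logic-gate picture above can be sketched in a few lines. A qubit is a pair of complex amplitudes for "up" and "down," and flipping the spin is the quantum NOT gate (the Pauli-X operation); the representation below is a toy illustration, not a full quantum simulator:

```python
# A spin-1/2 particle stores one qubit: complex amplitudes (a, b) for the
# "up" and "down" directions of its spin axis. Flipping the spin swaps
# the two amplitudes, which is exactly the NOT (Pauli-X) gate.

def pauli_x(state):
    """Apply the bit-flip (NOT) gate to a qubit given as (amp_up, amp_down)."""
    amp_up, amp_down = state
    return (amp_down, amp_up)

up = (1 + 0j, 0j)        # spin up  -> logical 0
down = pauli_x(up)       # spin down -> logical 1
back = pauli_x(down)     # two flips return the original state
```

Unlike a classical bit, the qubit can also hold a superposition such as (0.6, 0.8), and the same flip operation acts on it, one hint of the "far richer properties" quantum bits offer.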
The system is also discrete in time. It takes a minimum amount of time to flip a bit. The exact amount is given by a theorem named after two pioneers of the physics of information processing, Norman Margolus of the Massachusetts Institute of Technology and Lev Levitin of Boston University. This theorem is related to the Heisenberg uncertainty principle, which describes the inherent trade-offs in measuring physical quantities, such as position and momentum or time and energy. The theorem says that the time it takes to flip a bit, t, depends on the amount of energy you apply, E. The more energy you apply, the shorter the time can be. Mathematically, the rule is t = h/4E, where h is Planck's constant, the main parameter of quantum theory. For example, one type of experimental quantum computer stores bits on protons and uses magnetic fields to flip them. The operations take place in the minimum time allowed by the Margolus-Levitin theorem.
From this theorem, a huge variety of conclusions can be drawn, from limits on the geometry of spacetime to the computational capacity of the universe as a whole. As a warm-up, consider the limits to the computational power of ordinary matter: in this case, one kilogram occupying the volume of one liter. We call this device the ultimate laptop.
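The ultimate laptop's maximum speed follows directly from the Margolus-Levitin theorem: convert all of its mass to energy via E = mc^2, and each bit flip takes at least h/4E seconds, so the machine performs at most 4E/h operations per second. A minimal sketch of the arithmetic:

```python
h = 6.62607015e-34  # Planck's constant, J s
c = 2.99792458e8    # speed of light, m/s

mass = 1.0                   # kg: the "ultimate laptop"
energy = mass * c**2         # total energy via E = mc^2, about 9e16 J
ops_per_second = 4 * energy / h   # Margolus-Levitin: one flip per h/4E seconds
print(f"{ops_per_second:.1e} operations per second")
```

The result is about 5 x 10^50 operations per second, some 36 orders of magnitude beyond a gigahertz-class machine, which is why gigahertz is, in this cosmic accounting, too slow.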