Interesting People mailing list archives
'A Shortcut Through Time': Quantum Weirdness
From: Dave Farber <dave () farber net>
Date: Sun, 06 Apr 2003 06:23:58 -0400
'A Shortcut Through Time': Quantum Weirdness
April 6, 2003
By JIM HOLT

If you take a word for a perfectly ordinary activity and stick "quantum" in front of it, you get something that sounds mysterious and powerful -- or perhaps bogus. I have no idea what "quantum healing" or "quantum creativity" or "quantum investing" might be about. I have, however, heard quite a bit about "quantum computing." The idea seems to have been born in the early 1980s in the mind of the physicist Richard Feynman. Since then, grandiose claims have been made for the quantum computer. In 1995, Discover magazine said it "would in some sense be the ultimate computer, less a machine than a force of nature." One proponent, David Deutsch, maintains that quantum computing can prove the reality of parallel universes. The physicist and mathematician Sir Roger Penrose, in a couple of best-selling books, has linked it to the secret of human consciousness. Quantum computing would seem mysterious and powerful indeed, assuming it is not bogus. So one wants to know: Has anyone ever built a quantum computer? How are quantum computers supposed to work? And, most important, what could one do for me?

Responding to a challenge posed by a magazine editor, George Johnson has written a blessedly slim book, "A Shortcut Through Time," that gets across the gist of quantum computing with plenty of charm and no tears. Computer science is hard; quantum mechanics is weird. But Johnson, who contributes science articles to The New York Times and is the author of four previous popularizations, explains it all with Tinkertoys and clocks and spinning tops and just a little arithmetic. It's a briskly told story, driven entirely by ideas. Only when I got to the end of it did I realize that I wasn't quite as excited about the advent of the quantum computer as the author felt I should be.
All computers, regardless of their hardware, embody the same idea: information -- numbers, words, images, sounds -- can be represented by anything that can be in one of two distinct states. A switch that can be either in the on or in the off position will do the trick. In the most powerful conventional computers, these switches are tiny silicon transistors. Each switch represents a binary digit, or "bit." The more switches you have, the bigger the numbers that can be represented. Ten switches, for instance, can represent any one of the numbers from 0 to 1,023.

Now consider a quantum computer. Quantum theory explains how the world works at the atomic level. One of its many incomprehensible features is that it allows things to be in two contrary states at the same time. An atom, for example, can spin like a top. You'd think a given atom would have to be spinning either clockwise or counterclockwise. But quantum theory tells us that if you hit an atom with a pulse of light of the right duration, it will enter a "superposition" in which it is doing both. Suppose we think of the atom as a switch, with clockwise spin meaning "off" and counterclockwise spin meaning "on." Then a single spinning atom can represent 0 and 1 at the same time. A row of 10 such quantum bits, or "qubits," can therefore be made to store not just one number from 0 to 1,023 but all of these numbers simultaneously.

Superposition is not the only magic that this new kind of computer relies on. There's also "entanglement." Quantum particles are said to be entangled when their fates are inextricably linked; if one is spinning clockwise, say, the other one has to be spinning counterclockwise. (Einstein regarded this as "spooky.") In a quantum computer, such dependencies are in effect the wiring among the switches. Thanks to superposition and entanglement, you can, by zapping our row of 10 spinning atoms with a laser gun, do a computation on all 1,024 numbers at a single stroke.
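The counting argument here is easy to check for yourself. The following is a toy Python sketch, not a real quantum simulator: it contrasts ten classical switches, which pin down exactly one number, with the 2^10 = 1,024 complex amplitudes that describe a ten-qubit register in equal superposition. The particular bit pattern chosen is illustrative.

```python
import math

N_BITS = 10

# Classical: ten on/off switches encode a single number from 0 to 1,023.
switches = [1, 0, 1, 0, 0, 0, 0, 0, 0, 0]   # bit 0 is least significant
classical_value = sum(bit << i for i, bit in enumerate(switches))

# Quantum (simulated): an equal superposition gives every number the same
# amplitude, 1/sqrt(1024), so all 1,024 values are "present" at once.
amplitude = 1 / math.sqrt(2 ** N_BITS)
state = [amplitude] * (2 ** N_BITS)

print(classical_value)                        # one definite number: 5
print(len(state))                             # 1024 amplitudes
print(round(sum(a * a for a in state), 6))    # probabilities sum to 1
```

The squared amplitudes are the probabilities of each measurement outcome, which is why they must sum to one.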
It is this amazing quantum parallelism that affords what Johnson calls "a shortcut through time." But when the computation is over, how do you read the results? Since you started with a great big mixture of questions, you're left with a great big mixture of answers. And quantum theory says you can't see each of them individually. When you try to measure a quantum system, the superposition collapses, and one of the answers pops out at random; the rest are destroyed. To get around this restriction, the quantum computer exploits a third kind of quantum weirdness, called "interference." The multiple answers held in superposition -- which are sometimes thought of, rather extravagantly, as existing in multiple universes -- must be made to interfere with one another. Some answers are mutually reinforcing; others tend to cancel. With the right kind of massaging by laser pulses, the superposition collapses to a final result that reveals something about all of the parallel computations.

That's how a quantum computer works in principle. In practice, there are two problems: the hardware and the software. First, the hardware. The guts of a quantum computer would certainly be compact: a single molecule of 13 atoms strung together, too tiny to see with a microscope, could outpace Blue Mountain, the supercomputer covering a quarter of an acre and used at Los Alamos National Laboratory to simulate nuclear explosions. So far, however, the record size for a quantum computer (set in 1999) is only seven atoms, and the researchers could get the little machine to hang together for only half a second -- just long enough to execute a couple of hundred computational steps. Quantum computers don't have to be made of atoms; any particle that can be manipulated into a superposition of two states will do for a qubit. (One rather exotic version mentioned by Johnson has been described as "a computer in a cup of coffee.") But all the technologies tried have proved extremely fragile.
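The cancellation being described can be seen in the smallest possible case. Below is a toy sketch in pure Python, assuming the standard one-qubit Hadamard gate: applying it once puts a qubit into an equal superposition, and applying it again brings the qubit back to a definite 0, because the two computational paths leading to the "1" outcome carry opposite signs and cancel.

```python
import math

# The Hadamard gate: rotates a qubit into (or out of) equal superposition.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    """Multiply a 2x2 gate into a 2-amplitude state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

zero = [1.0, 0.0]            # the qubit starts as a definite 0
superposed = apply(H, zero)  # equal amplitudes: (0.707..., 0.707...)
back = apply(H, superposed)  # the two paths to "1" cancel exactly

print([round(a, 3) for a in superposed])  # [0.707, 0.707]
print([round(a, 3) for a in back])        # [1.0, 0.0]
```

The minus sign in the Hadamard matrix is the whole story: it is what lets amplitudes subtract as well as add, which no arrangement of classical probabilities can do.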
That leaves quantum computer scientists, as one of them put it, "writing the software for a device that does not yet exist." But the software side is tricky too. If a quantum computer streaks past a classical computer in power, it limps behind in flexibility. You can't just sit down and write a quantum program that would, say, model the weather. Because quantum logic will not let you look at intermediate answers without destroying the computation, even getting a quantum computer to accomplish something as simple as factoring a number into its divisors needs a touch of genius.

Yet in 1994, Peter Shor, a mathematician at Bell Labs, created a lot of excitement by managing to do just that. Johnson gives a heroically lucid account of how "Shor's algorithm" works, and he also explains why it is potentially dangerous: it could be used to crack the codes that secure electronic communications. These codes rely on the practical impossibility of factoring very large numbers. To break a number with 400 digits down into its constituents, for example, would take the fastest conventional supercomputer billions of years. For a quantum computer programmed with Shor's algorithm, this could be the work of a moment. Destroying our ability to encrypt messages could be the "killer app" of quantum computing. But Johnson also describes a new kind of quantum cryptography, related to quantum computing, that would restore the security of communications.

So where does that leave us? What benefits would the quantum computer bring? Here it is worth reminding ourselves of something important by saying it together, loudly and slowly: a quantum computer can't do anything that a conventional computer can't do, given enough time. (All right, there is one exception: a quantum computer, unlike a deterministic conventional one, can produce genuinely random numbers.) Its advantage is the speed that arises from parallelism.
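The skeleton of Shor's algorithm can be sketched entirely classically. In the toy Python sketch below, the period-finding step -- the one piece a quantum computer actually accelerates -- is done by brute force, and then ordinary number theory turns the period into factors. The values N = 15 and a = 7 are illustrative choices, not from the review.

```python
from math import gcd

def find_period(a, n):
    """Smallest r > 0 with a**r % n == 1 (brute force; the quantum part)."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

N, a = 15, 7              # a must share no common factor with N
r = find_period(a, N)     # r = 4: 7^4 = 2401 = 1 (mod 15)
assert r % 2 == 0         # an even period lets us split N

# Classical postprocessing: a^(r/2) - 1 and a^(r/2) + 1 each share a
# nontrivial factor with N.
p = gcd(a ** (r // 2) - 1, N)
q = gcd(a ** (r // 2) + 1, N)
print(r, p, q)            # 4 3 5 -- and indeed 3 * 5 = 15
```

The brute-force loop above takes time that grows exponentially with the number of digits in N, which is exactly the cost Shor's quantum period-finding circumvents; everything before and after that loop is cheap on any computer.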
Johnson gives a clear account of how this speed would allow the quantum computer to handle certain problems that grow very fast in complexity, like factoring large numbers. Yet, he concedes, it looks unlikely that quantum parallelism can breach the complexity class containing the problem of the proverbial traveling salesman (who is looking for the shortest itinerary through a list of cities) and the problem of protein folding in the cell -- let alone the still harder class into which mathematical theorem proving and (probably) chess playing fall. So it is doubtful that the quantum computer will usher in "a mathematical renaissance."

Even if it's not about to change the world, quantum computing -- lying at the intersection of physics, mathematics, computability theory and even philosophy -- still has enormous intellectual richness. In this little book, Johnson succeeds in showing us both where it is and how rapidly it's progressing. The man should be arrested for violating Heisenberg's uncertainty principle.

Jim Holt writes the Egghead column for Slate.com.

http://www.nytimes.com/2003/04/06/books/review/06HOLTLT.html?ex=1050624014&ei=1&en=2532b2cd691516e4

Copyright 2003 The New York Times Company

------ End of Forwarded Message