A year or so ago, I read on an online Macintosh site that Apple was "looking into quantum computers" as the possible "computers of the future," or words to that effect. I do not know how accurate the news was (in any case, in this rapidly changing industry I am sure it is, by now, old and irrelevant news); but it was characteristic of a certain type of news story that was rather popular in the media at the time, all implying or suggesting that quantum computers might someday become the successors of today's general-purpose, digital, "classical" computers.

In reality, however, the main reason that quantum computers became a very exciting topic a few years ago was that they appeared to have the potential to solve certain problems much faster—exponentially faster, in fact—than conventional, classical computers ever could. What was, perhaps, not sufficiently emphasized in those news stories is that this potential is entirely algorithmic in nature.

The proposed "quantum hardware" was only useful because it could run the new "quantum algorithms"—a special-purpose kind of "software" that was written using a new computational paradigm, involving operations impossible for classical computers, and that could solve some very specific problems in far fewer steps than had hitherto been thought possible. (An important such problem is integer factoring, which would allow one to crack some widely used encryption algorithms.)

Of course, one can always hope that more quantum algorithms to speed up other specific computing tasks will be discovered, and active research continues in this area. But, in the meantime, if you are a computer engineer or manufacturer, what you might want to know is just how good this quantum computer hardware can be at doing all the million other things that conventional computers do today; because, until you actually see a potential advantage, there is going to be no question of ever replacing your classical hardware with quantum hardware. And so, you may want to ask, does quantum hardware really have anything substantial to offer, apart from the possibility of running quantum algorithms, which are still few and far between?

#### Dubious claims

As a matter of fact, there have been some tentative claims in this direction. Quantum computers, one line of thought goes, are, in principle, dissipation-free, reversible machines; so maybe they will offer a way out of the heating problem faced by conventional, dissipative logical circuits, as their speed of operation continues to increase. But this argument is dubious: for one thing, the history of thermodynamics shows that reversible, or near-reversible, machines tend to be more inconvenient to work with than well-designed, irreversible machines that make use of dissipation intelligently.

We have only recently started to estimate the energy costs involved in trying to keep the operation of a quantum computer as dissipation- and irreversibility-free as possible. As an example, the laws of quantum mechanics themselves dictate that a quantum-logical operation performed with the help of a laser requires more than 10^{5} photons if it is to succeed with an error probability smaller than 10^{-5}.^{1} If these are visible photons, this is an energy cost of almost a million electron volts per logical operation—much worse than the energy cost of today's CMOS gates, which dissipate at most a few hundred eV at much smaller error rates (see "High-speed emulator raises hopes and questions," below).

#### Fundamental constraints

One can hope, perhaps, for solid-state techniques to offer, eventually, some improvements over these very large figures; but there are fundamental physical constraints that cannot be avoided. Basically, the control systems in a quantum computer have to be, in some sense, "large," because they need to be accurately described by classical mechanics; otherwise they become subject to the Heisenberg uncertainty principle themselves! M. Ozawa and I independently drew attention to these fundamental constraints last year.^{2, 3}

The energy requirements I derived depend critically on the error threshold for fault-tolerant computation, and on the quantum bits' intrinsic decoherence time; for instance, in a recent article, I estimate that a "chip" containing quantum bits spaced 1 µm apart would require a power per unit area of 3 kW/cm^{2}, if the intrinsic decoherence time was as short as 1 µs.^{4} Clearly, longer decoherence times (which lower the minimum energy requirements dramatically) will be needed for practical solid-state devices.

These estimates, however, are for quantum computers running quantum algorithms. What if one simply wants to run classical algorithms on quantum hardware? Laszlo Kish and I addressed this question in a recent paper.^{5} We found that there is a different scaling of the error rate with the control energy for "classical" hardware, where the error is dominated by thermal fluctuations, and for quantum hardware, where the error is dominated by the uncertainty principle. The classical scaling law is, energetically, much more favorable.
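The contrast between the two scaling laws can be illustrated with a toy calculation. This is a sketch under stated assumptions, not the analysis of reference 5: I take the thermal error of a classical bit to follow the standard activation form exp(-E/kT), and the uncertainty-limited quantum error to fall only as 1/E (one over the photon number), consistent with the laser-gate figures above; the actual prefactors and details differ.

```python
import math

def classical_energy_eV(error, kT_eV=0.026):
    """Control energy for a thermally limited classical bit at room
    temperature: error ~ exp(-E/kT)  =>  E = kT * ln(1/error)."""
    return kT_eV * math.log(1 / error)

def quantum_energy_eV(error, photon_eV=2.5):
    """Control energy for an uncertainty-limited quantum gate driven by
    visible photons: error ~ 1/n  =>  E = photon_eV / error."""
    return photon_eV / error

for err in (1e-3, 1e-5, 1e-9):
    print(f"error {err:g}: classical ~{classical_energy_eV(err):6.2f} eV, "
          f"quantum ~{quantum_energy_eV(err):.1e} eV")
```

The classical cost grows only logarithmically as the error rate is pushed down, while the quantum cost grows as the inverse of the error rate; this is why the classical scaling law is energetically so much more favorable.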

Quantum error-correcting codes can improve the situation in the quantum regime, but this additional layer of complexity results, among other things, in an energy penalty, and we estimate that it would take, at a minimum, 100 times more energy to perform a conventional logical operation on quantum hardware than on classical hardware with a comparable error rate.

Where, then, does all this leave quantum computers? Certainly not as potential replacements for classical computers for general-purpose tasks. If anything, conventional computer designers should know now that it is better—more energetically favorable—to stop the miniaturization race before it really enters the quantum regime.

But quantum computers remain potentially very powerful special-purpose devices that have stimulated—and will no doubt continue to stimulate—some first-rate research in many important areas of physics, engineering, and computer science. I, for one, would certainly love to see the day when a quantum computer first beats a classical one at factoring a large integer, and fully brings home the power of this new computing paradigm that quantum mechanics has made possible.

#### REFERENCES

- J. Gea-Banacloche, Phys. Rev. A 65, 022308 (2002).
- M. Ozawa, Phys. Rev. Lett. 89, 057902 (2002).
- J. Gea-Banacloche, Phys. Rev. Lett. 89, 217901 (2002).
- J. Gea-Banacloche, Proc. SPIE 5115, 154 (2003).
- J. Gea-Banacloche and L. B. Kish, Fluct. Noise Lett. 3, C3 (2003).

**JULIO GEA-BANACLOCHE** is a professor of physics at the University of Arkansas, 226 Physics Building, Fayetteville, AR 72701; e-mail: [email protected]. He is also an associate editor for Physical Review A in charge of the Quantum Information section. The opinions expressed in this article are strictly the author's own and should not be taken to reflect in any way the editorial policies of Physical Review A or the American Physical Society.

#### High-speed emulator raises hopes and questions

While it is unlikely that quantum computers will ever replace classical computers for general-purpose computing, classical computers have an important role to play in ultimately developing quantum computers, and a Japanese research team has taken what may prove to be an important step in facilitating the process.

The "killer app" for quantum computers is, of course, code breaking, thanks to their ability to solve problems that are considered "intractable" for classical computers. In such intractable problems, which include factoring large numbers for code breaking as well as solving jigsaw puzzles, the computing time increases exponentially, or faster, with the size of the input.

These nondeterministic-polynomial (NP) problems are difficult to solve but easy to check once solved. Efficient computational solutions to them await the massive parallelism of quantum-mechanical processes, which could execute nondeterministic algorithms by, in effect, evaluating all candidate answers simultaneously.

*The quantum information processor chip based on CMOS LSI technology includes 2048 processing elements, emulates 75 quantum bits of data-handling capacity, and operates about 26 orders of magnitude faster than conventional emulators.*

On the road to actually developing a practical quantum computer, however, one item that might help tremendously would be a classical computing system that could emulate quantum computing efficiently enough to facilitate development of quantum-computing algorithms and software. Minoru Fujishima and his research team at the University of Tokyo (Japan) have proposed such a high-speed emulator concept based on CMOS large-scale-integration (LSI) technology. It is expected to process the quantum algorithm developed by Peter Shor at Bell Labs in 1994, for factoring large numbers and breaking codes, about 26 orders of magnitude faster than conventional emulation methodology (see Figure).^{1}

Fujishima's team implemented the system, dubbed a quantum information processor (QIP), on a field-programmable gate array (FPGA) with 1.5 million gates. It drastically reduces the number of classical bits required for emulating the quantum bits (qubits) used in quantum calculations, by representing quantum states with binary code instead of complex numbers and by storing only the location of one of the two possible quantum states instead of the values of all of the possible quantum states. The processor chip contains 2048 (2^{11}) processing elements, each containing 64 quantum-index bits, overall emulating 75 qubits for quantum operation at an 80-MHz clock frequency. "The proposed system will be a powerful tool for the development of quantum algorithms," Fujishima wrote.^{1}
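The scale of the saving can be illustrated with a rough, hypothetical estimate. This sketch assumes a naive emulator stores one double-precision complex amplitude (16 bytes) per basis state; the QIP's actual encoding is more involved than this comparison suggests.

```python
import math

# Naive state-vector emulation: one complex amplitude per basis state.
# The 16-bytes-per-amplitude figure is an assumption for illustration.
qubits = 75
bytes_per_amplitude = 16
naive_bytes = (2 ** qubits) * bytes_per_amplitude

# QIP index-based storage: 2048 processing elements x 64 index bits each,
# per the figures reported for the chip.
qip_bits = 2048 * 64
qip_bytes = qip_bits // 8

print(f"naive state vector: ~{naive_bytes:.1e} bytes")
print(f"QIP index storage : {qip_bytes} bytes")
print(f"memory ratio      : ~1e{int(math.log10(naive_bytes / qip_bytes))}")
```

On this crude accounting, a full 75-qubit state vector would need on the order of 10^{23} bytes, while the index-based representation fits on a single chip; the reported 26-order-of-magnitude figure refers to speed rather than to this memory comparison.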

Laszlo Kish at Texas A&M University (College Station, TX), where Fujishima presented his work in September, suggested that focusing more effort on special-purpose computing approaches based on classical CMOS technology— "perhaps inspired by quantum algorithms such as Fujishima's"—might help to develop efficient code-breaking methods more quickly than focusing exclusively on purely "quantum efforts," which are expected to take several more decades before actually producing a quantum computer on the order of 75 qubits.

Of course, even the 26-order-of-magnitude speed improvement with the proposed QIP falls vastly short of what would be possible with an actual quantum computer, according to Julio Gea-Banacloche at the University of Arkansas (Fayetteville, AR). "Basically, they have found a way to simplify Shor's algorithm so you need fewer classical bits to emulate the performance of a quantum computer," he says. But even with that capability, classical computers are still "nowhere near being able to factor a number of practical interest."

Gea-Banacloche suggests that the rapid-emulation capabilities of the QIP may well prove useful for decoherence studies, error-correcting studies, and simulating the dynamics of some quantum systems.

#### REFERENCE

- M. Fujishima, K. Inai, T. Kitasho, K. Hoh, Extended Abstracts, 2003 Intl. Conf. on Solid State Devices and Materials, 406 (Tokyo, 2003).