Helping computers keep a tight hold on power
AS SCIENTISTS make smaller and faster computers, somewhere down the road they are bound to run into barriers erected by the laws of physics. For instance, the ability of modern computers to wipe out files puts a limit on their power efficiency, because deleting material consumes energy. Therefore, no matter how far computers are shrunk -- by 2000 AD it may be possible to pack about a billion billion individual circuit functions, called logic gates, into a cubic centimetre -- they may not achieve any desired level of power efficiency, simply because they would still dissipate a lot of energy destroying information during computation (Science, Vol 260, No 5106).
That destroying information during computing sets a limit to the efficiency of a computer was realised in the 1970s. IBM's Charles Bennett, who was among the first to recognise the problem, describes it thus: imagine two memory elements in your computer. One (say A) is set to zero; the other (B) is set to 3. If A is made equal to B during a computation, the information that A was equal to zero is discarded. Throwing away that information means wasted energy, and hence a loss of efficiency.
There's a way out of this bind, however, at least in theory: make every process in the computer reversible, so that the information that goes into a computation can be recovered. Instead of setting A equal to B, for example, describe A's new value in terms of its old one, say A+B. Then you will have enough information to retrieve the original A by taking the new A and reversing the operation. Though this solution will still dissipate some energy, the amount can be made arbitrarily small, says the University of Southern California's William Athas, whose group proposed this idea 20 years ago.
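The contrast between the two kinds of update can be sketched in a few lines of Python; the function names here are illustrative, not part of any real reversible-computing library:

```python
def irreversible_update(a, b):
    """Overwrite A with B: A's old value is destroyed, so the
    information that A held (say, zero) cannot be recovered."""
    a = b
    return a, b

def reversible_update(a, b):
    """Set A to A + B instead: the result still carries enough
    information to reconstruct the original A."""
    return a + b, b

def undo_update(a_new, b):
    """Reverse the operation: subtracting B recovers the old A."""
    return a_new - b, b

# Bennett's example: A starts at zero, B at 3.
a, b = 0, 3
a_new, b = reversible_update(a, b)  # A becomes 3, but nothing is lost
a_old, b = undo_update(a_new, b)    # recovers the original A, i.e. 0
```

After the irreversible version, many different starting values of A lead to the same final state, so no procedure can tell them apart; the reversible version keeps the mapping one-to-one, which is what lets the computation run backwards.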
Last year, Athas and his group used energy-efficient transistors -- devices that control the flow of current through different parts of a circuit -- to increase the efficiency of computers. They arranged these transistors into reversible switches, and interspersed among them elements known as inductors, which harvest electrical energy that would otherwise have been lost as heat and feed it back into the power supply. The reversible circuits, the researchers say, are 7.7 times more energy efficient than conventional ones.
But there's a catch: achieving those gains in efficiency entails a thousand-fold loss in computing speed. The ideas are still in the cradle. "I'm sceptical of whether anyone will use this in the near future," says Athas. But some physicists, like Ralph Merkle of Xerox Corp, are more optimistic. "My rash prediction is that reversible circuits will dominate the 21st century," he says.