Small is big: a cellphone chip that allows monthly battery charge
San Francisco
Gaming fans rushed last November to purchase a new gadget that would let them communicate with computers through body language. The device, called Kinect, "reads" players' gestures through video and infrared sensors. Players can control street-fighting or soccer-playing video game avatars just by moving their arms and shifting their stance.
This kind of communication blows away the joysticks used to play Pac-Man 30 years ago. You might say that Kinect has achieved a kind of fragmentary approximation of what brains do.
"I see that kind of technology becoming more pervasive and requiring huge amounts of computing power," says Stephen Furber, a computer chip engineer at the University of Manchester, in England.
If less energy-hungry chips are developed, cellphones could fully replace laptop computers, and users could charge those phones once a month on a saucer-sized solar panel.
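A rough calculation suggests why that scenario hinges on the chips rather than the panel. The sketch below is a back-of-the-envelope check; the panel size, cell efficiency, daily sunshine, and battery capacity are all assumed round numbers, not figures from this article.

```python
# Back-of-the-envelope check of the "charge once a month on a saucer-sized
# solar panel" scenario. Every number below is an illustrative assumption.

import math

panel_diameter_m = 0.15                       # assumed saucer-sized disc, ~15 cm across
panel_area_m2 = math.pi * (panel_diameter_m / 2) ** 2   # roughly 0.018 m^2

solar_irradiance_w_per_m2 = 1000              # typical full-sun irradiance
panel_efficiency = 0.20                       # assumed cell efficiency
peak_sun_hours_per_day = 4                    # assumed daily sunshine

daily_harvest_wh = (panel_area_m2 * solar_irradiance_w_per_m2
                    * panel_efficiency * peak_sun_hours_per_day)

battery_capacity_wh = 12                      # assumed smartphone battery

# Average power the phone could draw if one charge must last 30 days
hours_per_month = 30 * 24
allowed_average_draw_w = battery_capacity_wh / hours_per_month

print(f"Panel area: {panel_area_m2 * 1e4:.0f} cm^2")
print(f"Energy harvested per sunny day: {daily_harvest_wh:.1f} Wh")
print(f"Battery capacity: {battery_capacity_wh} Wh")
print(f"Average draw allowed for a one-month charge: "
      f"{allowed_average_draw_w * 1000:.0f} mW")
```

Under those assumptions, a single sunny day refills the battery; the hard part is getting the phone's average draw down to a few tens of milliwatts so that one charge stretches across the month, which is the scale of power reduction the researchers below are chasing.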
But if Professor Furber is right, the most profound result of more-efficient computers will be increasingly intelligent devices that the average person can afford.
Imagine a computer with a tiny camera mounted on your eyeglasses so that it sees what you see: As you look down the street it reads the name of every restaurant and store; it pulls up information on menus, sales, and specials, and displays it on a tiny liquid-crystal display in your eyeglasses. The computer-vision technology that Kinect uses could, in theory, be used in this kind of augmented reality – or it could allow a bipedal robot to navigate a cluttered house and do chores. But that comes at a price.
"It takes a formidable amount of computing power to process visual scenes," says Furber. Kinect's party trick costs 12 watts. But something smart enough to get around on its own – say, as smart as a squirrel – might need 10,000 watts.
Squirrel-bot's power bill makes him impractical – not to mention the hefty power cord he'd have to drag around. But people are working on more efficient computers that could nudge technology in that direction.
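To put the squirrel-bot's appetite in perspective, here is a quick comparison against the 12-watt Kinect figure above. The eight-hour workday and the lithium-ion energy density are assumed round numbers, not figures from the article.

```python
# Rough comparison of Kinect's 12 W with a hypothetical 10,000 W "squirrel-bot".
# The workday length and battery energy density are assumed round numbers.

kinect_power_w = 12
squirrel_bot_power_w = 10_000

ratio = squirrel_bot_power_w / kinect_power_w

hours_of_operation = 8                       # assumed working day
energy_needed_wh = squirrel_bot_power_w * hours_of_operation

li_ion_energy_density_wh_per_kg = 250        # assumed, typical for Li-ion packs
battery_mass_kg = energy_needed_wh / li_ion_energy_density_wh_per_kg

print(f"Squirrel-bot draws about {ratio:.0f}x the power of Kinect")
print(f"Energy for an {hours_of_operation}-hour day: {energy_needed_wh / 1000:.0f} kWh")
print(f"Battery mass at {li_ion_energy_density_wh_per_kg} Wh/kg: {battery_mass_kg:.0f} kg")
```

A robot hauling a few hundred kilograms of batteries just to stay squirrel-smart through a working day makes even the hefty power cord look like the practical option.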
"Reducing the power by a factor of 100 to 1,000 is not impossible," says Eric Pop, a nanotechnologist at the University of Illinois in Urbana-Champaign. "We don't know how to do it today, but it's not impossible."
Professor Pop is working on one possible approach: Fabricate chips with new materials that conduct electricity more effectively. Better conductivity means less electricity is converted into heat – so less is needed to power the chip.
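The article leaves the physics implicit, but a simplified resistive picture captures the idea: for a fixed current, the power lost as heat is proportional to resistance, so a material that conducts ten times better wastes roughly a tenth as much. The sketch below uses that simplification, with an illustrative current and baseline resistance that are assumptions rather than figures from Pop's work.

```python
# Simplified resistive picture of why better conductors waste less power:
# for a fixed current I through a resistance R, the heat dissipated is P = I^2 * R.
# The specific numbers are illustrative, not drawn from the article.

def heat_dissipated_w(current_a: float, resistance_ohm: float) -> float:
    """Joule heating for a fixed current through a resistance."""
    return current_a ** 2 * resistance_ohm

current_a = 0.001          # assumed 1 mA of current, purely illustrative
baseline_r_ohm = 1000.0    # assumed baseline resistance of today's material

for improvement in (1, 10, 100):
    resistance = baseline_r_ohm / improvement
    power = heat_dissipated_w(current_a, resistance)
    print(f"{improvement:>3}x better conductor -> "
          f"{power * 1e3:.2f} milliwatts lost as heat")
```

Real chip power budgets also include switching and leakage losses, so this is only the conduction piece of the story.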
One material, called graphene, consists of a sheet of carbon atoms connected in a hexagonal, chicken-wire pattern. Graphene transistors might consume a tenth to a hundredth the power that current transistors use.
Pop is also investigating a second option for building transistors, called carbon nanotubes, in which the carbon sheet is rolled into a tube a thousandth as wide as a red blood cell. He has built simple chips containing 100 to 1,000 graphene or nanotube transistors. He can induce the nanotubes to emit different wavelengths of light – suggesting that they could also form low-power computer screens or electronic billboards.
In 2007, another group, led by Alex Zettl at the University of California, Berkeley, built a radio consisting of a single nanotube. It hummed out an Eric Clapton tune that it picked up from a small radio transmitter inside the lab.
Those are fun proofs of principle, but the challenge for now is showing reliability. Transistors, the tiny electrical switches that turn on and off more than a billion times per second as a computer runs software, have to behave predictably – or else the computer crashes. When a small university lab can produce a few thousand of these transistors without any duds, says Pop, "we're actually in the realm where people in industry will start taking us more seriously."
Graphene and nanotubes are just two of a number of technologies being explored in order to reduce the power needs of computers.
It's easy to have rosy expectations that these next-generation electronics will hit the shelves soon – after all, silicon electronics have grown explosively for 40 years, with the number of transistors on a chip doubling every couple of years. But looking more closely at that history tells a different story. The first transistor was built in 1947. Yet even simple gizmos like the Busicom calculator didn't appear until 1971.
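It is worth pausing on what "explosively" means in numbers: doubling every two years for 40 years compounds to roughly a million-fold increase. The starting transistor count below is an illustrative early-1970s figure, not something taken from this article.

```python
# Compounding the article's rule of thumb: transistor counts doubling
# every couple of years over roughly 40 years of silicon electronics.

doubling_period_years = 2
span_years = 40

growth_factor = 2 ** (span_years / doubling_period_years)
print(f"{span_years} years of doubling every {doubling_period_years} years "
      f"-> a factor of about {growth_factor:,.0f}")

# With an illustrative early-1970s starting point of a few thousand
# transistors per chip, that factor lands in the billions-per-chip range.
starting_transistors = 2_300          # assumed early-1970s-scale chip
print(f"{starting_transistors:,} transistors x {growth_factor:,.0f} "
      f"~= {starting_transistors * growth_factor:,.0f}")
```

Even that runaway curve, in other words, needed a quarter-century of groundwork before it could begin.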
In fact, the silicon revolution required having a whole constellation of manufacturing technologies converge at the right time, says John Maltabes, an engineer now at Hewlett-Packard Laboratories, with 30 years of experience using light to etch circuits onto silicon chips. These technologies were necessary for transistors to be produced quickly enough, cheaply enough, and – above all – reliably enough, because one bad transistor out of 2 billion can ruin a whole chip.
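That "one bad transistor out of 2 billion" line can be turned into a reliability requirement: if a single dud ruins a chip, the chip only works when every transistor works, so the tolerable per-transistor failure rate becomes vanishingly small. The 90 percent yield target in the sketch below is an assumption chosen for illustration.

```python
# How reliable must each transistor be if one dud ruins the chip?
# Treating transistors as independent, chip yield = p ** N, where p is the
# per-transistor success probability and N the transistor count.
# The yield target is an assumption chosen for illustration.

transistors_per_chip = 2_000_000_000      # "2 billion" from the article
target_chip_yield = 0.90                  # assumed: 9 of 10 chips must work

# Solve p ** N = yield  ->  p = yield ** (1 / N)
p = target_chip_yield ** (1 / transistors_per_chip)
failure_rate = 1 - p

print(f"Per-transistor failure rate allowed: about {failure_rate:.2e}")
print(f"That is roughly 1 dud per {1 / failure_rate:,.0f} transistors")
```

By that yardstick, a university lab turning out a few thousand dud-free transistors, as Pop describes above, is still many orders of magnitude short of what manufacturing demands, which is why the convergence Maltabes describes took decades.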
Some observers estimate that developing a replacement for today's chips will take 30 years – and cost $100 billion. If these technologies happen, they may first appear in high-end military, medical, or aerospace applications. Government may fund part of their development, just as it did for jet airliners and supersonic aircraft decades ago.
From that point, whether post-silicon computers really do enter the average person's life (as the Boeing 747 did) or remain exotic and high-end (as supersonic flight did) will depend on two things, says David Jimenez, a manufacturing analyst at the firm Wright Williams and Kelly in Pleasanton, Calif. "One is, can we build it? And second, if we can build it, what is it going to cost?"
Editor's note: This article is No. 1 of the FutureFocus package "5 Innovations Changing the World".
No. 2 - Human rights: Use satellite 'spy' camera for proof and prevention
No. 3 - Villages leapfrog the grid with biometrics and mobile money
No. 4 - E-fabric spools bring bullet-proof watches, paper-thin batteries
No. 5 - Where solar power can't fly, artificial photosynthesis might