Boffins unwrap bargain-basement processor that talks light and current

It's a terabit-a-second Christmas miracle!

Christmas is a time of miracles, to paraphrase Hans Gruber, and US researchers claim to have pulled off one in silicon photonics: they say they've mainlined super-fast optic communications into a RISC CPU using cheap, bog-standard manufacturing techniques.

Chips using light to shunt data around a processor and to and from its external memory have been promised by Intel, IBM, and others for years now.

What these boffins claim they've come up with is a chip that can be mass produced, draws relatively little power, and uses light to sustain high-bandwidth transfers: the prototype has a bandwidth density of 300 gigabits per second per square millimetre, and needs just 1.3W to shift a terabit a second straight from the die to off-chip memory. That means it outpaces today's server chips from Intel.

The brainiacs – hailing from MIT, the University of California, Berkeley, and the University of Colorado – have published a paper in Nature today detailing their 45nm dual-core RISC processor. They say its circuits marry light and electrical current in and outside the package.

"This is a milestone. It's the first processor that can use light to communicate with the external world," said Vladimir Stojanović, professor of electrical engineering and computer sciences at the University of California, Berkeley, who led the development of the chip.

"No other processor has the photonic I/O in the chip."

Miloš Popović, an assistant professor in CU-Boulder’s Department of Electrical, Computer, and Energy Engineering, added: "Light based integrated circuits could lead to radical changes in computing and network chip architecture in applications ranging from smartphones to supercomputers to large data centers, something computer architects have already begun work on in anticipation of the arrival of this technology."

The team designed the 3mm by 6mm chip with 70 million transistors and 850 photonic components. It was tested by fetching instructions from, and manipulating data stored in, memory 10 metres away over optical connections.

"We figured out how to reuse the same materials and processing steps that comprise the electrical circuits to build high-performance optical devices in the same chip," said paper co-author Mark Wade from CU-Boulder. "This allows us to design complex electronic-photonic systems that can solve the communication bottleneck in computing."

The blueprints were sent off to the GlobalFoundries fab in Fishkill, New York, and built to order. The team has set up two companies to build and sell silicon using the technology for data-center systems.

Chen Sun, a researcher at the Berkeley Wireless Research Center, said that about 20 to 30 per cent of the power consumption of data center gear is spent shifting data between processors, memory, and networking cards. Shifting to photonics should virtually eliminate that power burden, we're told.

"The advantage with optical is that with the same amount of power, you can go a few centimeters, a few meters or a few kilometers," he said.

"For high-speed electrical links, one metre is about the limit before you need repeaters to regenerate the electrical signal, and that quickly increases the amount of power needed. For an electrical signal to travel one kilometre, you'd need thousands of picojoules for each bit."
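For the back-of-envelope crowd: the article's own numbers let you compare the two technologies directly. The prototype's 1.3W at a terabit per second works out to 1.3 picojoules per bit, against the "thousands of picojoules" per bit Sun cites for a kilometre-long electrical link (taken here at its lower bound, purely for illustration):

```python
# Back-of-envelope energy-per-bit comparison using the figures quoted
# in this article. The photonic number is derived from the prototype's
# stated 1.3 W at 1 terabit/s; the electrical figure is the lower bound
# of Sun's "thousands of picojoules" for a 1 km electrical link.

PICO = 1e-12  # one picojoule in joules

# Prototype photonic link: 1.3 W while moving 1e12 bits per second
photonic_pj_per_bit = (1.3 / 1e12) / PICO   # -> 1.3 pJ/bit

# Electrical link over 1 km: lower-bound assumption, not a measurement
electrical_pj_per_bit = 1000.0

print(f"photonic:   {photonic_pj_per_bit:.1f} pJ/bit")
print(f"electrical: {electrical_pj_per_bit:.0f} pJ/bit (1 km, lower bound)")
print(f"ratio:      ~{electrical_pj_per_bit / photonic_pj_per_bit:.0f}x")
```

Rough as it is, that's a gap of nearly three orders of magnitude per bit over distance, which is the whole sales pitch for taking photons off the lab bench and into the package.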

Chip builders who had planned on spending the festive season in a food coma will now no doubt be examining the paper and scrutinizing its claims. ®
