Intel goes back to the future for memory
Intel is placing its bets on a technology it invented in 1970 for next generation memory.
Ovonics Unified Memory (OUM) bears the imprimatur of Gordon Moore, and Intel highlighted a paper he co-authored more than thirty years ago on amorphous semiconductors*.
"Why are we doing it again? I keep getting asked that, even by my managers," said Intel's Stefan Lai, co directory of Intel's California Technology and Manufacturing Group. Chipzilla said that memory technology has really only taken one major leap in the past ten years, with the invention of flash memory.
Laying out the options over a "five to twelve year" timeframe, Intel favoured, of the various technologies currently in the labs, polymer memories for data storage and OUM for chip memory.
Amorphous semiconductors are switched by an electrical current between a crystalline, conductive state and a blobby, resistive state. It's the same technique used by CD-RW discs today.
Lai said breakthroughs in cell physics, and more recent process improvements, had made arrays of amorphous memory cells viable.
Intel didn't claim OUM was perfect, but reckoned it could be produced more cheaply than the alternatives: MRAM, favoured by IBM, Motorola and Infineon, and FeRAM, backed by most of the old Dramurai (Matsushita, Fujitsu, Toshiba, Hitachi and others).
"MRAM will always be three or four times as expensive," said Lai. "But we’re not saying MRAM is not useful."
"The ease of integration is what attracts us the most. This memory works on less than 3 volts. Current flash memory mixes high voltage and low voltage on the same chip, which is difficult and expensive." ®
* Over a crackly line, we were convinced he'd said "amphibious semiconductors" until we checked our notes. Now that would be something.