A new type of memory that could make computers and smartphones far more energy-efficient – if it ever reaches production – has been developed at Lancaster University in the UK.
Most computers, phones and data centre boxes today rely on a combination of two main types of memory: DRAM and flash.
As Reg readers know, DRAM is fast but volatile, meaning it loses information when its power source is cut. This makes it ideal for immediately processing data, but useless for long-term storage. That's where flash comes in. Flash is non-volatile in that it retains its contents when powered off, but it's slow, at least compared to DRAM, meaning it's well suited for storage but not for immediate processing.
The Lancaster Uni team's fledgling memory technology, dubbed ULTRARAM, emerged in a paper published last year. It combines the best elements of DRAM and flash to offer, at least in the lab, fast random access to non-volatile data.
"It's a bit like having your cake and eating it too," said Lancaster physics professor Manus Hayne. "The new memory performs as well as, or better than, DRAM in terms of speed, but is non-volatile, meaning it can store data."
Storage boffins have long hoped to replace or augment traditional memory technology. A number of different solutions are in the works, including MRAM, FeRAM, and phase-change memory, such as Intel's Optane.
The Lancaster team's solution, it is claimed, uses a quantum-mechanical effect called resonant tunnelling, in which a small applied voltage switches an electron barrier from opaque to transparent.
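To get a feel for why a small voltage matters so much, consider the textbook formula for an electron tunnelling through a single rectangular barrier. This is a toy model, not the triple-barrier resonant-tunnelling structure described in the team's papers, and the barrier heights and widths below are purely illustrative – but it shows how dramatically transparency depends on barrier height:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
M_E = 9.1093837015e-31  # electron rest mass, kg
EV = 1.602176634e-19    # one electronvolt in joules

def transmission(energy_ev, barrier_ev, width_nm):
    """Transmission probability through a rectangular potential barrier
    (standard quantum mechanics result, valid for E < V0)."""
    E = energy_ev * EV
    V0 = barrier_ev * EV
    a = width_nm * 1e-9
    # Decay constant inside the barrier
    kappa = math.sqrt(2 * M_E * (V0 - E)) / HBAR
    return 1.0 / (1.0 + (V0 ** 2 * math.sinh(kappa * a) ** 2)
                  / (4 * E * (V0 - E)))

# A 0.1 eV electron facing a 1 eV, 2 nm barrier: effectively opaque.
print(transmission(0.1, 1.0, 2.0))
# Same electron once the barrier is pulled down to 0.15 eV:
# transmission jumps by many orders of magnitude.
print(transmission(0.1, 0.15, 2.0))
```

The point of the exercise: a modest change in barrier height (the sort of thing a small bias voltage can produce in a real device) shifts the tunnelling probability by factors of millions, which is what lets such a barrier act as a fast, low-energy switch.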
A new paper by the team, published by the IEEE this month, describes more sophisticated simulations of ULTRARAM. The analysis showed that the memory was extremely energy efficient, using only 1 per cent of the energy required for RAM – "the lowest energy consumption of any known memory," said co-author Professor Hayne.
ULTRARAM could change how computers work from head to toe, according to Hayne. Using the new memory in phones and PCs could allow them to become low-cost dumb terminals with minimal local energy consumption, processing everything remotely to maximise efficiency, bandwidth and latency permitting. Hayne's team reckons implementing the new memory in data centres could reduce their energy consumption by a fifth at peak times.
That may not sound like much for now, but it could make a huge difference in the not-so-distant future. The number of objects, from fridges to cars, connected to the web is expected to jump from 27 billion in 2018 to 125 billion in 2030, according to forecasts from IHS Markit. Others may disagree with that figure.
Today, data centres consume about 2 per cent of electricity worldwide. That could rise to 8 per cent of the global total by 2030, according to a study by Anders Andrae, who researches sustainable information and communications technology for Huawei.
Such lofty claims from Lancaster Uni have understandably encountered scepticism from the memory industry at large. Jim Handy, a semiconductor researcher at Objective Analysis, said: "It could be a replacement for DRAM but has the same issue that a lot of alternatives do – it uses more chemical elements than a simple [complementary metal–oxide–semiconductor] process, so the transition from DRAM's standard CMOS to something III-V* will motivate DRAM makers to postpone using it as long as they can."
But Hayne is confident his team has the real deal – even if it is still some way off from becoming commercially viable.
"We're well aware of what we need to solve to make it manufacturable, and we don't see any fundamental problems," he said. "We're moving very fast on this. Within the next decade we'll either have it or we won't."
Patent paperwork has already been submitted in the US, with more on the way. Several companies have expressed commercial interest, according to the university. ®
* ie: using compound semiconductors made from elements in groups III and V of the periodic table – with these wafers being vastly more expensive to produce than standard silicon dies.