IBM's Phase Change Memory computer can tell you if it's raining

Wait! A PCM chip that computes as well as stores?!

IBM boffins have unveiled new work on in-memory computing: doing processing inside Phase Change Memory with no external CPU.

Traditional computing requires memory to hold data and an external processor to which the data is transferred, processed, and then written back to memory. This is the Von Neumann architecture, and it can be characterised as having a bottleneck between memory and compute.

If some computing could be done inside the memory, that bottleneck would go away and computing would be faster.

But memory has no processor, so some physical property of the memory device has to do the work, a property that changes depending upon the data stored in the device. The computation is also going to be quite primitive.

But how?

In a paper published in Nature Communications last week, IBM Zurich researchers showed it is possible to do some computation inside a Phase Change Memory (PCM) device, which they chose because the physics involved at the nanoscale are rich and promising. A PCM device's cells change their resistance as the internal state of the PCM material, a chalcogenide glass, changes from amorphous to crystalline and back again.

The state change is caused by applying an electrical current, and the binary value is read by measuring the resistance of the cell.
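In code terms, the storage part is simple enough. Here's a toy sketch; the resistance values, read threshold and which state counts as a 1 are our own illustrative guesses, not figures from IBM's device:

```python
# Toy model of a single PCM cell used as plain binary storage.
# Resistance values, the read threshold and the bit convention are
# illustrative guesses, not figures from IBM's device.

R_CRYSTALLINE = 1e4    # low resistance (ohms): we'll call this logical 1
R_AMORPHOUS = 1e7      # high resistance (ohms): logical 0
READ_THRESHOLD = 1e5

class ToyPCMCell:
    def __init__(self):
        self.resistance = R_AMORPHOUS   # start fully amorphous

    def reset_pulse(self):
        # Melt-and-quench: the cell ends up amorphous (high resistance).
        self.resistance = R_AMORPHOUS

    def set_pulse(self):
        # Gentler heating: the cell crystallises (low resistance).
        self.resistance = R_CRYSTALLINE

    def read(self):
        # The stored bit is recovered by comparing resistance to a threshold.
        return 1 if self.resistance < READ_THRESHOLD else 0

cell = ToyPCMCell()
cell.set_pulse()
print(cell.read())   # -> 1
```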


The boffins say "the essential idea is not to treat memory as a passive storage entity, but to exploit the physical attributes of the memory devices to realise computation exactly at the place where the data is stored."

With regard to PCM: "The dynamic evolution of the conductance levels of those devices upon application of electrical signals can be used to perform in-place computing... The conductance of the devices evolves in accordance with the electrical input, and the result of the computation is imprinted in the memory array."

The researchers say that when a high enough electrical current – the RESET pulse – is applied to a cell in the crystalline phase, a significant portion of the material melts through Joule heating and then, when the current is stopped, quenches into the amorphous state. The amount of amorphous material depends upon the amplitude and duration of the RESET pulse, and there is a temperature gradient within the cell.

The SET pulse, which changes the cell back to the crystalline state, is sufficient to raise its temperature to cause crystallisation but not high enough to melt the material. The details of the state changes are called crystallisation dynamics. The amount of amorphous material in a cell affects its conductivity.
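You can think of those crystallisation dynamics, very roughly, as each partial SET pulse nudging a cell's conductance upward a little at a time, with a RESET pulse knocking it back down. The sketch below is our own invented model with made-up constants, not the paper's device physics:

```python
# Toy model of the crystallisation dynamics described above: each SET pulse
# crystallises a bit more material, so conductance creeps upward; a RESET
# pulse re-amorphises the cell. The growth rule and constants are invented
# for illustration, not taken from the paper.

G_MIN, G_MAX = 0.1, 25.0   # conductance bounds (arbitrary units)

class AccumulatingCell:
    def __init__(self):
        self.g = G_MIN

    def set_pulse(self, strength=1.0):
        # A stronger (or longer) pulse crystallises more material per shot,
        # so conductance moves further towards its maximum.
        self.g = min(G_MAX, self.g + 0.1 * strength * (G_MAX - self.g))

    def reset_pulse(self):
        # Melt-and-quench back to the mostly amorphous, low-conductance state.
        self.g = G_MIN

cell = AccumulatingCell()
for _ in range(5):
    cell.set_pulse(strength=2.0)
print(round(cell.g, 2))   # conductance has drifted well above G_MIN
```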

The boffin bunch devised an algorithm to detect temporal correlations between event-based data streams, such as the presence or absence of a signal from an IoT device, inside the phase change memory itself, using an array of one million PCM cells.

The maths involved is, to this mathematical ignoramus, horrendously complicated, involving terms such as "uncentered covariance matrix" and "collective momentum". We won't go there.

What happens is that we envisage a number of random binary processes, some correlated, some not. Each process takes the value 1 or 0 in successive intervals of a fixed length. We want to know which processes are correlated and which are not, and this can be analysed statistically.

Each process is assigned to a single phase change memory cell. Whenever the process takes the value 1, a SET pulse is applied to the PCM device. The amplitude or the width of the SET pulse is chosen to be proportional to the instantaneous sum of all processes. By monitoring the conductance of the memory devices, we can determine the correlated group.

The statistical correlation can be calculated and stored in the very same PCM cell as the data passes through the memory.
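Here's a minimal simulation of that scheme, with the device physics boiled down to a per-cell accumulator: correlated processes tend to fire together, so when they do fire the instantaneous sum is large, their cells get stronger SET pulses, and their conductance pulls ahead of the uncorrelated pack. The process model, constants and final ranking step are our assumptions, not the paper's algorithm:

```python
import random

random.seed(0)

N_PROC, N_STEPS = 20, 200
CORRELATED = set(range(8))   # which processes fire together: our assumption

# One accumulated "conductance" per process, each standing in for one PCM cell.
conductance = [0.0] * N_PROC

for _ in range(N_STEPS):
    common = 1 if random.random() < 0.3 else 0
    # Correlated processes copy the common event; the rest flip their own coins.
    x = [common if i in CORRELATED else (1 if random.random() < 0.3 else 0)
         for i in range(N_PROC)]
    total = sum(x)   # instantaneous sum of all processes
    for i, xi in enumerate(x):
        if xi:
            # SET pulse whose strength is proportional to the instantaneous sum;
            # here the "cell" simply adds it up rather than modelling real physics.
            conductance[i] += total / N_PROC

# Cells assigned to correlated processes should end up with the highest conductance.
ranked = sorted(range(N_PROC), key=lambda i: conductance[i], reverse=True)
print("highest-conductance cells:", sorted(ranked[:len(CORRELATED)]))
```

Reading out the conductances and picking the top cells should recover the correlated group, which is roughly what the paper does at scale, with a million real cells rather than a Python list.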

And how would you use it?

The researchers presented a possible application of their work: processing real-world data sets such as weather data. Data can be read in from the weather stations as a time-ordered set of processes (values). If rainfall occurs at a weather station in a given one-hour period, that station's process takes the value 1; if not, it takes the value 0.

The PCM computing machine devised by the Big Blue boffins could work out where rain was falling by correlating groups of weather station process values over time.
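The front end is nothing more exotic than turning each station's hourly report into a 0 or a 1 and feeding it into the scheme above. The station names and helper below are invented for illustration:

```python
# Hypothetical glue code for the weather example: station names and the
# observation format are invented, not taken from the paper.

stations = ["zurich", "basel", "geneva", "lugano"]

def hourly_process_values(observations):
    """observations maps station name -> rainfall (mm) in the past hour."""
    # A station's process is 1 for the hour if any rain fell there, else 0.
    return [1 if observations.get(s, 0.0) > 0.0 else 0 for s in stations]

print(hourly_process_values({"zurich": 1.2, "basel": 0.0, "geneva": 0.4}))
# -> [1, 0, 1, 0]
```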

They conclude that, by using such a PCM computational memory module, they could accelerate the task of correlation detection by a factor of 200 relative to an implementation that uses 4 state-of-the-art GPU devices. Plus they chucked in an improvement in energy consumption of two orders of magnitude.

They claim that this co-existence of computation and storage at the nanometer scale could enable ultra-dense, low-power, and massively parallel computing systems. ®

Bootnote: The research was carried out by Abu Sebastian, Tomas Tuma, Nikolaos Papandreou, Manuel Le Gallo, Lukas Kull, Thomas Parnell and Evangelos Eleftheriou at IBM Zurich and published in Nature Communications.
