Is your gadget using secondhand memory? Predictable senility allows boffins to spot recycled NAND chips

Not what you'd expect in industrial kit


University researchers have developed a new method for rooting out recycled memory chips in industrial control devices.

The group from the University of Alabama in Huntsville say their technique could help vendors spot and remove aged flash memory chips that would otherwise jeopardize the reliability of embedded devices in the medical, aerospace, and military fields.

They are presenting the research at the 2018 IEEE Symposium on Hardware Oriented Security and Trust (HOST) on Tuesday.

With the embedded device market booming and semiconductor companies hard-pressed to keep up with demand, the recirculation of older memory chips has grown in recent years. Because chips become more likely to fail as they age, newer devices outfitted with recycled chips are more prone to problems.

"Detection of recycled Flash with high confidence is challenging due to the variability among the different Flash chips caused by process variations," explain authors Preeti Kumari, Bahar Talukder, Sadman Sakib, Biswajit Ray, and Tauhidur Rahman of the university's electrical and computer engineering school.

"There are very few works on detecting recycled memory chips, and unfortunately, all of them require maintaining an extensive database, which is impossible for several electronic systems."

Rather than try to maintain those databases, the researchers instead chose to study the way NAND memory chips age. They found that, as a chip undergoes program/erase (P/E) cycles (with information being written and then erased), small imperfections accumulate in its transistors, very slightly increasing the time the chip takes to perform erase operations.

This timing change, the researchers say, occurs in such a reliable and uniform way that it can be measured across memory chips from different manufacturers.

The study found that, at about 150 P/E cycles, memory chips show enough of a shift in erase timing to be reliably distinguished from new chips (that's about 3 per cent of the expected 5,000-cycle lifespan).
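The decision logic can be sketched in a few lines of Python. Note the specific numbers below – the baseline erase latency, the per-cycle drift rate, and the helper names – are illustrative assumptions, not figures from the paper; only the 150-cycle detection point comes from the research, and a real implementation would read erase timings from the flash controller rather than from a list.

```python
# Illustrative sketch: flagging a NAND chip as recycled from its average
# erase latency. All timing constants are hypothetical assumptions.

NEW_CHIP_ERASE_US = 2000.0   # assumed erase time for a fresh chip (microseconds)
DRIFT_US_PER_CYCLE = 0.4     # assumed per-P/E-cycle increase in erase time
DETECTION_CYCLES = 150       # detection point reported by the researchers

def mean_erase_latency(samples):
    """Average several erase-time measurements to smooth out noise."""
    return sum(samples) / len(samples)

def looks_recycled(samples, threshold_cycles=DETECTION_CYCLES):
    """Flag a chip whose erase latency implies prior wear past the threshold."""
    drift = mean_erase_latency(samples) - NEW_CHIP_ERASE_US
    estimated_cycles = drift / DRIFT_US_PER_CYCLE
    return estimated_cycles >= threshold_cycles

# A fresh chip: erase latencies sit near the baseline
print(looks_recycled([2001.0, 1999.5, 2002.0]))   # False

# A worn chip: latency drift consistent with ~500 prior P/E cycles
print(looks_recycled([2195.0, 2205.0, 2198.0]))   # True
```

Because the drift is claimed to be uniform across manufacturers, a scheme like this needs only a per-family baseline rather than the extensive per-chip databases the authors criticise.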

The group hopes the technique could be used by manufacturers to test for and weed out older chips that, in an industrial control device, could take the entire unit down should they fail. In the process, they hope to make embedded and industrial devices more reliable over the long term. ®
