BREAKTHROUGH: Feisty startup slashes chip power by 50%

SuVolta to Intel: 'The future is the $10 chip, not the $200 chip'


An impressively staffed startup by the name of SuVolta has teamed up with Japanese heavyweight Fujitsu Semiconductor to create a new chip-baking technique that promises low-power, inexpensive chips created in a highly scalable process and manufactured using equipment already in place in chip foundries.

Yes, that description is a bit of a mouthful – but that's the breadth of promise offered by SuVolta's "deeply depleted channel" (DDC) CMOS transistor tech, as presented on Wednesday by a Fujitsu researcher at the 2011 International Electron Devices Meeting (IEDM) currently underway in Washington DC.

If DDC works as advertised – and from the deep-dive details and test results provided to The Reg by SuVolta's director of device and modeling Lucian Shifren, we have no reason to think that it shouldn't – it could revolutionize how low-power system-on-chip (SoC) silicon is created.

What DDC involves, and what provides its promise, is a new way of creating transistors based on standard bulk planar CMOS manufacturing techniques, and not on more-expensive and specialized transistor manufacturing processes that have cropped up in the past decade or so to allow chip designers to keep pace with that tough taskmaster, Moore's Law.

Shifren emphasized that although DDC could be applied to any level of chippery, SuVolta is currently more concerned with foundry-created, low-power chips, and not the hefty x86 CPUs baked by Intel in their custom fabs. SoCs are where the market is headed, he told us.

"CPUs and GPUs are not actually driving the semiconductor market anymore," Shifren said. "What's driving it are SoCs, and especially those adapted to mobile forms."

The company that's currently driving semiconductor technology, however, is Intel. But Chipzilla has traditionally focused its research on larger, more powerful chips based on the aging IA architecture – although it would dearly love to break into the lucrative mobile market, dominated by non-IA ARM chippery.

"If you look at the industry as a whole right now, who really defines the technology and the technology roadmap?" Shifren asked, rhetorically. "It's Intel." He also gave historical props to IBM, but averred that in recent years Intel has been leading the way.

As an example, Shifren offered Intel's variation of 3D FinFET transistor technology, what Chipzilla calls Tri-Gate. "Have a look at how after Intel made its announcement of FinFET, everybody and their mother came out and said they were going to be working on FinFET," he said.

According to Shifren, however, neither FinFET nor another transistor structure known as fully depleted silicon-on-insulator (FD-SOI), which has been around for nearly a decade, is the best choice for the low-power, low-cost SoC market – although others may disagree when Shifren says "I believe FD-SOI is dead."

[Slide from SuVolta's IEDM presentation: SuVolta's DDC transistor shares many advantages of cheap-as-dirt, versatile bulk planar CMOS]

Not only are both of those techniques expensive when compared with good ol' traditional non-depleted planar bulk CMOS, Shifren says, but they don't scale well for creating a range of chips running at different voltages.

(A bit of background: you'll notice that the term "depleted" is being bandied about a bit. Simply put, a depleted transistor is one in which stray current is minimized in such a way as to prevent power leakage, and to allow the transistor to be activated at a lower voltage. For a fuller explanation, check out an earlier Reg article about Intel's tri-gate transistors.)
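
To put rough numbers on that idea, here's a minimal back-of-the-envelope sketch in Python – illustrative values only, not SuVolta's figures – of the standard subthreshold leakage model. Off-current drops by one decade for every "subthreshold swing" S of threshold voltage, and a better-depleted channel delivers a steeper (smaller) swing:

    # Subthreshold leakage sketch (assumed, illustrative values -- not SuVolta data).
    # Off-current falls one decade for every S millivolts of threshold voltage, so
    # a steeper swing (the electrostatic payoff of a depleted channel) means less
    # leakage at the same V_th, or the same leakage at a lower voltage.

    def leakage_ratio(v_th_volts, swing_mv_per_decade):
        """Off-current relative to the at-threshold current: ~10^(-V_th/S)."""
        return 10 ** (-(v_th_volts * 1000) / swing_mv_per_decade)

    V_TH = 0.30  # threshold voltage in volts (assumed, typical low-power value)

    for label, swing_mv in [("conventional bulk planar", 100),
                            ("well-depleted channel", 70)]:
        print(f"{label:>24}: relative I_off = {leakage_ratio(V_TH, swing_mv):.2e}")

With these assumed numbers the depleted device leaks roughly 20 times less at the same threshold voltage – headroom that chip designers can cash in as a lower supply voltage instead.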

Shifren told us that voltage reduction has been given short shrift in the drive to keep up with Moore's Law. "If you look at Moore's Law, and how Moore's Law has been evolving over the last couple of years," he said, "advances in patterning and moving to double patterning, immersion masks, and things like that, the industry has been able to keep along with Moore's Law – as long as you're talking about dimensional scaling.

"But if you have a look at the voltages," he continued, "they haven't scaled nearly as quickly. They actually haven't scaled at all in the last couple of years."

He recited the history of processor voltage scaling: starting at five volts, then down to the low fours, then 3.5V, 2.5V, 1.8V. Then the curve of decreases began to flatten, and supply voltages now hover around 1V to 0.7V. Moore's Law, on the other hand, has kept chugging along, doubling transistor density every 18 to 24 months.

SuVolta's breakthrough is that it has managed to slash that voltage requirement essentially in half – an improvement on a Moore's Law scale. The company's DDC transistors, used in a proof-of-concept SRAM chip manufactured by Fujitsu, require a mere 0.425V.
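
The payoff from halving the supply rail is quadratic, since switching power follows the classic P ≈ C·V²·f relation. A quick sketch – the capacitance and frequency below are placeholder assumptions, and real chips also burn static leakage power that scales differently:

    # Dynamic (switching) power scales with the square of the supply voltage.
    # Capacitance and frequency here are placeholders, not Fujitsu/SuVolta figures.

    def dynamic_power(c_farads, v_volts, f_hertz):
        """Classic CMOS switching-power estimate: P = C * V^2 * f."""
        return c_farads * v_volts ** 2 * f_hertz

    C = 1e-9  # effective switched capacitance in farads (assumed)
    F = 1e9   # clock frequency in hertz (assumed)

    p_nominal = dynamic_power(C, 0.85, F)   # a typical ~0.85V low-power rail (assumed)
    p_ddc     = dynamic_power(C, 0.425, F)  # SuVolta's demonstrated 0.425V

    print(f"0.85V rail : {p_nominal:.3f} W")
    print(f"0.425V DDC : {p_ddc:.3f} W ({p_ddc / p_nominal:.0%} of nominal)")

Halving the voltage cuts switching power to a quarter at the same clock speed – which is why a functioning 0.425V SRAM is such an attention-getter.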
