SK hynix ships blazing fast HBM3E DRAM samples – but most customers have to wait

Everything in 2023 is about AI, which this silicon is said to speed


South Korean chipmaker SK hynix has shipped samples of HBM3E DRAM, claiming it should be able to process 1.15 terabytes of data in a second.
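For a sense of what that headline figure implies at the pin level, here is a back-of-envelope sketch. It assumes the standard 1,024-bit interface of a single HBM stack (a JEDEC HBM constant); the per-pin rate it derives is our own inference, not a figure SK hynix has quoted.

```python
# Rough check: what per-pin data rate would yield SK hynix's claimed
# 1.15 TB/s, assuming the standard 1,024-bit HBM stack interface.
BYTES_PER_TB = 1e12
interface_width_bits = 1024               # bits per HBM stack (JEDEC spec)
bandwidth_bytes_per_s = 1.15 * BYTES_PER_TB

# Convert total bandwidth to bits, spread across the interface width.
pin_rate_gbps = bandwidth_bytes_per_s * 8 / interface_width_bits / 1e9
print(f"Implied per-pin rate: {pin_rate_gbps:.2f} Gbps")  # ~8.98 Gbps
```

In other words, the 1.15 TB/s claim works out to roughly 9 Gbps per pin across the stack's 1,024 data lines.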

As described on our sister site Blocks and Files, HBM3E is the next generation of the High Bandwidth Memory (HBM) standard overseen by JEDEC. HBM matters because it is faster and uses less power than other forms of memory – such as Double Data Rate (DDR) or Graphics Double Data Rate (GDDR) memory.

Speed and power consumption matter more than ever during the world's current surge of interest in AI – which is why Nvidia recently promised to add HBM3E memory to a forthcoming version of its Grace Hopper superchip.

SK hynix's announcement of sample shipments describes the memory as "the highest-specification DRAM for AI applications currently available."

We get it: nobody wants their AI to crawl along. It's vital to employ 2023's most-discussed tech before investors frown, or competitors pounce.

But SK hynix's announcement asks would-be buyers to slow down – it will mass produce HBM3E "from the first half of next year." That could mean as late as June – ten months from now. The chipmaker also omitted any mention of production volume, so this memory could be even scarcer than GPUs.

But it will be less hot to the touch than previous HBM kit. The silicon slinger claims that its heat dissipation is ten percent better thanks to "Advanced Mass Reflow Molded Underfill" – aka MR-MUF2, a packaging technology the Korean chip champ developed in-house.

One detail that may produce warm feelings is that HBM3E is backwards compatible with HBM3 – so buyers who splash out on the current generation of memory can buy now, safe in the knowledge they have an upgrade path.

SK hynix's announcement includes appreciative quotes from Nvidia, but does not explicitly state the GPU-maker is the customer testing the HBM3E samples.

Our sibling site Blocks and Files has more info on HBM3E and how it's built. ®
