SK hynix's high bandwidth memory buffet fully booked till 2025

Micron also riding the AI wave with 128 GB DDR5 RDIMMs


Memory chipmaker SK hynix has already sold all the high bandwidth memory (HBM) it will manufacture this year and most of its expected 2025 production, citing increased demand driven by the AI craze. Micron is also getting in on the act with availability of 128 GB DDR5 RDIMMs for servers.

SK hynix told a news conference at its Icheon headquarters in South Korea that it is set to expand its output of memory chips, predicting that global demand is set to increase over the long term thanks to applications such as AI.

In the near term, the company plans to provide customers with samples of its fifth-generation HBM products – 12-layer HBM3E – during May, and start mass production of these in the third quarter, according to The Korea Times.

Domestic rival Samsung revealed this week that it plans to mass produce 12-layer HBM3E products and a 128 GB product based on 1b nanometer 32 Gb DDR5 within the second quarter, also with an eye on demand driven by AI.

SK hynix CEO Kwak Noh-jung told the conference that his company will be able to produce many more HBM chips after completing new fabrication and advanced packaging facilities, located in South Korea and the US, Nikkei Asia reports.

"The HBM market is expected to grow continuously as the number of parameters and modalities are increasing for the improvement of AI quality," Kwak is reported as saying. The company is the main supplier of HBM chips for Nvidia's GPUs, which are heavily used for AI training purposes.

He forecast that demand will grow by an average of 60 percent annually over the mid to long term, as the volume of data and the size of AI models both continue to grow.

According to TrendForce, Kwak stated that while AI is currently centered on training models in datacenters, it is expected to rapidly expand to on-device AI applications in smartphones, PCs, vehicles, and other end-user devices in the future. This is why SK hynix expects demand for memory tailored for AI – characterized as "ultra-fast, high-capacity and low-power" – to balloon.

Just last week, the company said it plans to invest more than $14.5 billion in a new memory chip manufacturing plant in North Chungcheong Province. This M15X fab was originally slated to make NAND memory chips, but will now produce DRAM instead, with production expected to start in November 2025.

US-based memory maker Micron is also leaning into the market for AI with availability of 128 GB DDR5 RDIMMs for servers based on its 32 Gb DRAM silicon, for which it claims 16 percent lower latency as well as higher capacity.

These high-speed (up to 5,600 MT/s) memory modules have been engineered to meet the growing needs of datacenter applications such as AI and machine learning (ML), plus high-performance computing (HPC) and in-memory databases, Micron said.

The latest RDIMMs are available now directly from Micron, and from June through select global channel distributors and resellers. ®
